Extreme Responses Differ Across Demographic Groups

Monday, May 15, 2017

Xiaolin Wang, Amy Ribera, Bob Gonyea — Messick (1989) defines validity as an "integrated evaluative judgment of the degree to which empirical evidence and theoretical rationales support the...appropriateness of interpretations and actions based on...scores or other modes of assessment" (p. 13). One of the many ways to build a case for the validity of an instrument is to examine the stylistic tendencies of survey respondents, known as response styles (Cronbach, 1946).

Survey respondents are known to exhibit various response styles, such as the tendency to agree with items regardless of content and the tendency to select the middle points of a Likert scale (Baumgartner & Steenkamp, 2001). Among these, the tendency to select scale endpoints is known as extreme response style (ERS; Greenleaf, 1992). For example, on a five-point Likert scale ranging from "strongly disagree" to "strongly agree," ERS is the tendency to select the endpoints, "strongly disagree" or "strongly agree," rather than the intermediate options. Response styles such as ERS can contaminate group comparisons and lead to incorrect conclusions: when ERS is present, observed group differences are mixtures of true response differences and response-style differences, which threatens survey validity because the instrument is then measuring more than the construct of interest. Since survey responses are commonly used for group comparisons, understanding whether ERS exists and whether it contaminates those comparisons is essential.
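To make the idea concrete, a crude descriptive proxy for ERS is simply the share of a respondent's answers that fall at the scale endpoints. The study itself estimates ERS with an IRT-ERS model rather than this index, so the sketch below (with made-up respondents) is purely illustrative:

```python
# Illustrative only: a simple descriptive proxy for ERS on a 5-point
# Likert scale is each respondent's proportion of endpoint responses.
# (The study estimates ERS with an IRT-ERS model, not this index.)

def extreme_response_share(responses, low=1, high=5):
    """Proportion of a respondent's answers at the scale endpoints."""
    if not responses:
        raise ValueError("no responses given")
    extremes = sum(1 for r in responses if r in (low, high))
    return extremes / len(responses)

# Two hypothetical respondents with the same mean answer (3.0)
# but very different response styles:
moderate = [3, 3, 2, 4, 3, 3]   # stays near the midpoint
extreme  = [1, 5, 1, 5, 5, 1]   # lives at the endpoints

print(extreme_response_share(moderate))  # 0.0
print(extreme_response_share(extreme))   # 1.0
```

Note that both respondents average 3.0, so a naive mean comparison would treat them as identical even though their styles differ sharply — exactly the contamination problem described above.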

To investigate ERS in higher education assessment, we applied a generalized IRT-ERS model (Jin & Wang, 2014) to the responses of 22,450 college seniors who participated in the National Survey of Student Engagement in 2014. We limited our analysis to students from 71 four-year institutions that opted to administer the Global Perspectives Inventory module item set. After estimating each individual's ERS parameter using Markov chain Monte Carlo estimation, we compared students' ERS across eight demographic factors (gender, enrollment status, international status, first-generation status, race and ethnicity, STEM major, sexual orientation, and disability status). Based on t-tests and F-tests, we found significant (p < .05) differences in ERS tendency for two factors: STEM major and first-generation status. Specifically, STEM students and non-first-generation students were more likely to select "Strongly Agree" or "Strongly Disagree" over "Agree", "Disagree", or "Neither".
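The study's data and estimation code are not shown here, but the group-comparison step can be sketched. Assuming each respondent already has an ERS parameter estimate (produced elsewhere, e.g., by MCMC estimation of an IRT-ERS model), a two-group comparison like the STEM vs. non-STEM contrast amounts to a t-test on those estimates. The group labels, sample sizes, and effect size below are hypothetical:

```python
# Sketch of the group-comparison step, NOT the study's actual code or data.
# We simulate ERS parameter estimates for two hypothetical groups and run
# a Welch t-test (which does not assume equal group variances).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated ERS estimates (higher = stronger tendency toward endpoints).
stem     = rng.normal(loc=0.15, scale=1.0, size=500)  # hypothetical STEM majors
non_stem = rng.normal(loc=0.00, scale=1.0, size=500)  # hypothetical non-STEM majors

t_stat, p_value = stats.ttest_ind(stem, non_stem, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A factor with more than two levels (e.g., race and ethnicity) would instead be tested with a one-way F-test across the group means, as the post describes.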

We recommend that researchers evaluating the validity of score interpretations consider assessing survey response-style effects through an IRT lens. We also suggest examining whether demographic groups differ in their ERS tendencies in order to interpret the data more accurately. The issue is especially important when survey results inform policy or high-stakes decisions: unchecked ERS can exaggerate group differences and draw attention where it is unwarranted.

For more information, see the full paper, presented at the 2017 annual meeting of the American Educational Research Association in San Antonio, TX.
