Are Faculty Surveys Biased?

Angie Miller and Amber Dumford

Monday, July 03, 2017

A paper recently presented at the American Educational Research Association used Faculty Survey of Student Engagement (FSSE) data to investigate the impact of social desirability bias on faculty survey responses. Social desirability bias, the tendency for survey respondents to provide answers that cast them in a favorable light, has long been a concern for surveys of sensitive or taboo topics like drug use or sexual behavior. However, the evidence for social desirability bias in student self-report surveys is mixed, including previous research on NSSE (Miller, 2012). Data from a subsample of faculty at 18 institutions participating in the 2014 FSSE administration suggested that, in general, social desirability bias does not have a major effect on survey responses. However, faculty responses on the Effective Teaching Practices Scale may be somewhat influenced by social desirability bias. This may be due to the similarity between the items in this scale and those found on course evaluations. Recent controversy surrounding bias in course evaluations, as well as their high-stakes nature, may make these seemingly innocuous items contentious for some faculty.

While surveys can easily gather large amounts of data, the use of self-reports sometimes leads to concerns about the data quality. To minimize the potential that certain questions will prompt untruthful answers as respondents attempt to provide a socially appropriate response, researchers can examine whether social desirability bias is present in the data. Although encouraging student engagement is not what one might consider a "sensitive" topic, faculty may be aware that answering items in ways that display higher levels of engagement is desired by their institutions and they want to appear to be "good" employees. Therefore, the current study was developed to address the issue of social desirability bias and self-reported engagement behaviors at the faculty level.

This study used data from the 2014 FSSE administration. In addition to the core survey (including FSSE scales and faculty demographics), a subsample of 1,574 respondents completed additional experimental items on social desirability (Ray, 1984). Although these respondents came from only a subset of the institutions that participated in FSSE, those institutions were selected by random assignment, and the resulting group of 18 mirrored the national landscape in size, Carnegie classification, and control.

A series of ten ordinary least squares (OLS) regression analyses, controlling for certain faculty and institutional characteristics, was conducted. The results suggest that, in all cases, social desirability bias is not a major factor in faculty members' responses to the items in the FSSE scales. In four of the ten models, the effect of social desirability was not statistically significant, indicating no detectable influence on responses. In the remaining six, the effects were statistically significant but so small in magnitude that they are not practically significant: a slight influence may be present, but it has no substantive impact on responses. Furthermore, when the social desirability score was entered as a second step, the change in explained variance was quite small, even in the statistically significant models. This suggests that the other variables in the models have a much greater influence than social desirability.
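The two-step logic described above can be sketched in code: fit a model with the control variables only, refit with the social desirability score added, and compare the explained variance (ΔR²). This is a minimal illustration on synthetic data; the variable names and effect sizes are hypothetical stand-ins, not the actual FSSE variables or results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical stand-ins for faculty-level variables (not actual FSSE data)
rank = rng.integers(1, 5, n).astype(float)   # academic rank category
years = rng.uniform(0, 30, n)                # years of teaching experience
social_des = rng.normal(0, 1, n)             # social desirability score
# Outcome driven mostly by the controls, only weakly by social desirability
y = 2.0 * rank + 0.1 * years + 0.05 * social_des + rng.normal(0, 1, n)

def r_squared(predictors, y):
    """Fit OLS by least squares and return the model's R^2."""
    X = np.column_stack([np.ones(len(y)), predictors])  # add intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

# Step 1: controls only; Step 2: controls plus social desirability
r2_step1 = r_squared(np.column_stack([rank, years]), y)
r2_step2 = r_squared(np.column_stack([rank, years, social_des]), y)
delta_r2 = r2_step2 - r2_step1
print(f"R^2 step 1 = {r2_step1:.3f}, step 2 = {r2_step2:.3f}, change = {delta_r2:.4f}")
```

A small ΔR² at step 2, as in the study's models, indicates that the social desirability score adds little explanatory power beyond the controls.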

The only model warranting further consideration of bias was the one with Effective Teaching Practices as the outcome variable; social desirability scores predicted this scale more strongly than any other. Although still small in magnitude (β = .220), this relationship might be partially explained by the similarity between these items and those found on course evaluations at many institutions. Faculty might be more likely to over-report how often they do things like "clearly explain course goals and requirements" and "provide prompt and detailed feedback on tests or completed assignments" because, when these items are asked of students in course evaluations, higher stakes are attached to the results. While this is a possible concern when interpreting results from faculty surveys, the practical significance of this connection is low.

For more information, please see the social desirability report in FSSE's Psychometric Portfolio.

Miller, A. L., & Dumford, A. D. (2017, April). Social desirability bias and faculty respondents: Is "good behavior" harming survey results? Paper presented at the Annual Meeting of the American Educational Research Association, San Antonio, Texas.

Evidence-Based Improvement in Higher Education

Center for Postsecondary Research
Indiana University School of Education
201 N. Rose Avenue
Bloomington, IN 47405-1006
Phone: 812.856.5824
Contact Us