  • NSSE Sightings (blog)

Using Effect Coding to Study Minoritized Populations

Steven Feldman

Friday, April 28, 2023

Mercer University

In my work, I am continually looking for ways to research marginalized communities that are inclusive and reflect current best practices. In particular, I have often struggled with quantitative research, which frequently requires grouping people into boxes and operationalizing rigid definitions of identity. I have similarly struggled to accept the conventional norm in quantitative research of comparing marginalized communities, as a conglomerate, to their privileged counterparts (e.g., people of color vs. white, LGBQ vs. straight, transgender vs. cisgender). Recently, I learned about effect coding as a way of mitigating some of these concerns (thank you, NSSE/FSSE/CUTE Research Scientist Allison BrckaLorenz!).

Put simply, effect coding allows researchers to compare each group to the average (mean) of the sample rather than to another group (Mayhew & Simonoff, 2015). This contrasts with indicator (dummy) coding, which uses a single group as the reference. The distinction matters most when comparing groups of participants separated by identity. In indicator coding, for example, a researcher might use white students as the reference against which all other racial groups are compared. This can be problematic in that it privileges the narrative and experiences of a single group of people. With effect coding, rather than comparing racially marginalized groups to white students, all selected racial groups are compared to the average score across them.
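The contrast between the two coding schemes can be sketched in a few lines. The data below are invented (three hypothetical groups A, B, and C), and the point is only how the intercept and coefficients change meaning: under dummy coding they are read against the reference group, while under effect coding they are read against the unweighted mean of the group means.

```python
import numpy as np

# Toy data: six observations in three hypothetical groups A, B, C.
# Group means: A = 5.0, B = 8.0, C = 3.0.
groups = np.array(["A", "A", "B", "B", "C", "C"])
score = np.array([4.0, 6.0, 7.0, 9.0, 2.0, 4.0])

# Indicator (dummy) coding with group A as the reference:
# one column for B, one for C; rows for A are all zeros.
dummy = np.column_stack([(groups == "B").astype(float),
                         (groups == "C").astype(float)])

# Effect coding: identical columns, except the reference group's rows
# are coded -1 instead of 0.
effect = dummy.copy()
effect[groups == "A", :] = -1.0

def ols(X, y):
    """Ordinary least squares with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

b_dummy = ols(dummy, score)    # intercept = mean of group A = 5.0
b_effect = ols(effect, score)  # intercept = mean of (5.0, 8.0, 3.0) = 16/3
```

In the dummy model, the coefficient for B (3.0) says "group B scores 3 points above group A"; in the effect model, the coefficient for B (about 2.67) says "group B scores 2.67 points above the average group," with no single group singled out as the baseline.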

To be sure, effect coding is not perfect. For instance, if a sample of 5,000 students has 4,000 white students and 1,000 students of color, the sample average would still lean toward the scores of white students, and therefore might still privilege white students in the analysis. Nonetheless, by removing some of the inherent power dynamics from the start, effect coding can serve as a more inclusive method of data analysis.
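The size-imbalance caveat above is easy to see numerically. The figures below are invented (a "majority" group of four scores and a "minority" group of one), and it is worth noting that effect coding comes in two variants: the weighted variant centers comparisons on the overall sample mean, which leans toward the larger group as described above, while the unweighted variant centers on the mean of the group means, counting each group once regardless of size.

```python
import numpy as np

# Hypothetical unbalanced sample: four scores from a majority group,
# one score from a minority group.
majority = np.array([6.0, 6.0, 6.0, 6.0])
minority = np.array([2.0])

# The overall sample mean is pulled toward the larger group (26 / 5 = 5.2) ...
sample_mean = np.concatenate([majority, minority]).mean()

# ... while the unweighted mean of group means counts each group once
# ((6.0 + 2.0) / 2 = 4.0), regardless of how many people are in it.
mean_of_group_means = (majority.mean() + minority.mean()) / 2
```

Which reference point is more appropriate is itself a substantive choice: the unweighted mean treats groups as equal units, while the weighted mean treats people as equal units.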

In my own research studying LGBTQ communities, this is especially helpful in avoiding the perpetual privileging of cisgender and straight voices in my analyses. Generally, I am less interested in comparing LGBTQ students with non-LGBTQ students and more interested in studying within-group differences. And while effect coding certainly feels more appropriate for me to use, I also recognize that bisexual students make up an increasingly large share of the LGBTQ community, so any comparison to the average would skew toward the scores of bisexual students. Nonetheless, this feels more appropriate than the typical alternative: using gay and lesbian students as the reference group.

As another example, Allison BrckaLorenz, Ella Chamis (a FSSE/CUTE Research Project Associate), and I conducted a study using data from the College + University Teaching Environment (CUTE) Survey to examine the affective components of the faculty environment for queer faculty, faculty of color, and queer faculty of color (BrckaLorenz et al., 2023). In this project, rather than compare faculty of color against white faculty and queer faculty against straight faculty, we effect coded the demographic variables so that each identity could be compared to the average score for faculty in the model. In our view, this was the most inclusive way to ensure that each racial and sexual identity was treated as equally as possible in our statistical model, without affording more power or privilege to a single identity.

At the end of the day, effect coding is one tool in the toolbox of analytical methods. It is not perfect, nor is it the solution to every research question, but it offers researchers a way to make critical decisions about the statistical models they build to study minoritized populations. And in quantitative research, a little care and intentionality can go a long way.

References 

BrckaLorenz, A., Chamis, E., & Feldman, S. (2023, April). Faculty feelings matter: Environmental experiences of queer faculty of color [Paper presentation]. American Educational Research Association Annual Meeting, Chicago, IL.

Mayhew, M. J., & Simonoff, J. S. (2015). Non-white, no more: Effect coding as an alternative to dummy coding with implications for higher education researchers. Journal of College Student Development, 56(2), 170–175. 
