Measuring Achievement Goal Orientation Across Student Subgroups

Angie Miller

Wednesday, June 22, 2022


The Journal of Advanced Academics recently published an “online first” article exploring achievement goal orientation for a specific student subgroup: those participating in an honors college or program. This paper was authored by NSSE research scientist Angie Miller and used experimental item data from the 2015 NSSE administration.

Achievement goal orientation is a longstanding topic of interest within educational psychology (Elliot & Hulleman, 2017). While self-report measures of the construct have been developed and validated using convenience samples of undergraduates (Elliot, 2006; Elliot & Murayama, 2008), the practical implications are frequently focused on K-12 students and classrooms. One research sub-area has compared gifted K-12 students with their general education peers, with mixed findings on whether gifted students are higher or lower in certain orientations (Huang, 2012) and some question over whether those mixed findings stem from problems with the instruments used to measure the construct (Dai et al., 1998).

This study focuses on achievement goal orientation among honors students, first exploring whether a commonly used self-report measure of the construct has adequate factor structure across subgroups and then applying two different statistical analyses to look for differences between honors students and their general education peers. Specifically, four types of achievement goal orientation were examined:

  • mastery-approach, where the goal is attaining task-based or intrapersonal competence (e.g., learning as much about the topic as possible);
  • performance-approach, where the goal is attaining normative competence (e.g., getting a better grade than everyone else);
  • mastery-avoidance, where the goal is avoiding task-based or intrapersonal incompetence (e.g., not wanting to miss out on important information about a topic); and
  • performance-avoidance, where the goal is avoiding normative incompetence (e.g., not looking "stupid" in front of others).

Data were drawn from over 3,900 first-year and 5,200 senior college students across 15 higher education institutions participating in the 2015 National Survey of Student Engagement (NSSE), using an experimental item set that included a well-known measure of achievement goal orientation (the AGQ-R). In this measure, each of the four orientations has an accompanying subscale.

First, a confirmatory factor analysis was conducted to determine whether the four-subscale model showed good fit, based on several different indices, for each of four groups: first-year honors students, senior honors students, first-year general education students, and senior general education students. The fit indices and item loadings suggested good factor structure across all of the groups, meaning that the measure is appropriate to use with both honors and general education students.
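For readers who want to try a similar check on their own data, a minimal sketch of a multi-group CFA in Python is shown below. This is not the article's actual code: it assumes the semopy package and uses hypothetical item and column names (ma1 through pav3, class_level, honors) standing in for the AGQ-R items and grouping variables, fitting the same four-factor model separately within each subgroup and inspecting the usual fit indices.

```python
import pandas as pd
import semopy

# Four-factor CFA, one latent factor per AGQ-R subscale.
# Item names (ma1 ... pav3) are hypothetical placeholders.
MODEL_DESC = """
mastery_approach      =~ ma1 + ma2 + ma3
performance_approach  =~ pa1 + pa2 + pa3
mastery_avoidance     =~ mav1 + mav2 + mav3
performance_avoidance =~ pav1 + pav2 + pav3
"""

def subgroup_fit_indices(df: pd.DataFrame) -> None:
    """Fit the same CFA separately within each class-level x honors subgroup."""
    for (level, honors), group in df.groupby(["class_level", "honors"]):
        model = semopy.Model(MODEL_DESC)
        model.fit(group)
        stats = semopy.calc_stats(model)  # chi-square, CFI, TLI, RMSEA, etc.
        print(level, honors)
        print(stats[["CFI", "TLI", "RMSEA"]])
```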

Next, independent-samples t-tests compared honors and general education students on the four achievement goal orientations. These results suggested some statistically significant differences. First-year honors students scored higher than first-year general education students on the performance-approach and mastery-approach goal orientations, and senior honors students scored higher than senior general education students on the performance-approach goal orientation. However, the effect sizes for these differences were all small in magnitude.
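As a rough illustration (again, not the article's code), the comparison for a single subscale could be run with scipy. The article reports that effect sizes were small but does not specify the formula used, so the sketch below computes a pooled-standard-deviation Cohen's d.

```python
import numpy as np
from scipy import stats

def compare_subscale(honors: np.ndarray, gen_ed: np.ndarray):
    """Independent-samples t-test plus Cohen's d for one AGQ-R subscale."""
    t, p = stats.ttest_ind(honors, gen_ed)
    # Cohen's d using the pooled standard deviation (one common choice).
    n1, n2 = len(honors), len(gen_ed)
    pooled_sd = np.sqrt(((n1 - 1) * honors.var(ddof=1) +
                         (n2 - 1) * gen_ed.var(ddof=1)) / (n1 + n2 - 2))
    d = (honors.mean() - gen_ed.mean()) / pooled_sd
    return t, p, d
```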

Although the independent-samples t-tests indicated a few significant differences for honors students on certain achievement goal orientations, it is possible that these differences are driven by other characteristics that differ between honors and general education students. To explore this further, regression analysis was used. A series of eight OLS regression models were run, separately for first-year students and seniors, with each of the four AGQ-R subscales as the outcome variable. The following covariates were included in the models to control for variation not attributable to honors participation: sex, transfer status, enrollment status, first-generation status, age, SAT/ACT score, institutional control, enrollment size, race/ethnicity, major field, grades, and online-learner status. After accounting for the variation explained by these characteristics, the models showed no significant effects of honors participation on any of the achievement goal orientation subscales, in contrast to the findings from the t-tests.
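To make the modeling setup concrete, here is a hypothetical version of one of the eight models (mastery-approach scores for first-year students) using the statsmodels formula interface. The column names are placeholders, not the variables' actual names in the NSSE data files.

```python
import statsmodels.formula.api as smf

def honors_effect_first_year(df):
    """OLS model for one outcome (mastery-approach) among first-year students,
    with honors participation plus the control variables listed above."""
    first_year = df[df["class_level"] == "first-year"]
    model = smf.ols(
        "mastery_approach ~ honors + C(sex) + C(transfer_status)"
        " + C(enrollment_status) + C(first_generation) + age + sat_act"
        " + C(institutional_control) + enrollment_size + C(race_ethnicity)"
        " + C(major_field) + grades + C(online_learner)",
        data=first_year,
    ).fit()
    # 'honors' is assumed to be coded 0/1; its coefficient is the adjusted
    # honors effect after accounting for the control variables.
    return model.params["honors"], model.pvalues["honors"]
```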

The article offers two general takeaways. The first is that researchers should continue to investigate achievement goal orientation in samples of both honors and general education students, and the AGQ-R appears to be an adequate instrument for these groups. It is important to replicate results in different contexts and populations as part of generating scientific knowledge, and it is equally important to pursue and support the publication of results that are not statistically significant. The second takeaway concerns the need to include control variables in statistical models when they are available in the data. If this study had been limited to the t-test means comparisons, the conclusions about the connection between achievement goal orientation and ability level would have been different; the lack of sufficient control variables in previous studies may in fact be the root of their mixed findings. The field should continue to investigate how K-12 models and constructs function for undergraduate students, taking care to use appropriate measures and statistical approaches to address these research questions.

Reference

Miller, A. L. (2022). Reconsidering achievement goal orientation for honors college students. Journal of Advanced Academics. Advance online publication. https://doi.org/10.1177/1932202X221086139

 

 
