Publications
The dependability of the updated NSSE: A generalizability study
Fosnacht, K., & Gonyea, R. M.
Research and Practice in Assessment, 13(Summer/Fall), 62–74, 2018.
This study used generalizability theory to identify the conditions under which the National Survey of Student Engagement's (NSSE) summary measures, the Engagement Indicators, produce dependable group-level means. The dependability of NSSE group means is an important topic for the higher education assessment community given the survey's widespread use in institutional assessment and accreditation. We found that the Engagement Indicators produced dependable group means for an institution from samples as small as 25 to 50 students. Furthermore, we discuss how the assessment community should use NSSE data. An illustrative sketch of the dependability calculation follows this entry.
Full version
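The following is a minimal sketch, not the authors' code, of how the dependability of a group mean can be estimated in a one-facet generalizability design (students nested within institutions). The variable names and simulated data are hypothetical; the formula is the standard generalizability coefficient for a group mean based on n students, with variance components taken from a balanced one-way random-effects ANOVA.

```python
import numpy as np

def group_mean_dependability(scores: np.ndarray, n_per_group: int) -> float:
    """scores: (n_groups, n_students) array with one Engagement Indicator
    score per student; a balanced design is assumed for simplicity."""
    n_groups, n_students = scores.shape
    grand_mean = scores.mean()
    group_means = scores.mean(axis=1)

    # Mean squares from a one-way random-effects ANOVA (students nested in groups)
    ms_between = n_students * np.sum((group_means - grand_mean) ** 2) / (n_groups - 1)
    ms_within = np.sum((scores - group_means[:, None]) ** 2) / (n_groups * (n_students - 1))

    # Variance components; a negative between-group estimate is truncated at zero
    var_within = ms_within
    var_between = max((ms_between - ms_within) / n_students, 0.0)

    # Dependability (generalizability) coefficient for a mean of n_per_group students
    return var_between / (var_between + var_within / n_per_group)

# Hypothetical data: 50 institutions with 100 respondents each; then ask how
# dependable a group mean based on only 25 students would be.
rng = np.random.default_rng(0)
scores = rng.normal(40, 12, size=(50, 100)) + rng.normal(0, 4, size=(50, 1))
print(round(group_mean_dependability(scores, n_per_group=25), 2))
```

Truncating a negative between-group variance estimate at zero is a common convention when the observed between-group mean square falls below the within-group mean square.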
Independent colleges and student engagement: Descriptive analysis by institutional type
Gonyea, R. M., & Kinzie, J.
Washington, DC: Council of Independent Colleges, 2015.
Critics of traditional, residential, liberal arts colleges and universities contend that this form of higher education is outmoded, too costly, and no longer educationally relevant for 21st-century students. Economies of scale, large classes taught by contingent faculty members and graduate students, and increasing reliance on technology and online learning, so the argument goes, are the only cost-effective means of meeting the educational challenges of the future. This report, prepared for the Council of Independent Colleges (CIC), draws on the most current NSSE data, collected in 2013 and 2014 from more than 540,000 first-year and senior students enrolled at more than 900 four-year colleges and universities. Findings are presented with comparisons across four institutional types: (1) baccalaureate and master's level private institutions (CIC's predominant membership profile), (2) baccalaureate and master's level public institutions, (3) doctoral private institutions, and (4) doctoral public institutions. The analysis includes measures from the updated NSSE: ten new Engagement Indicators, six High-Impact Practices, the Perceived Gains scale, and a Satisfaction scale. Findings from this study affirm the effectiveness of independent colleges and universities for undergraduate student learning. Students at private institutions are more likely than their peers at public institutions to be engaged in educationally effective experiences. Areas of distinction in the private institution undergraduate experience include a more academically challenging education, better relations with faculty members, more substantial interactions with others on campus, and the consistent perception that students have learned and grown more than their counterparts at public institutions.
Full version
Scholarly Papers
Peering into the box of grit: How does grit influence the engagement of undergraduates?
Fosnacht, K., Copridge, K., & Sarraf, S.
Association for the Study of Higher Education Annual Conference, Houston, TX, 2017, November.
Angela Duckworth's concept of grit has become a popular way for admissions leaders to incorporate non-cognitive traits into admissions decisions. Despite this popularity, the validity of grit has been questioned by numerous scholars. This study investigated the construct and concurrent validity of the Short Grit Scale (Grit-S) using a large multi-institutional sample of first-year and senior students. It also examined the measurement invariance of the Grit-S to determine whether the scale varied across populations. The results indicate that the Grit-S lacks the criterion validity needed for use in high-stakes situations. However, the scale appears to be relatively invariant across important subgroups. The concurrent validity analyses revealed that one dimension of grit, perseverance of effort, was significantly and positively correlated with the NSSE Engagement Indicators, a perceived gains scale, time spent studying, and GPA. However, the second dimension of grit was frequently negatively related to the same measures.
Full version
Social desirability bias and faculty respondents: Is "good behavior" harming survey results?
Miller, A. L., & Dumford, A. D.
American Educational Research Association Annual Meeting, San Antonio, TX, 2017, April.
Social desirability bias has long been a concern for survey researchers, with mixed findings for student self-reports in higher education. To extend this research, the current study investigates the potential presence of social desirability bias in faculty surveys. In addition to the core Faculty Survey of Student Engagement, this study used responses from a social desirability measure given to 1,574 faculty members at 18 institutions. A series of 10 stepwise OLS regression analyses of engagement indicators (controlling for other faculty and institutional characteristics) suggests that, in all cases, social desirability bias does not appear to be a major factor in faculty survey responses. However, it is also important to consider how some items may be "sensitive" for specific populations.
Full version
Maintaining inequality: An analysis of college pathways among women at large public institutions
Tukibayeva, M., Ribera, A. K., Nelson Laird, T. F., & BrckaLorenz, A.
American Educational Research Association Annual Meeting, Washington, DC, 2016, April.
Armstrong and Hamilton (2013) proposed a framework of three college pathways (party, professional, and mobility) that lead to economically unequal postgraduation outcomes and vastly different college experiences for female students. Using data from the National Survey of Student Engagement (NSSE), we examined the responses of 42,504 senior women at 183 large public four-year institutions to identify how the earning potential of their college major choices relates to the pathways. We found that the economic advantage of major choice is not equally distributed among students: party pathway students selected the least lucrative college majors, professional pathway students selected the most lucrative majors, and first-generation students on all pathways tended to select majors with less potential income than their peers with college-educated parents. Students on the three pathways also differed on three measures of academic engagement (three of the ten NSSE Engagement Indicators): Reflective and Integrative Learning, Learning Strategies, and Student-Faculty Interaction.
Full version
Learning online: Unintended consequences for engagement?
Dumford, A. D., & Miller, A. L.
Hawaii International Conference on Education, Honolulu, HI, 2016, January.
A rapidly increasing number of colleges and universities are looking for ways to deliver course content online. This paper investigates the effects of taking courses online on students' engagement, using data from the 2015 administration of the National Survey of Student Engagement (NSSE). A series of 10 OLS regression analyses, controlling for student and institutional characteristics, suggested several significant effects of taking online courses for first-year students as well as seniors. Students taking a larger share of their courses online scored higher on learning strategies and quantitative reasoning but lower on collaborative learning, student-faculty interaction, effective teaching practices, discussions with diverse others, and quality of interactions. The pattern of change in these engagement indicators with the percentage of classes taken online suggests that the online environment encourages certain types of engagement but not others. A sketch of the regression setup follows this entry.
Full version
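A minimal sketch of the kind of model described above, assuming a respondent-level data frame; the file name, variable names, and controls are illustrative, not NSSE's actual codebook. In the study this setup would be repeated for each of the ten Engagement Indicators.

```python
# Illustrative OLS regression of one Engagement Indicator on the share of
# coursework taken online, with a few student- and institution-level controls.
# Column names and the input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nsse_respondents.csv")  # hypothetical respondent-level file

model = smf.ols(
    "collaborative_learning ~ pct_online + female + first_gen"
    " + enrollment_size + C(carnegie_class)",
    data=df,
).fit()

# The pct_online coefficient estimates the change in the indicator per
# additional percentage point of online coursework, holding controls fixed.
print(model.params["pct_online"])
print(model.summary())
```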
Contextualizing student engagement effect sizes: An empirical analysis
Rocconi, L., & Gonyea, R. M.
Association for Institutional Research Annual Forum, Denver, CO, 2015, May.
The concept of effect size, a measure of the strength of association between two variables, plays a crucial role in assessment, institutional research, and scholarly inquiry, where large sample sizes commonly make small or even trivial relationships or differences statistically significant. Using the distributions of effect sizes from the results of 984 institutions that participated in the National Survey of Student Engagement (NSSE) in 2013 and 2014, the authors empirically derived new recommendations for interpreting effect sizes that are grounded in the context of the survey. We argue for the adoption of new values for interpreting small, medium, and large effect sizes from statistical comparisons of NSSE Engagement Indicators, High-Impact Practices, and student engagement data more generally. A worked example of the underlying effect size calculation follows this entry.
Full version
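A worked sketch, using simulated data, of the effect size most commonly reported for NSSE mean comparisons: Cohen's d with a pooled standard deviation, i.e., the mean difference expressed in standard-deviation units. The numbers below are hypothetical and are not drawn from the study.

```python
import numpy as np

def cohens_d(group_a: np.ndarray, group_b: np.ndarray) -> float:
    """Standardized mean difference using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = (
        (n_a - 1) * group_a.var(ddof=1) + (n_b - 1) * group_b.var(ddof=1)
    ) / (n_a + n_b - 2)
    return (group_a.mean() - group_b.mean()) / np.sqrt(pooled_var)

# Hypothetical Engagement Indicator scores for one institution vs. its
# comparison group; with comparison groups this large, even a very small d
# can be statistically significant, which motivates context-specific thresholds.
rng = np.random.default_rng(1)
institution = rng.normal(41.0, 12.0, size=400)
comparison = rng.normal(39.5, 12.0, size=50_000)
print(round(cohens_d(institution, comparison), 2))
```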
Does use of survey incentives degrade data quality?
Cole, J. S., Sarraf, S. A., & Wang, X.
Association for Institutional Research Annual Forum, Denver, CO, 2015, May.
Overall, this study found little evidence that survey incentives negatively affect data quality. Our analyses showed minimal differences between incentive and non-incentive groups with regard to straight-lining, item skipping, total missing items, and survey completion. Contradicting Barge and Gehlbach's finding, we found that incentive respondents in fact had better data quality than non-incentive respondents. Measurement invariance analysis also demonstrated that the presence of an incentive did not compromise the validity of NSSE Engagement Indicator scores or their underlying factor structures. The current study's findings, drawn from such a robust sample, should allay any serious concerns NSSE users may have about incentives undermining data quality. A sketch of two of these data-quality indicators follows this entry.
Full version
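A minimal sketch, with hypothetical file, item, and group names, of two of the data-quality indicators named above: total missing items per respondent, and straight-lining flagged as giving an identical response to every item in a same-scaled block.

```python
# Illustrative data-quality screening; all names below are hypothetical.
import pandas as pd

df = pd.read_csv("survey_responses.csv")
block = ["item_1", "item_2", "item_3", "item_4"]  # a same-scaled question block

quality = pd.DataFrame(index=df.index)
quality["n_missing"] = df.isna().sum(axis=1)              # total skipped items
quality["straightlined"] = df[block].nunique(axis=1).eq(1)  # identical answers in block

# Compare the indicators across incentive and non-incentive respondents
print(quality.groupby(df["incentive_group"]).mean())
```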
Using canonical correlation analysis to examine student engagement and learning
Zilvinskis, J., Masseria, A., & Pike, G. R.
Association for Institutional Research Annual Forum, Denver, CO, 2015, May.
Using canonical correlation analysis, this study examines the relationships between measures of student engagement from NSSE and perceived gains in learning. The study draws on institution-level data from NSSE participants in 2011 and 2013. Several significant relationships were found between engagement and learning. For example, learning outcomes associated with application, such as acquiring job-related skills, were positively associated with the engagement indicators of quantitative reasoning and collaborative learning. This presentation also provides attendees with an introduction to the logic and methods underlying canonical correlation analysis; a brief sketch follows this entry.
Full version
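A minimal sketch of canonical correlation analysis using simulated institution-level data (the dimensions and values are hypothetical): the method finds paired weighted composites of an engagement block (X) and a perceived-gains block (Y) whose correlations are maximized.

```python
# Illustrative canonical correlation analysis on simulated data.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(2)
# Hypothetical institution-level data: 300 institutions, 10 Engagement
# Indicator means (X) and 6 perceived-gains means (Y) sharing a common factor.
latent = rng.normal(size=(300, 1))
X = latent @ rng.normal(size=(1, 10)) + rng.normal(size=(300, 10))
Y = latent @ rng.normal(size=(1, 6)) + rng.normal(size=(300, 6))

cca = CCA(n_components=2)
X_scores, Y_scores = cca.fit_transform(X, Y)

# Each canonical correlation is the correlation between a pair of variates.
for k in range(X_scores.shape[1]):
    r = np.corrcoef(X_scores[:, k], Y_scores[:, k])[0, 1]
    print(f"Canonical correlation {k + 1}: {r:.2f}")
```

Inspecting the weights on each pair of variates shows which engagement measures and which perceived gains drive each canonical relationship.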
What is the impact of smartphone optimization on long surveys?
Sarraf, S., Brooks, J., Cole, J., & Wang, X.
American Association for Public Opinion Research Annual Conference, Hollywood, FL, 2015, May.
Using results from a ten-institution experiment that used the 2015 National Survey of Student Engagement (NSSE), this study details the impact that smartphone optimization has on a survey with over 100 questions. The study's research questions center on how optimizing a survey for smartphones affects various data quality indicators including early abandonment, missing data, item nonresponse, duration, straight-lining, and subjective student evaluations. The study also investigates scale measurement invariance for NSSE's ten Engagement Indicators. Results indicate that the smartphone-optimized survey format outperforms both smartphone-unoptimized and desktop survey formats in several ways.
Full version
Presentations
First-Year Seminars: Evidence of HIP Qualities and Outcomes
Jillian Kinzie and Kevin Wenger
First Year Experience Annual Conference, 2022, February.
First-Year Seminars (FYS) are positively associated with persistence and student success. Yet their content, form, and outcomes can vary. This session first highlights research exploring FYS students' exposure to the eight essential dimensions theorized to define High-Impact Practices (HIPs) as a way to prompt enhancements to FYS quality, and then discusses the association between FYS and outcomes such as sense of belonging, intent to return, and NSSE Engagement Indicators as evidence of the contribution of FYS. Join us to discuss the implications of this research and to give input on this new item set.
Full version
20 years of student engagement: Insights about students, assessment, and college quality
Kinzie, J., Gonyea, R., & McCormick, A.
Assessment Institute 2019, Indianapolis, IN, 2019, October.
In 2020 the National Survey of Student Engagement enters its third decade of assessing the quality of undergraduate learning and success. In 20 years, the student engagement movement has surely changed our notions of quality in higher education: most institutions now value a culture of evidence, promote deep approaches to learning, develop high-impact practices, and track engagement indicators. This session reviews the most important findings about student engagement from the past two decades and asks participants to consider what engagement will look like in the next decade. What is next for assessing quality in undergraduate education and collecting evidence for improvement?
Full version
Using NSSE results to inform campus plans to expand high-impact practices and assess impact
Kinzie, J., & Zilvinskis, J.
Assessment Institute, Indianapolis, IN, 2016, October.
Many campuses are planning to increase students' participation in high-impact practices (HIPs). This session will explore how National Survey of Student Engagement (NSSE) institutional reports and annual results can be used to inform institutional efforts to plan, design, and assess HIPs, and will include considerations regarding entering students' expectations for HIPs, differences in participation by student characteristics, and patterns of participation by major. Working through a case study, we will discuss how NSSE results can inform campus design and evaluation of HIPs.
Full version
The relationship of on-campus living with student engagement
Gonyea, R. M., Graham, P., & Fernandez, S.
ACUHO-I Conference, Orlando, FL, 2015, June.
On-campus living has traditionally been recognized as beneficial to the undergraduate student experience. However, as higher education, and with it residence life, evolves to meet the needs of a new generation of students, it is important to reassess the impact of on-campus living on student learning and development, seeking an understanding of successes and areas of underperformance. Using data from the 2013 and 2014 administrations of the National Survey of Student Engagement (NSSE), including over 300,000 first-year and senior students from 973 institutions, this session compares students who live on campus with their counterparts in other housing arrangements. We share initial findings related to NSSE's ten Engagement Indicators and other key engagement measures as we explore new research questions and focus areas for a report co-published with ACUHO-I in fall 2015 in support of its research agenda.
Full version
The NSSE update: Analysis and design of ten new Engagement Indicators
BrckaLorenz, A., & Gonyea, R.
Indiana Association for Institutional Research Annual Conference, Indianapolis, IN, 2014, March.
Full version
NSSE Engagement Indicators: A conversation about transition and use
Rocconi, L. R., & Kinzie, J.
Southern Association for Institutional Research Conference, Memphis, TN, 2013, October.
Full version
NSSE Engagement Indicators: A conversation about transition and use
BrckaLorenz, A., & Gonyea, R.
Association for Institutional Research Annual Forum, Long Beach, CA, 2013, May.
With the update to the National Survey of Student Engagement instrument in 2013, new measures of engagement were rigorously tested to replace the historic Benchmarks of Effective Educational Practice. Participants discuss and compare the overall content of these new Engagement Indicators to see how the updated content has been added, retained, or rearranged from the benchmarks. Participants also discuss the challenge of longitudinal comparability of individual questions and indicators, and learn new ways to evaluate longitudinal questions. Discussion focuses on three questions: What are the compositions and properties of NSSE's new Engagement Indicators? How do the Engagement Indicators relate to the NSSE Benchmarks? How can institutions transition between using these two measures of engagement?
Annual Results
Oregon Institute of Technology: Exploring General Education and Learning Outcomes
In Engagement insights: Survey findings on the quality of undergraduate education—Annual results 2016, 4.
Full version
Institution-Level Correlations: Engagement, Retention, and Graduation
In Engagement insights: Survey findings on the quality of undergraduate education—Annual results 2015, 12.
Full version
Seniors' Post-Graduation Plans Influenced by Major and Participation in High-Impact Practices
In Engagement insights: Survey findings on the quality of undergraduate education—Annual results 2015, 6.
Full version
Engagement Indicators and High-Impact Practices: New Measures to Assess the Educational Experience
In A fresh look at student engagement—Annual results 2013, 8.
Full version
Introducing the Updated NSSE Survey for 2013
In Promoting student learning and institutional improvement: Lessons from NSSE at 13—Annual results 2012, 15.
Full version
Webinars
NSSE & Coronavirus 2020: Preliminary Analysis Results and Recommendations
Bob Gonyea, Brendan Dugan, Shimon Sarraf, and Kevin Fosnacht
May 18, 2020.
Recording
An Overview of the NSSE Summary Tables
Angie Miller
February 21, 2019.
Recording
Moving Beyond Statistical Significance: Using Effect Sizes in NSSE
Louis Rocconi, Research Analyst & Bob Gonyea, Associate Director
June 23, 2015.
Recording
The NSSE update: Analysis and design of ten new Engagement Indicators
Bob Gonyea and Allison BrckaLorenz
April 29, 2014.
Recording
Using NSSE data for accreditation
Jillian Kinzie and Cindy Ahonen
March 26, 2014.
Recording
Need more answers about NSSE 2013?
Jillian Kinzie, NSSE Associate Director, and Allison BrckaLorenz, FSSE Project Manager and NSSE Research Analyst
December 4, 2012.
Recording