Publications
Can Shorter Surveys Motivate Nonrespondents to Respond? A Randomized Controlled Experiment with College Students
Sarraf, Shimon
Atlanta, GA, 2024.
During spring 2023, NSSE conducted a randomized controlled experiment with nonrespondents to assess the impact of administering shorter surveys on response rates and question completion. This poster documents the experiment's major findings and conclusions.
Full version
How important are high response rates for college surveys?
Fosnacht, K., Sarraf, S., Howe, E., & Peck, L. K.
The Review of Higher Education, 40(2), 245–265, 2017.
Surveys play an important role in understanding the higher education landscape. However, declining survey participation rates threaten this source of vital information and its perceived utility. Although survey researchers have long assumed that the best way to obtain unbiased estimates is to achieve a high response rate, many have begun to question the widely held assumption that low response rates produce biased results. Due to the prevalence of survey data in higher education research and assessment efforts, it is imperative to better understand the relationship between response rates and data quality. This study investigates this assumption with college student assessment data. It utilizes data from hundreds of samples of first-year and senior students with relatively high response rates using a common assessment instrument with a standardized administration protocol. It investigates how population estimates would have changed if researchers had put forth less effort when collecting data and achieved lower response rates and respondent counts.
Full version
An alternative approach: Using panels to survey college students
Sarraf, S. A., Hurtado, S., Houlemarde, M., & Wang, X.
AIR Professional File, (Fall), Article 138, 2016.
Eight short surveys based on select items from the National Survey of Student Engagement (NSSE) were administered to approximately five hundred students over a nine-week period at five diverse colleges and universities. The goal of the experiment was to investigate what impact a survey panel data collection approach would have on recruitment, survey data quality indicators, and scale properties. Results indicated higher response rates, shorter survey duration, and minimal impact on scale factor structures. However, both cost of incentives and panel member attrition make this alternative survey method less attractive than it would be otherwise.
Living with smartphones: Does completion device affect survey responses?
Lambert, A. D., & Miller, A. L.
Research in Higher Education, 56(2, Special Forum Issue), 166–177, 2015.
With the growing reliance on tablets and smartphones for internet access, understanding the effects of completion device on online survey responses becomes increasingly important. This study uses data from the Strategic National Arts Alumni Project, a multi-institution online alumni survey designed to obtain knowledge of arts education, to explore the effect that the type of device a respondent uses (PC, Mac, tablet, or smartphone) has on their responses. Differences by device type in the characteristics of survey respondents, survey completion, time spent responding, willingness to answer complex and open-ended questions, and lengths of open-ended responses are discussed.
Full version
Scholarly Papers
Survey lottery incentives and institutional response rates: An exploratory analysis
Sarraf, S., & Cole, J.S.
Association for Institutional Research Annual Forum, Orlando, FL, 2014, May.
Many institutional and educational researchers are well aware that response rates for assessment surveys have been declining over the past few decades (Dey, 1997; Laguilles, Williams, & Saunders, 2011). As a result, many researchers have noted that our ability to adequately assess student academic experiences, satisfaction, engagement, use of campus resources, and other important topics in higher education is at risk (Pike, 2008). Consequently, incentives are one tool that many institutional researchers have come to rely on to boost or hold steady their response rates for various campus student surveys. Though research regarding the efficacy of incentives to boost survey response rates in higher education is scant, the research that does exist suggests that incentives are an effective way to boost institutional response rates (Heerwegh, 2006; Laguilles, Williams, & Saunders, 2011). The purpose of this study is to investigate the efficacy of lottery incentives (the most frequently used incentive approach) to boost response rates for institutions using the National Survey of Student Engagement (NSSE).
Full version
How important are high response rates for college surveys?
Fosnacht, K., Sarraf, S., Howe, E., & Peck, L.
Association for Institutional Research Annual Forum, Long Beach, CA, 2013, May.
How important are high survey response rates for estimating population statistics related to the college experience? Given a general decline in survey participation rates among college students, the answer to this question has broad implications for institutional researchers who often use surveys. Survey methodologists have found that low response rates do not necessarily bias results. This study tests this proposition using results from about 250 colleges and universities that administered NSSE. Findings indicate that survey population estimates based on simulated low response rates are very similar to those based on actual high response rates.
Full version
How much effort is needed? The importance of response rates for estimating undergraduate behaviors
Fosnacht, K., Sarraf, S., Howe, E., & Peck, L.
American Educational Research Association Annual Meeting, San Francisco, CA, 2013, April.
Survey methodologists have found that low response rates do not necessarily bias results. This study tests this proposition using results from several hundred colleges that administered a student survey. Findings indicate that survey population estimates based on low response rates are very similar to those based on high response rates.
Full version
Presentations
Understanding and Improving Survey Data Quality: Key Insights from NSSE
Sarraf, Shimon
Overseas Chinese Association for Institutional Research (OCAIR), 2021, February.
What has NSSE learned over the past ten years related to understanding and improving survey data quality? This presentation covers various topics, including the effect of survey incentives, learning management systems, and campus promotions on survey response, the importance of optimally formatting long surveys for small screen devices, the effectiveness of panel data collection, and how to evaluate response rates.
Full version
How are survey response rates changing? Findings from NSSE
Sarraf, S.
Association for Institutional Research Annual Forum, Denver, CO, 2019, May.
How much have survey response rates changed at colleges and universities over the last decade? Using a National Survey of Student Engagement longitudinal data set (2010 to 2018) based on approximately 1,000 institutions, this study investigates how much response rates have changed over the years, the degree of variability in response rates within any given year, and what factors influence these outcomes, such as school undergraduate enrollment and the use of survey incentives. While reviewing this poster presentation, attendees will also gain an appreciation for one statistical method well suited for understanding change over time: latent growth curve modeling.
Full version
Assessing small populations: Recognizing everyone counts in your counts
BrckaLorenz, A., & Hurtado, S.
Student Affairs Assessment and Research Conference, Columbus, OH, 2018, June.
Quantitative and survey research depends heavily on large sample sizes, but a focus on the "average student" in quantitative analyses often hides diverse voices. Participants in this session will discuss common issues and solutions associated with giving voice to small populations of college students (e.g., gender variant, multiracial, LGBQ+). Participants will discuss administration issues related to small populations such as increasing response rates, identifying special subpopulations, and writing more inclusive survey questions. Tips for disaggregating, responsibly aggregating, and choosing inclusive comparative information will be provided. Additionally, participants will discuss strategies for analyzing and communicating about the results from small populations as well as approaches for communicating about the validity and data quality from small sample sizes.
Full version
Assessing small populations: Recognizing everyone counts in your counts
BrckaLorenz, A., Fassett, K., & Hurtado, S.
Association for Institutional Research Annual Forum, Orlando, FL, 2018, May.
Quantitative and survey research depends heavily on large sample sizes, but there are a variety of reasons why larger sample sizes may not be possible. Participants in this presentation will discuss common issues and solutions associated with assessing small populations of college students and instructors. Examples will focus on the experiences of gender variant and LGBQ+ students and faculty. Participants will also learn about and discuss administration issues related to small populations such as increasing response rates and identifying special subpopulations. Next, participants will learn about and discuss strategies for analyzing and communicating the results from small populations. Finally, participants will learn about and discuss approaches for communicating the validity and data quality from small sample sizes.
Improving survey data quality through experimentation
Sarraf, S., & Cole, J.
Association for Institutional Research Annual Forum, Washington, DC, 2017, May.
With help from a large and diverse group of colleges and universities over the past decade, the National Survey of Student Engagement has conducted various randomized experiments aimed at improving survey response rates and minimizing missing data. This session provides an overview of this effort and a summary of major findings. Attendees will gain a better understanding of various issues that could help with developing their own surveys, including smartphone optimization, email subject lines, progress indicators, survey page length, and using learning management systems for recruitment.
Full version
Improving survey data quality through experimentation
Sarraf, S., & Cole, J.
North East Association for Institutional Research Annual Conference, Baltimore, MD, 2016, November.
With help from a large and diverse group of colleges and universities over the past decade, the National Survey of Student Engagement has conducted various randomized experiments aimed at improving survey response rates and minimizing missing data. This session provides an overview of this effort and a summary of major findings. Attendees will gain a better understanding of various issues that could help with developing their own surveys, including smartphone optimization, email subject lines, progress indicators, survey page length, and using learning management systems for recruitment.
Collecting, analyzing, and reporting on data from small populations
BrckaLorenz, A., Hurtado, S., & Nelson Laird, T.
Association for Institutional Research Annual Forum, New Orleans, LA, 2016, June.
Quantitative and survey research depends heavily on large sample sizes, but there are a variety of reasons why larger sample sizes may not be possible. Participants in this session will discuss common issues and solutions associated with assessing small populations of college students and instructors, with considerations for special subpopulations (gender variant, multiracial, etc.) as well as considerations for small institutions. Participants will also learn about and discuss administration issues related to small populations such as increasing response rates and identifying special subpopulations. Next, participants will learn about and discuss strategies for analyzing and communicating about the results from small populations. Finally, participants will learn about and discuss approaches for communicating the validity and data quality from small sample sizes.
Full version
Analysis of student-level NSSE response rates based on pre-college engagement
Gonyea, R. M., & Korkmaz, A.
Association for Institutional Research Annual Forum, Seattle, WA, 2008, May.
Webinars
NSSE & Coronavirus 2020: Preliminary Analysis Results and Recommendations
Bob Gonyea, Brendan Dugan, Shimon Sarraf, and Kevin Fosnacht
May 18, 2020.
Recording
NSSE & FSSE 2020: Guidance on COVID-19 Disruptions
Alex McCormick, Jillian Kinzie, Bob Gonyea, Allison BrckaLorenz, Shimon Sarraf, and Jennifer Brooks, NSSE and FSSE Project Staff
March 26, 2020.
Recording
Promoting in a Crunch: Increasing NSSE Response Rates
Dajanae Palmer & Bridgette Holmes
February 3, 2020.
Recording
Survey Recruitment Using Student Portals and Learning Management Systems
Shimon Sarraf, Assistant Director for Survey Operations and Project Services
September 29, 2016.
Recording
Improving student participation rates: What we've learned about incentives and promotion
Shimon Sarraf, Assistant Director
October 2, 2014.
Recording
Encouraging student participation in NSSE
Brian McGowan, NSSE Institute Project Associate
February 28, 2012.
Recording
Improving student response rates
Tony Ribera and Brian McGowan
January 25, 2011.
Recording