Maximizing Your Number of Respondents Effectively and Ethically
Total survey completions, respondent representativeness, and response rates are important measures of data quality that can increase confidence in survey results. To maximize response, NSSE works with institutions to personalize survey recruitment messages, emphasizing the survey's value for institutional improvement. A variety of promotional techniques can effectively complement NSSE's customary recruitment messages. For example, posting survey links in learning management systems has substantially boosted response rates at some participating institutions. NSSE also encourages institutions to coordinate their NSSE administration with other institutional survey initiatives, because students are more likely to complete surveys when they are not suffering from survey fatigue. Whatever approach an institution takes to increase response, it should always respect a student's decision not to participate. The remainder of this page reviews helpful ideas and practices that institutions should consider before embarking on a successful NSSE administration.
First Step: Consider Your Campus Culture & Acceptable Practices
Each institution needs to assess its campus culture to determine the optimal methods for reaching its students; there is no single correct way to increase student participation. Dillman (2007) identifies several important factors that contribute to higher response rates:
- Students' perception of the survey's importance
- Student interest in the survey's content
- Fostering respondent trust
- Increasing perception of rewards for participation
- Decreasing perceptions of respondent burden and survey fatigue
The above factors are worth thinking about and addressing in survey promotion. Of course, survey publicity such as flyers and media articles as well as participation incentives can help send a message to the whole campus that the data are valuable for institutional improvement. We outline suggestions for increasing response rates below, as well as practices that should not be used because they can result in undue influence on participation.
Acceptable Survey Promotion Practices
- Customizing survey invitation messages (Cook, Heath, & Thompson, 2000)
- Including small guaranteed incentives and lotteries in contact materials (Millar & Dillman, 2011; Laguilles, Williams, & Saunders, 2011; Sarraf & Cole, 2014)
- Using various promotional materials, such as flyers, press releases, and videos (Sarraf & Cole, 2014)
- Informing students that survey responses will be kept confidential (Dillman, Singer, Clark, & Treat, 1996)
- Sending multiple reminders to complete the survey (Dillman, 2007; Schaefer & Dillman, 1998)
Empirical research on college students and other populations indicates that incentives are effective. Some NSSE institutions report that incentives appear to encourage student response, and published controlled experiments likewise support the effectiveness of incentives with college student populations (Laguilles, Williams, & Saunders, 2011). NSSE's own research supports this conclusion (Sarraf & Cole, 2014).
Planning to use an incentive for your NSSE administration? If so, you will be asked to include information in recruitment messages and other promotional materials that helps prospective respondents determine their odds of winning and the value of the prize(s) (e.g., “There will be drawings for two iPod Nanos ($149 value) and two $100 bookstore gift cards. About 400 students have been invited to participate in NSSE.”).
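The odds disclosure in the example above is simple arithmetic; as a sketch, the following snippet works it out for the example lottery (prize counts and invitee numbers are taken from the sample wording; the helper name is illustrative, and the calculation assumes every invitee enters and each entrant can win at most one prize):

```python
def lottery_odds(num_prizes: int, num_entrants: int) -> float:
    """Approximate probability that a given entrant wins a prize.

    Assumes all invitees enter the drawing and no entrant wins twice;
    actual odds improve if fewer students respond than were invited.
    """
    return num_prizes / num_entrants

# Example from the text: 2 iPod Nanos + 2 gift cards = 4 prizes,
# drawn from about 400 invited students.
odds = lottery_odds(num_prizes=4, num_entrants=400)
print(f"Approximate odds of winning: {odds:.1%}")  # prints "Approximate odds of winning: 1.0%"
```

Stating the result plainly in recruitment materials ("about a 1 in 100 chance of winning") gives prospective respondents the information they need to weigh participation.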
Unacceptable Survey Promotion Practices (per NSSE's IRB protocol)
- More than eight direct requests to participate in the survey
- Direct and personal contact by campus officials encouraging individual student response
- Revoking rights or privileges for non-response (e.g., blocking course or housing registration)
- Improperly describing NSSE as an anonymous survey. (For the vast majority of institutions, identified student responses are provided to institutional staff, so survey responses are not anonymous.)
Institutions' Survey Promotions
To help you develop your own NSSE promotional materials, we have assembled examples of how colleges and universities have promoted the NSSE survey on campus, including a variety of activities:
- Posting flyers on campus
- Sending a press release to student and local newspapers
- Submitting an article or placing an ad in the student newspaper
- Submitting an editorial, written by a student or administrator, to the student newspaper explaining why the survey is important, what its purpose is, how the results will be used, and that the survey is sponsored by the institution
- Posting notices on school websites or learning management systems
- Creating a Facebook group or Twitter account for sharing results and updating students
- Placing signs on campus buses or along local transportation routes
- Setting up tables in the student center or union with survey information
Ethical Considerations in Survey Research
NSSE is both an institutional improvement effort and a research project, and as such, student participation in NSSE must be fully voluntary. Participants in research studies should always be informed of their rights relative to their participation and should know any potential risks of involvement in the study. NSSE provides this information in its contacts with students (see the informed consent statement). Additional efforts you make to increase response rates should never cause students to feel they will be penalized for not participating.
The Belmont Report (1979) established guidelines that led to the creation of Institutional Review Boards (IRBs) to regulate research involving human subjects. The report outlined three considerations for ethical research: (a) maximize possible benefits, (b) minimize possible harms, and (c) equitably distribute the risks and rewards of research.
Inappropriate efforts to increase participation in research fall into two general categories: coercion and undue influence. Coercive interactions are those that imply directly or indirectly that a potential participant might lose rights or privileges for not participating in the study. Explicit examples of coercive measures include requiring students to complete the survey in order to register for classes or graduate. Implicit examples include the use of language such as “you must complete this survey” or “students with real NSSEville State University pride would complete this survey.”
Undue influence occurs when the incentives used to increase participation become the primary reason students participate, or when the number or nature of attempts to encourage participation is excessive. Compensation should not be so great that students participate because they need the incentive, even though they would otherwise consider completing the survey a heavy burden.
Small incentives provided to each student who completes the survey are generally considered ideal. Larger incentives such as lotteries that draw prize winners from all participants may also be acceptable, but publicity for these efforts should include an estimate of the odds of winning. Survey promotion for these incentives should not emphasize the prizes to a degree that minimizes the requirement of survey participation.
Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement, 60(6), 821-836.
Dillman, D. A. (2007). Mail and Internet Surveys: The Tailored Design Method (2nd ed.). Hoboken, NJ: John Wiley & Sons.
Dillman, D. A., Singer, E., Clark, J. R., & Treat, J. B. (1996). Effects of benefits appeals, mandatory appeals, and variations in statements of confidentiality on completion rates for census questionnaires. Public Opinion Quarterly, 60, 376-389.
Laguilles, J. S., Williams, E. A., & Saunders, D. B. (2011). Can lottery incentives boost web survey response rates? Findings from four experiments. Research in Higher Education, 52, 537-553.
Millar, M. M., & Dillman, D. A. (2011). Improving response to web and mixed-mode surveys. Public Opinion Quarterly, 75, 249-269.
National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. (1979, April 18). The Belmont Report: Ethical principles and guidelines for the protection of human subjects of research. Washington, DC: Department of Health, Education, and Welfare. http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html
Sarraf, S., & Cole, J. S. (2014). Survey lottery incentives and institutional response rates: An exploratory analysis. Paper presented at the annual forum of the Association for Institutional Research, Orlando, FL.
Schaefer, D. R., & Dillman, D. A. (1998). Development of a standard e-mail methodology: Results of an experiment. Public Opinion Quarterly, 62(3), 378-397.