NSSE Data User’s Guide

Your Guide to Using NSSE Data

The improvement efforts of colleges and universities are most promising when they are based on evidence of the performance and experience of their students inside and outside the classroom. In addition, institutions are expected to present evidence of their achievements, and of how they use data to inform improvement, in response to heightened demands for accountability and mounting pressures to increase student persistence and completion, support diversity, and ensure high-quality learning for all students.

The National Survey of Student Engagement (NSSE) provides institutions with data and reports about critical dimensions of educational quality. Whether a campus is interested in assessing the amount of time and effort students put into their studies or the extent to which students utilize learning opportunities on campus, NSSE provides colleges and universities with diagnostic, actionable information that can inform efforts to improve the experience and outcomes of undergraduate education.

NSSE results can inform and structure conversations aimed at enhancing student learning and success across campus offices and projects, including enrollment management and retention, marketing and communications, faculty development, learning support, and student housing. As an assessment instrument, NSSE can be used to identify areas of strength as well as opportunities for growth, helping to align learning and the campus environment with student needs and expectations.

Making NSSE data accessible and useful is key to engaging various campus audiences in identifying and analyzing institutional and program shortcomings and for developing targeted strategies for continuous improvement—critical steps in institutional growth and change. How can institutions determine who is interested in NSSE results? What are the best ways to connect campus groups and committees with this information? What audiences could use this information in responding to campus challenges and opportunities?


Conceptual Approach

Getting good data, communicating what the data mean to invested parties, and using the data accordingly are critical steps in institutional change and achieving improved educational outcomes. The approach known as “double-loop learning” (Argyris & Schön, 1996) informs this work and involves the creation of “communities of practice” (Lave & Wenger, 1991)—practitioners engaged in dialogue to share experiences, identify problems, and learn with and from each other. It is through this kind of collaboration that community members can take ownership in institutional issues and work together to help institutions grow.

An early step in making the best use of NSSE data and reports is to revisit your institution’s rationale for participating in NSSE. What motivated your NSSE participation, and what does the campus intend for the results? Knowing whether your campus plans to use NSSE for accreditation, for routine assessment, or for student retention efforts is important to determining where data are most relevant and to informing audiences eager to use the results.

Using NSSE data effectively also requires accurately interpreting the results and disseminating the interpretations along with the results to people who can do something about student engagement. Simply reporting NSSE results will not lead to action. Many institutions have found that sharing results at retreats, faculty workshops, first-year experience task force meetings, and other group gatherings is a productive way to stimulate interest and action.

Sharing NSSE Results

Fully participating in NSSE is more than simply joining in the survey administration and receiving the results. Disseminating NSSE results to relevant audiences and committees across campus is arguably one of the most important steps of NSSE participation. Consider the many opportunities on your campus for sharing NSSE results, including:

  • assessing institutional performance
  • monitoring academic standards
  • providing evidence for accountability and transparency
  • informing improvement efforts
  • monitoring students’ exposure to effective educational practices
  • supporting student learning and development
  • developing cohort experiences for groups of students
  • facilitating student retention and engagement
  • guiding staff development efforts
  • managing resources, programs, and services
  • fostering other stakeholder engagement
  • improving internal communication
  • marketing to prospective students
  • communicating with alumni

The most effective uses of NSSE results take into consideration how dissemination is most likely to enhance education policy and practice. This involves identifying the audiences and contexts that surround the reporting activities.

To facilitate campus presentations of NSSE results, participating institutions are encouraged to use the customizable PowerPoint presentation provided in their Institutional Report package. The PowerPoint template for NSSE 2020 is available for viewing and download.

NSSE results make more sense when audiences have a basic understanding of the concept of student engagement. Research shows that engagement—the time and energy students devote to educationally purposeful activities—is the best single predictor of student learning and personal development. Higher levels of student engagement result from certain institutional practices, the best known set of which is the Seven Principles for Good Practice in Undergraduate Education (A. W. Chickering & Z. F. Gamson, 1987, AAHE Bulletin, 39[7], 3–7):

  1. Encouraging student-faculty contact
  2. Developing cooperation among students
  3. Using active learning techniques
  4. Giving prompt feedback
  5. Emphasizing time on task
  6. Communicating high expectations
  7. Respecting diverse talents and ways of learning

Emphasizing good educational practice helps focus faculty, staff, students, and others on the tasks and activities associated with greater gains in desired student learning outcomes.

It is important to answer any questions that may arise regarding the validity and reliability of the NSSE survey before introducing the data and results to the workshop group. Staff may more readily accept the findings and consider changes to their practice if such questions are adequately addressed before the workshop begins.

The validity of self-reported data can be affected by the ability of respondents to provide accurate and truthful information in response to questions. Research shows that people generally tend to respond accurately on questions about their past behavior unless the questions are sensitive or make them uncomfortable. The validity of self-reported time estimates has also been examined. To provide survey respondents a frame of reference, NSSE items include specific periods of time to aid memory recall and to reduce the distortion that may occur when respondents remember events over time. Further research suggests that self-reported data are valid under five conditions, all of which NSSE was designed to satisfy:

  1. The requested information is known to respondents
  2. The questions are phrased clearly and unambiguously
  3. The questions refer to recent activities
  4. The respondents take the questions seriously
  5. The questions do not threaten, embarrass, or violate respondents’ privacy

The “halo effect”—which may account for satisfied students inflating performance, grades, or personal gains and efforts on surveys—appears to be fairly consistent across student populations. Thus, although what students report may differ somewhat from what they actually do, this effect does not appear to advantage or disadvantage one institution or student group compared with another.

Further information about research on self-reported data is in the Psychometric Portfolio. The portfolio provides a framework for presenting studies on the validity, reliability, and other indicators of quality of NSSE’s data, including analysis of data subsets defined by a variety of student and institutional characteristics.

When new NSSE users receive their results and reports, they may not know where to begin, or they may first wonder, “What are we doing well?” If you are reviewing NSSE reports from 2001 to 2012, we recommend starting with your Benchmark Comparison report, which lets you compare results both within your institution and against comparison groups. Next, look at the results for the individual questions that comprise each benchmark; in particular, it may be helpful to review the items with the greatest frequency of “Very often” versus “Never” responses. If you are reviewing NSSE reports from 2013 or later, we recommend starting with your Snapshot report. Sharing this report with a wide campus audience may be a good first step, but providing campus units or groups with more tailored results, or conducting interactive presentations using some of the worksheets in this user’s guide, may help stimulate interest in additional results. The tips displayed in the box below may help your data dissemination initiatives.

Educational improvement is the primary goal in using NSSE. If your NSSE results are less than favorable this year, sharing the data with appropriate institutional stakeholders is even more important. If results are not shared, campus administrators may remain in the dark on crucial educational issues. For example, if NSSE results reveal that students are not interacting with academic advisors or faculty members as frequently as administrators had hoped, sharing the NSSE results is an important step in starting the change process.

For campuses that seek to use NSSE to its full potential, receiving NSSE’s detailed reports and student data files is not the end of a process. Rather, it signals the beginning of the next phase: using NSSE results. After data collection has concluded, the real work begins—making meaning from the results, identifying priorities for action, formulating concrete plans for improvement, and implementing those plans. At whatever point in this process your campus may be, we encourage you to take full advantage of all that NSSE provides.

Communicating Results to Create Action for Change

  • Meet with key stakeholders individually or in a small group. Before meeting with stakeholders, be sure to send NSSE results ahead of time and ask them to bring a copy to the meeting.
  • Contextualize the data or compare them with previous years’ administration results. Consider what other institutional data you can link to NSSE data. 
  • Work with stakeholders, relevant committees, and/or departments to create specific goals and action plans using NSSE results. Make sure that students, individuals who work in this area, and campus representatives who may have an interest in NSSE results are included in communications. 
  • Implement the plan and monitor progress. Progress can be monitored via focus groups, informal surveys, and interviews.
  • Form an “action team” (faculty and staff) to spearhead administration and promotion and to help further analyze results. Host a lunch for a preliminary discussion of team members’ roles and expectations.
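The “contextualize” step above often means linking NSSE results to other institutional data, such as retention records. A minimal sketch of that linkage as a join on a shared student identifier is shown below; the field names (`student_id`, `engagement_score`, `retained`) are invented for illustration and do not reflect NSSE’s actual data file layout:

```python
# Hypothetical sketch: joining NSSE respondent records to institutional
# records by a shared student ID. All field names here are illustrative,
# not NSSE's actual variable names.

def link_records(nsse_rows, institutional_rows, key="student_id"):
    """Return NSSE rows merged with matching institutional rows."""
    inst_by_id = {row[key]: row for row in institutional_rows}
    linked = []
    for row in nsse_rows:
        match = inst_by_id.get(row[key])
        if match:
            # Merge the two dictionaries; institutional fields are added
            # alongside the NSSE fields for the same student.
            linked.append({**row, **match})
    return linked

# Illustrative data only
nsse = [{"student_id": "A1", "engagement_score": 3.4}]
records = [{"student_id": "A1", "retained": True}]
print(link_records(nsse, records))
```

In practice, institutional researchers would perform this merge in their statistical package of choice; the point is simply that a common identifier lets NSSE responses be examined alongside outcomes such as retention.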

Overcoming Potential Obstacles to Using NSSE Results Effectively

Converting assessment information into action is a challenge for all colleges and universities.

Obstacle: Small number of respondents

Check the demographics in your NSSE respondent file to compare the sample with your campus population. Review the sampling error. In future administrations, consider ways to increase the number of respondents and promote survey participation.
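One simple way to make this comparison concrete is to compute each group’s share of the respondent sample minus its share of the campus population. The sketch below assumes simple count dictionaries; the group labels and numbers are invented for illustration:

```python
# Hypothetical sketch: checking how well NSSE respondents represent the
# campus population. Group labels and counts are illustrative only.

def representation_gaps(sample_counts, population_counts):
    """Return each group's share of the sample minus its share of the
    population, in percentage points (positive = overrepresented)."""
    n_sample = sum(sample_counts.values())
    n_pop = sum(population_counts.values())
    gaps = {}
    for group in population_counts:
        sample_share = 100 * sample_counts.get(group, 0) / n_sample
        pop_share = 100 * population_counts[group] / n_pop
        gaps[group] = round(sample_share - pop_share, 1)
    return gaps

# Illustrative numbers only
respondents = {"Female": 420, "Male": 180}
campus = {"Female": 5500, "Male": 4500}
print(representation_gaps(respondents, campus))
```

A large positive or negative gap for a group suggests weighting the results or, at minimum, noting the imbalance when sharing findings with campus audiences.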

Obstacle: Questions about validity and reliability

NSSE has conducted a number of studies to document the validity, reliability, and other indicators of quality of NSSE’s data, including analyses by various student and institutional characteristics.

Obstacle: Limited capacity to analyze and report results

The reports that NSSE sends institutions can be packaged and shared with faculty and staff with little additional work. All data files, reports, and supporting documents related to NSSE Institutional Reports are available in electronic format, which allows for easy print or electronic distribution.

Obstacle: “Average” results across the board

Try using a different comparison group or consider a criterion-based approach to determine the degree to which student performance is consistent with institutional expectations. Analyze results by subgroups to reveal internal variation.
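A subgroup breakdown of this kind can be sketched with a simple grouped mean. The record structure below is invented for illustration (NSSE data files use their own variable names); the point is that a middling overall average can mask sizable differences between subgroups:

```python
# Hypothetical sketch: breaking an "average" overall result into subgroup
# means to reveal internal variation. Field names and response values
# are invented for illustration.
from collections import defaultdict

def subgroup_means(records, group_field, value_field):
    """Return the mean of value_field for each value of group_field."""
    totals = defaultdict(lambda: [0.0, 0])  # group -> [sum, count]
    for rec in records:
        t = totals[rec[group_field]]
        t[0] += rec[value_field]
        t[1] += 1
    return {g: round(s / n, 2) for g, (s, n) in totals.items()}

# Illustrative records: the overall mean sits near the midpoint, but the
# two subgroups differ substantially.
responses = [
    {"first_gen": "Yes", "collab_learning": 2.0},
    {"first_gen": "Yes", "collab_learning": 2.2},
    {"first_gen": "No", "collab_learning": 3.6},
    {"first_gen": "No", "collab_learning": 3.8},
]
print(subgroup_means(responses, "first_gen", "collab_learning"))
```

Here the overall mean of about 2.9 would look unremarkable, while the subgroup means show one group reporting far less engagement than the other, which is exactly the kind of internal variation worth surfacing.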

Obstacle: Lack of faculty awareness of or interest in learning about and using student engagement results

Consider administering the Faculty Survey of Student Engagement (FSSE) as a way to look at student engagement from the faculty perspective. Results may be useful for discussions at a retreat or workshop. Also, make available a summary of the literature on the value of effective educational practices.