How Do the Changes Affect Comparisons with Prior-to-2013 Results?
Even the best surveys must be periodically revised and updated, affecting multi-year analyses such as trend studies or pre-post designs. Although many items remained unchanged, others were modified and a few were dropped, limiting longitudinal comparability of individual questions and historical benchmarks.
While some new results are not directly comparable to pre-2013 results, institutions can still evaluate longitudinal questions. For instance, if previous comparison group results indicated above-average performance in a particular area, an institution can still gauge whether it continues to outperform the same or a similar comparison group.
We are confident that these updates enhanced NSSE's value to institutions. NSSE will continue to provide useful resources and work with participating institutions to ensure maximum benefit from survey participation.
How Have Institutions Transitioned to the Updated NSSE?
Early lessons from the updated survey indicate that the redesigned reports, including the Snapshot, are easy to interpret and share; the new Engagement Indicators provide concise summary measures; and the results offer institutions fresh insights into effective practice and opportunities for improvement. Lessons from the Field: Transitioning to the Updated NSSE highlights how four institutions used results from the updated survey.
Why Update NSSE?
After a decade in the field, we knew more about what matters to student success, institutional improvement efforts, and properties of the NSSE survey itself. Moreover, as higher education faces increasing demands for assessment data, NSSE must stay relevant to current issues and concerns.
Long-time NSSE participants may recall that NSSE was updated regularly in the early years. Starting in 2005, however, we kept the survey largely unchanged as a practical matter for institutions to facilitate year-to-year comparisons. Our intention has been to roll out major updates at longer-term intervals, as we did in 2013. This approach balances the need of institutions to have year-to-year comparisons with NSSE’s need to respond periodically to changes in the higher education landscape, informed by a methodical research and development process.
Development of the Updated Survey
The multi-year process of planning and testing was completed in 2013. This process was scholarly, rigorous, and collaborative, and involved:
- Consulting with campus users and a variety of field experts
- Gathering ideas and feedback from other interested partners
- Examining NSSE’s psychometric properties, including six years of experimental items
- Conducting cognitive interviews and focus groups with students
- Pilot testing in 2011 and 2012

The updated NSSE instrument was unveiled at the AIR Forum on June 4, 2012.