The Spring 2020 disruption caused by the coronavirus affected every aspect of undergraduate education, including our best-laid assessment plans. Rapidly changed circumstances prompted the addition of an asterisk to information associated with the spring term: semester grade records, end-of-course evaluations, licensure exam scores, and assessment data, among others. Adding descriptive footnotes to these reports and files is important for good data documentation.
Although the COVID-19 footnote should explain our drastically changed circumstances, it would be a mistake to treat information collected during this period as unusable, incomparable, or simply a break in the trend line. Rather, the information should be examined for what the disruption revealed about our institutions and our students' learning experiences, and for how it could inform educational design moving forward.
NSSE 2020 data provide particular insights from pre- and post-disruption respondents. While the majority of colleges and universities participating in NSSE 2020 recorded most of their student responses prior to the disruption, some students responded afterward, and institutions with late administration schedules saw more post-disruption responses. There is no denying that most post-disruption learning experiences were quite different. Yet, since NSSE generally asks about experiences "during the current school year," respondents' comprehension, retrieval, and judgment about survey questions drew on a broader time frame. This likely explains why our analysis of the coronavirus disruption's effect on response rates and engagement results showed minimal impact; in particular, the effect on Engagement Indicator scores appears negligible. While this analysis provides some assurance of data quality and continuity of results, it would be a missed opportunity not to mine NSSE 2020 results for insights about what the disruption revealed in undergraduate education.
Here are just a few ideas to stimulate your thinking:
Compare Pre- and Post-Disruption Responses to Understand What Your Data Represent. Assuming you have enough respondents in both categories, this analysis allows your campus to understand who and what your NSSE 2020 data represent. Explore whether your descriptive response patterns align with our preliminary analysis findings, and whether there are pre-post differences in your Engagement Indicator scores. How do your EIs compare to previous NSSE administrations? If they differ, is it appropriate to attribute this to the coronavirus disruption, or does it reflect other institutional changes? Doing this comparison and documenting findings and interpretations will be important to ensure that spring data are used appropriately for assessment and improvement.
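A pre-post comparison like this can be sketched in a few lines of pandas. This is a minimal illustration only: the column names (ei_score for an Engagement Indicator score and post_disruption as a 0/1 flag) are hypothetical stand-ins, and the small data frame stands in for your institution's NSSE 2020 data file.

```python
import pandas as pd

# Hypothetical records standing in for an institutional NSSE 2020 data file;
# "ei_score" and "post_disruption" are illustrative column names, not the
# actual variable names in NSSE deliverables.
df = pd.DataFrame({
    "ei_score": [42.0, 38.5, 40.0, 41.2, 39.8, 37.5, 36.0, 38.2],
    "post_disruption": [0, 0, 0, 0, 1, 1, 1, 1],
})

# Split respondents into pre- and post-disruption groups
pre = df.loc[df["post_disruption"] == 0, "ei_score"]
post = df.loc[df["post_disruption"] == 1, "ei_score"]

# Compare group means (a formal significance test could follow if group
# sizes warrant it)
print(f"Pre-disruption mean:  {pre.mean():.2f} (n={len(pre)})")
print(f"Post-disruption mean: {post.mean():.2f} (n={len(post)})")
print(f"Difference: {pre.mean() - post.mean():.2f}")
```

With real data, the same split also supports comparing individual item frequencies or running a t-test, depending on how many post-disruption respondents your administration yielded.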
Focus on Post-Disruption Responses for Insights into Remote Education. Who are your post-disruption respondents, and what can you learn about their engagement and the shift to a remote experience? What is the relationship between their engagement results and the intent-to-return variable, or their final grades and other outcome assessments? What sentiments did these students express in their comments at the end of the survey? Drops in engagement scores may suggest the aspects of the educational experience most challenged in the radical shift to remote learning, and looking within this population might expose the challenges experienced by students in certain majors or student populations. While our analysis shows minimal impact overall, you may find differences among your students that reveal the impact of the change itself (having to move home, adjusting to remote learning, getting access to resources, changed expectations, etc.). What might this suggest for your campus's future plans for a hybrid or remote undergraduate experience?
Highlight Institutional Strengths to Inform Planning for Academic Year 2020-21. Your strongest-performing items and highest Engagement Indicators in NSSE 2020 results could be viewed as a reminder of what your institution does well. Let these data point to aspects of your undergraduate experience that should be attended to as your campus plans for a modified Academic Year 2020-21. For example, if particular aspects of student-faculty interaction or collaborative learning strategies among students are a proven strength at your campus, consider what might need to be bolstered or reimagined to ensure these notable aspects of your campus culture can continue.
To help with your planning, we will post NSSE 2020 raw data files to the Institution Interface around mid-July. With your data in hand, consider using respondents' time stamp or your "covid" variable based on your disruption date (if provided; otherwise the default date of 3/23/20 was used) to compare results before and after the disruption and in relation to findings in NSSE's preliminary analysis.
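Deriving a pre/post flag from the time stamp might look like the following sketch. The column name response_timestamp is hypothetical (use whatever time stamp field appears in your raw data file), and the sample dates are invented for illustration; only the 3/23/20 default disruption date comes from the text above.

```python
import pandas as pd

# Default disruption date from NSSE (3/23/20); substitute your campus's
# date if you provided one.
DISRUPTION_DATE = pd.Timestamp("2020-03-23")

# Hypothetical records standing in for the NSSE 2020 raw data file;
# "response_timestamp" is an illustrative column name.
df = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "response_timestamp": pd.to_datetime(
        ["2020-02-14", "2020-03-10", "2020-03-25", "2020-04-02"]
    ),
})

# covid flag: 1 = responded on or after the disruption date, 0 = before
df["covid"] = (df["response_timestamp"] >= DISRUPTION_DATE).astype(int)
print(df[["student_id", "covid"]])
```

The resulting covid flag can then drive any pre/post comparison, such as group means on Engagement Indicators or response-rate checks against NSSE's preliminary analysis.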
We're planning a webinar in early August to encourage NSSE 2020 data analysis and hope to feature an exchange of ideas among NSSE users. Please contact me at Jikinzie@indiana.edu if you have an analytical insight or approach to share, or a question to pose for this webinar.
We are eager to help you consider how your NSSE data contribute to your ongoing assessment plans and how they can help you study the impact of the disruption to inform future planning.