Lessons From the Field is a growing repository of practical ideas for NSSE institutions' evidence-based assessment and improvement initiatives. Read more than 120 data use stories: in volumes 1-4 via links to the full publications, in the most recent Lessons from the Field Dispatches, and in the institution examples below.
Institution Examples by Publication:
Lessons from the Field - Dispatch #4
UWF Uses NSSE Data to Design a Self-Study on HIP Participation Barriers
Involving more students in high-impact practices (HIPs) is a university-wide commitment at the University of West Florida (UWF). Assistant Professors April Schantz and Holley Handley recently collaborated on an intra-institutional assessment, using data from the National Survey of Student Engagement (NSSE) to examine the barriers preventing UWF students from participating in HIPs. With the support of its HIP liaisons, UWF actively works to break down silos and share practices across each college.
Handley also serves as a HIP liaison for the College of Education.
“We are the communicator, the connector between what's going on at each department level in the university to try to share experiences, to crowdsource, and help faculty learn from each other to truly make high-impact practices institutionalized at our university,” said Handley. Together, the HIP liaisons have developed a HIP faculty toolkit, workshops, book clubs, and newsletters. The HIP Barriers self-study grew out of this university-wide commitment to high-impact practices and a project to connect UWF students with internships.
Schantz recalled, “We made internships available, and we promoted, and we got the word out through the advisors, flyers, and everything. And then we ended up with very little participation. And we're like, what's going on here? So, we started with some focus groups just asking the students what their experience with the internship had been. And one of the responses was, ‘Internship? There was an internship?’ Like, okay, somewhere, we're missing the conversation and missing that connection with students. But what are the other barriers students are facing?”
The focus groups illuminated barriers such as time and financial constraints, as well as limited awareness of what HIPs are and where and how to engage in them.
“We wanted to scale up and incorporate the information we do have, and this is all part of the organization or the institution's goals and commitment to high-impact practices overall. In fact, we have a whole team of high-impact practices liaisons and representatives from each college, who helped guide and serve as kind of an advisory board and support for us,” Schantz said.
Part of that available information was their NSSE data. NSSE provides participating institutions with student engagement data that can be benchmarked against peer institutions of similar size and Carnegie classification. However, what UWF needed was a way to look inward. Handley and Schantz designed an instrument to assess their own performance on high-impact practices. The project investigated UWF’s student engagement across three levels:
Compared to expectations according to NSSE.
Differentially across college units in the institution.
According to students' behaviors and perspectives.
“This was a full institution project collaboration to bring in the information from our students from our colleges within the institution and compare it to the NSSE data,” Schantz said. Schantz and Handley’s instrument collected qualitative and quantitative data from students across all class levels at UWF. They also asked questions to align student responses to their respective colleges.
“That's what enabled us to get the summary statistics per college within the institution. This gave us great insight – both in comparing the NSSE data to our overall college – but also comparing within colleges the information on which HIP types were most frequently used within each college as a function of the student’s program. So, this was great information,” Schantz said.
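The per-college summaries Schantz describes boil down to grouped aggregations over student-level responses. The sketch below, in Python with pandas, is a minimal illustration of that idea under assumed data; the column names, values, and layout are hypothetical, not UWF's actual instrument or code.

```python
# Minimal sketch: per-college HIP participation summaries from
# student-level survey responses. All names and data are hypothetical.
import pandas as pd

# One row per student: the student's college plus a 0/1 flag for each
# HIP type they reported participating in.
df = pd.DataFrame({
    "college":          ["Arts", "Arts", "Business", "Sciences", "Health", "Health"],
    "internship":       [1, 0, 1, 0, 0, 1],
    "research":         [1, 1, 0, 0, 0, 0],
    "service_learning": [0, 0, 0, 1, 1, 1],
})

hip_cols = ["internship", "research", "service_learning"]

# Participation rate for each HIP type within each college -- the kind of
# per-college summary statistic described in the quote above.
per_college = df.groupby("college")[hip_cols].mean()

# Institution-wide rates, for comparison against external NSSE benchmarks.
overall = df[hip_cols].mean().rename("institution_overall")

print(per_college)
print(overall)
```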
The qualitative items provided information to identify accessibility issues and relevant barriers to student engagement in high-impact practices. With this research design, they compared their prior-year NSSE administration data to the internal data collection to look for exemplars within the institution. What they found was:
Overall, the institution was doing well on collaborative projects, e-portfolios, internships, and intensive writing.
The College of Arts and Social Sciences engaged significantly in several HIP types, including undergraduate research, e-portfolios, internships, capstones, and intensive writing.
The College of Business had the most success promoting engagement with diversity through study abroad compared to the other colleges.
The College of Sciences excelled in promoting engagement in first-year experiences and learning communities.
The College of Health had the highest service- and community-based learning engagement across all colleges.
“So all of the colleges showed their favorite HIP types. And if we're able to use that or leverage it at an institutional level, we can get those exemplars to help our other areas see their opportunities,” Schantz said.
To examine student behaviors and perspectives on how they engage in high-impact practices, the team used latent class analysis to identify four patterns of HIP engagement and find areas of opportunity. The richness of their data collection gives the project many avenues for continuing to improve HIP participation at UWF. Schantz and Handley anticipate that the next step is to evolve their assessment strategy toward the quality of high-impact practices rather than just the quantity of what's offered.
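For readers curious about the method: latent class analysis of binary participation indicators amounts to fitting a mixture of independent Bernoulli distributions with the EM algorithm. The sketch below is a minimal, self-contained illustration of that technique on toy data; it is not the UWF team's actual model, software, or results, and applied work would typically use a dedicated LCA package and model-selection criteria (e.g., BIC) to choose the number of classes.

```python
# Minimal LCA sketch: mixture of independent Bernoullis fit with EM.
# Illustrative only -- not the UWF team's model or results.
import numpy as np

rng = np.random.default_rng(0)

def fit_lca(X, n_classes, n_iter=200, tol=1e-6):
    """X: (n_students, n_hips) 0/1 matrix of HIP participation flags.
    Returns class weights, per-class item probabilities, posteriors."""
    n, d = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)           # class weights
    theta = rng.uniform(0.25, 0.75, (n_classes, d))    # P(HIP = 1 | class)
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: posterior class memberships under independent Bernoullis
        log_px = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        log_joint = np.log(pi) + log_px
        log_norm = np.logaddexp.reduce(log_joint, axis=1, keepdims=True)
        resp = np.exp(log_joint - log_norm)
        # M-step: re-estimate weights and item probabilities
        nk = resp.sum(axis=0)
        pi = np.clip(nk / n, 1e-9, 1.0)
        theta = np.clip((resp.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
        ll = log_norm.sum()
        if ll - prev_ll < tol:   # log-likelihood has converged
            break
        prev_ll = ll
    return pi, theta, resp

# Toy data with two planted engagement patterns (UWF identified four;
# two keeps the demo short). Columns might be internship, research, ...
true_theta = np.array([[0.8, 0.7, 0.6, 0.1, 0.1, 0.1],
                       [0.1, 0.1, 0.2, 0.7, 0.8, 0.6]])
z = rng.integers(0, 2, size=400)
X = (rng.random((400, 6)) < true_theta[z]).astype(float)

pi, theta, resp = fit_lca(X, n_classes=2)
print("class weights:", pi.round(2))
print("per-class HIP participation probabilities:\n", theta.round(2))
```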
“We still have some opportunities because less than half of the students participated in any of the HIP types. So even though we have all this support from the HIP liaisons in our institution and from faculty and everything, we have opportunities. That was really good information to have,” said Schantz.
For more information or questions about this project, Schantz and Handley can be reached at aschantz@uwf.edu and hhandley@uwf.edu.
Amplifying Student Voice in Assessment
Getting students involved and invested in promoting, interpreting, and disseminating National Survey of Student Engagement (NSSE) data is a proven strategy for deepening the impact of results.
Lindenwood University’s Office of Academic Effectiveness created a student internship, the Student Assessment Scholars program, to increase student involvement in assessment. The Student Assessment Scholars program uses a real-world approach to institutional assessment that promotes student-led assessment research (Elder et al., 2023).
Student Assessment Scholars is a yearlong program consisting of a credit-bearing fall internship focused on developing students’ research skills. The program introduces students to the complete life cycle of a research or assessment project – including completing CITI training to work with human subjects and seeking IRB approval. Campus stakeholders then apply to work with a team of student assessment scholars to elicit student input about their issue or program.
In the spring semester, student assessment scholars complete a 2-credit internship. During this time, they conduct focus groups and interviews, analyze the data for important themes, and write up results in an insight report, presentation, or poster. Scholars have worked on projects examining involvement and engagement at Lindenwood, general education revision, and student perceptions of the humanities.
In 2022-2023, Lindenwood’s Office of Institutional Research worked with the program to contextualize and disseminate NSSE results on campus. The explicit emphasis on NSSE offered a chance to improve faculty’s and students’ understanding of student engagement and how it is realized at Lindenwood. The scholars’ project educated students on NSSE’s content and terminology, encouraged participation, and provided general information on Lindenwood’s past NSSE participation. The scholars presented their findings at Elevate Leadership, a student leadership program. They also reviewed student comments and historical NSSE results and helped create a website to disseminate results.
Lindenwood University's student assessment scholars highlight the value of student voice in planning, conducting, and sharing assessments. The program demonstrates how engaging undergraduate students in assessment projects allows for more authentic and informed action and creates opportunities to participate in multiple high-impact practices. Lindenwood University provides students an engaging opportunity to participate in undergraduate research, have collaborative experiences with students and faculty, and participate in credit-bearing internships.
References:
Elder, R., Shilling, A., & Waters, M. (2023, October). Student voice in assessment: Collaborating with Student Assessment Scholars to gain feedback and change how NSSE data are shared [Conference presentation]. Assessment Institute, Indianapolis, IN.
Memorial University (Newfoundland, Canada) NSSE 2020 Video
Sharing NSSE Results in a Video
Memorial University in Newfoundland, Canada, created an attractive and informative video introducing NSSE and interpreting its NSSE 2020 results, focusing on data that captured the attention of the ad hoc committee of faculty and staff formed to review and interpret the results.
Memorial participates in NSSE every three years, having first done so in 2008, and shares highlights of its results, along with a Fall 2020 report from the NSSE Review Committee, on the Centre for Institutional Analysis and Planning website.
The ad hoc committee’s review of results emphasized engagement as a priority in the institution’s Teaching and Learning Framework and its goals for a learner-centered culture. With a 38% response rate, the committee felt confident identifying Memorial’s successes and areas for improvement while considering how the results align with the priorities in the Teaching and Learning Framework. The committee found it most meaningful to use the Canadian Comprehensive Universities as a comparison group, since these universities share similar characteristics with Memorial.
The committee identified notable successes, including students’ overall satisfaction, the quality of their interactions with faculty, and their experiences with academic advising and learning support services. For example, Memorial students are more satisfied with their experiences than students in the comparison group, with 80% of first-year and 83% of fourth-year students rating their educational experience as “excellent” or “good.” They also report higher-quality interactions with faculty and with academic advisors than students in the comparison group.
While it's important to acknowledge and celebrate successes, it's even more important to understand where there is room to grow. Three areas to improve engagement are high-impact practices (HIPs), first-year experiences, and diversity. In the 2020 NSSE data, HIPs appear relatively strong for fourth-year students but less so for first-year students, suggesting that incorporating HIPs into first-year courses and curricula could be a priority.
Memorial's first-year students also appear to be working less with their classmates on things like team projects and assignments, and participating less in classroom discussions. These results suggest that first-year students may require greater support to engage meaningfully with their peers, and that instructional practices should be designed to help students transform their course preparation into active learning, promoting opportunities for greater participation, interaction, and cooperation.
A third area of focus to improve student engagement at Memorial is diversity. Students report having fewer diverse interactions, yet the majority indicate that their university experiences have contributed to their knowledge, skills, and personal development in understanding people of other backgrounds. Given this information, it would be valuable to further investigate curricular strategies that support students' experiences with and understandings of diversity, and to create meaningful opportunities for more students to work and engage with diverse others.
NSSE results provide Memorial with a gauge for success and a helpful guide for strengthening students’ educational experience.
Sharing NSSE Data Widely with Support from the Institute for Teaching and Learning
Youngstown State University (YSU), in Youngstown, Ohio, has participated in NSSE for many years, which allows the university to track trends in student perceptions and behavior. NSSE data points are reported to the Board of Trustees as Key Performance Indicators for the university strategic plan. Many of YSU’s co-curricular units use NSSE data in their assessment reporting, and some academic departments and colleges use NSSE data for both accreditation and assessment reports.
A multi-year analysis of NSSE data, particularly around Learning Strategies and Effective Teaching Practices, combined with data from other student surveys, was a key part of the rationale for creating YSU’s Institute for Teaching and Learning. From 2013 to 2018, YSU saw downward trends in the Learning Strategies and Effective Teaching Practices Engagement Indicators for both first-year and senior students (at the same time, their incoming student body had higher GPA and ACT scores). Results from YSU’s 2018 Faculty Survey of Student Engagement (FSSE) revealed faculty engaging in continuous improvement activities at lower rates than desired. In 2019, a team of stakeholders visited academic departments on campus to explore these trends in more detail and find out what faculty needed in terms of institutional support for effective teaching. Using this faculty feedback, the Institute for Teaching and Learning was launched, bringing together assessment of student learning and faculty development initiatives on campus. Since 2019, the Institute has supported 838 unique faculty, staff, administrators, and students on YSU’s campus through workshops and consultation. Importantly, NSSE and FSSE 2021 results are already showing improvements in some indicators that YSU believes point to the impact of the Institute’s programming and services. For example, FSSE results indicate that faculty are spending more time on active-learning activities than on lecture.
Maximizing NSSE Data Use
YSU uses its NSSE data in a variety of other ways to engage the campus community. Some examples include:
As a Key Performance Indicator (KPI) for their Board of Trustees, they report multi-year data on seniors’ ratings of their entire educational experience, as well as all students’ responses to the “Would you go to the same institution?” question.
In 2021, they looked specifically at data from 2018 to 2021 to further analyze the series of questions on belonging. Prior work in this area, using the Inclusiveness module, had been conducted in 2018; those data were used to explore trends before and during COVID and to discuss ways to help students feel valued and part of the campus community after remote and hybrid instruction. These data were shared with various stakeholders, from campus leadership to student leaders, to discuss strategies for building community. In both years, they also disaggregated the data by various student characteristics. This past spring, a group of graduate students worked on an additional survey for student veterans to explore their campus belonging, since student veterans rated their perception of value and belonging lower than the campus average.
YSU recently launched a new first-year seminar course, YSU 1500. They matched NSSE data with data from their Division of Student Success to begin exploring the impact of first-year student participation in that course.
In 2021, YSU also administered the Academic Advising module with their regular NSSE administration and saw first-year students reporting more positive perceptions of advising than YSU seniors and first-year students at peer institutions. Disaggregated NSSE and module data were triangulated with retention data and direct feedback collected by the Division of Student Success.
YSU’s Institute for Teaching and Learning also offers department- and college-level disaggregation of NSSE data for any campus stakeholder who requests it for accreditation purposes. Recently, the office provided disaggregated NSSE data to the Williamson College of Business to support its work toward Association to Advance Collegiate Schools of Business (AACSB) accreditation. Institutionally, YSU is accredited through the Higher Learning Commission and uses NSSE data at various points as evidence in its assurance argument.
Sharing NSSE Results More Widely
YSU makes its NSSE data and an array of supporting resources available on its website. The institution’s NSSE Snapshot report and results from two Topical Modules – Academic Advising and Inclusiveness and Engagement with Cultural Diversity – are summarized in a report titled “Key Module Takeaways.” To continue disseminating NSSE data and results across campus and engaging various stakeholders, they created a webinar, shared online, to supplement these documents. The Institute for Teaching and Learning also encouraged people to review the webinar and Snapshot before attending its spring Lunch & Learn Data Conversations, which were open to the entire campus community. Internally, they also created data reports and summaries shared with the Student Government Association, campus leadership, deans, the Division of Student Success, First-Year Student Services, Athletics, Veterans Affairs, Housing, the Senate Teaching and Learning Committee, student organization leaders, Student Presidential Mentors, and the Division of Student Affairs. They also created disaggregated reports as requested by colleagues and departments.
The Institute for Teaching and Learning staff are primarily responsible for disseminating NSSE data on campus, actively sharing tailored data, presentations, and reports with campus stakeholder groups. Overall, YSU’s continued efforts to reduce gatekeeping of NSSE data and to find creative ways to share this information with relevant partners across campus continue to prove beneficial.
Using NSSE to Increase Conversations Among Campus Partners
Marian University has continued to use NSSE data in a variety of ways. Recently, we connected with Dr. Tony Ribera, Director of Educational Assessment, who shared more about how Marian shares data with the entire campus. Marian had done quite a bit with data from its past administration (see Lessons from the Field Dispatch #2), and in the last year the university has been more intentional, using a comprehensive lens to look at everything happening on campus. Marian held its first roundtable data share, a half-day event to analyze and understand all the data gathered from the NSSE administration.
Looking at various themes in comparison to peer institutions allowed for discussions about what needed to happen on campus and where the university was falling short. Dr. Ribera also shared that Marian already had an event called the Assessment Showcase, dedicated to faculty presenting the research and scholarship from grants they receive for teaching and learning. This year, they combined the two and redesigned the event as an assessment day: the first half of the day was dedicated to faculty presentations, and the latter half brought various stakeholders together for a roundtable discussion.
“We set up an internal Canvas shell and gave access to relevant stakeholders on campus, so they could access the various reports and data before we had the roundtable data share meeting,” said Dr. Ribera. The Canvas shell was created three weeks in advance of the meeting, which allowed enough time for all participants across campus to review the reports and data. They also used the discussion feature on Canvas to allow for conversations and interactions prior to the meeting. At the campus event, they began with a general overview, then divided everyone into smaller groups to discuss areas of strength and areas needing improvement, as well as the ideas shared through the online Canvas shell. The in-person meeting brought together over 40 faculty, staff members, alumni, and students.
Marian also administered the Inclusiveness and Engagement with Cultural Diversity module and participated in the Catholic Colleges and Universities Consortium, both of which yielded data that contributed to a rich conversation at the roundtable meeting. Last year, Marian University also joined the Higher Education Data Sharing Consortium, which provided additional sources of data comparison alongside the reports their institutional research office produces internally. “Marian is very learning centered and really just wants to do the best for students,” said Dr. Ribera, who also provided an analysis of quantitative data his office created and identified themes similar to the findings from their 2019 data. Using NSSE data in different ways to engage campus partners is an effective practice, and Marian University continues to demonstrate this through its Center for Teaching and Learning.
Mapping HIP Participation
Clemson University is committed to supporting students’ participation in High-Impact Practices (HIPs). Students have a wide variety of opportunities that provide real-world, hands-on, problem-solving experiences, such as Creative Inquiry, ClemsonThinks2, service-learning, cooperative education, and the University Professional Internship Program, among others. In the institution’s current 10-year strategic plan, ClemsonForward, they outline a plan to continue these opportunities while integrating student engagement more fully into the curriculum and instructional program. To take Clemson’s HIPs to the next level, a committed team of faculty and staff set out to identify gaps related to the HIP taxonomy on campus, explore disparate data collection, determine evaluation activities and opportunities for collaboration, and promote common reporting to advance efforts toward the ClemsonForward Engagement goals.
For years, Clemson had wanted to produce a report that connects their NSSE data with data that can be verified at the institution and disaggregates HIP participation by student demographic factors.
By engaging in this project, Clemson was able to map, define, and identify data while also understanding where and with whom engagement was happening and which areas needed improvement. This work entailed partial use of institutional-level data, partial use of programmatic-level data, and some good old-fashioned walking around campus to talk to those who held pieces of the student engagement puzzle. Much of the data was not centralized, which had prevented Clemson from learning more about the high-impact practices taking place across campus. They began by identifying the institutional scope of HIPs and conducting a two-part focus group with faculty and staff stakeholders. They also made use of the partial data they had and combined it with data available through various campus offices. The longitudinal aspect of the project was carried out with an entering cohort of students by identifying each student’s involvement in high-impact practices each semester for the entire time they spent at the institution, or for six years, until they left with or without a degree.
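Mechanically, a longitudinal design like this amounts to assembling a student-by-semester panel from participation records scattered across offices. The pandas sketch below is a minimal illustration of that step under assumed inputs; the record layout, identifiers, and HIP labels are hypothetical, not Clemson's actual systems.

```python
# Minimal sketch: building a longitudinal student-by-semester HIP panel
# for one entering cohort. All names and records are hypothetical.
import pandas as pd

# One row per (student, semester, HIP) participation event, as might be
# collected from registrar records and program offices.
events = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s2", "s3"],
    "semester":   ["2016FA", "2017SP", "2016FA", "2018FA", "2017SP"],
    "hip":        ["first_year_seminar", "undergrad_research",
                   "first_year_seminar", "internship", "study_abroad"],
})

# Wide panel: rows = student/semester, columns = HIP types, values = 0/1.
panel = (events.assign(flag=1)
               .pivot_table(index=["student_id", "semester"],
                            columns="hip", values="flag", fill_value=0))

# Number of distinct HIP types each student ever reached -- one way to ask
# whether (and how broadly) students in the cohort engaged over time.
hips_per_student = panel.groupby("student_id").sum().gt(0).sum(axis=1)

print(panel)
print(hips_per_student)
```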
With an intent to provide a roadmap for other institutions seeking to fully understand engagement and drive change on their campuses, they decided to focus on activities that fit the criteria of high-impact practices.
The list of HIPs includes activities such as first-year seminars and experiences, learning communities, student-faculty research, study abroad and diversity/global learning, service-learning, internships, senior experiences/capstones, common intellectual experiences, writing-intensive courses, and collaborative assignments and projects. However, through their research they also determined that simply creating a list of HIP activities does not by itself produce significant change at an institution. Quality institutional alignment for HIPs has implications for improved curricular planning, strategic initiative generation, and democratization of data for decision-making. Through discussions at their own institution and with counterparts at other institutions, they found that integrated data systems for HIPs and collaboration across units on shared values are essential to an engaged institution, a belief also supported in the literature (Kinzie et al., 2015; Kinzie & Franklin, 2020; Nadasen & Alig, 2021).
Kinzie, J., Cogswell, C. A., & Wheatle, K. I. E. (2015). Reflections on the state of student engagement data use and strategies for action. Assessment Update, 27(2), 1–16. https://doi.org/10.1002/au.30013
Kinzie, J., & Franklin, K. (2020). Twenty years of NSSE data use: Assessment lessons for the collective good. Assessment Update, 32(2), 4–15. https://doi.org/10.1002/au.30206
Nadasen, D., & Alig, J. (2021). Data analytics: Uses, challenges, and best practices at public research universities. Retrieved from https://www.aplu.org/library/data-analytics-uses-challenges-and-best-practices-at-public-research-universities/file
Trogden, B. G., Kennedy, C., & Biyani, N. K. (2022). Mapping and making meaning from undergraduate student engagement in high-impact educational practices. Innovative Higher Education. https://doi.org/10.1007/s10755-022-09608-7
At Marian University, assessment and learning are viewed as inextricably linked, so much so that the Director of Educational Assessment, Dr. Tony Ribera, works within Marian’s Center for Teaching and Learning and serves on the Teaching and Learning Committee along with faculty and staff from across departments. This committee includes an assessment activities subcommittee whose members serve as key players in examining and sharing NSSE results and information.
Tasked with the review of Marian’s NSSE 2019 results, the assessment activities subcommittee chose to focus on both the quantitative and qualitative data during multiple subcommittee meetings. Members used this time, as well as a shared Canvas page, to read and discuss the NSSE data and reports, including students’ comments in response to the survey’s open-ended prompts. Although these comments were analyzed informally, impactful themes and trends emerged from them.
While the quantitative data suggested that students were engaging with diverse peers relatively often, the qualitative data—the student comments—provided another dimension and a more detailed picture of diversity on campus.
The open-ended prompt Marian students received for their comments was:
“What one change would most improve the educational experience at this institution?”
As members of the assessment activities subcommittee viewed the approximately 200 responses, a theme began to emerge around the topic of diversity, equity, and inclusion. Students expressed concerns related to who is valued, what is valued, and the importance of reflecting more diverse identities on campus.
It became clear to the subcommittee that Marian University could and should do better by its students and—based on student comments—that many of these changes were possible to implement. Since the review and presentation of NSSE 2019 data, several changes have occurred on the Marian campus related to inclusivity and learning. The institution has established a new framework, created by faculty and staff, to outline what a learning experience at Marian should look like and to guide educators in creating more impactful learning. This new framework incorporates inclusive practices such as ensuring students feel valued and respecting diverse perspectives. The Center for Teaching and Learning has also developed a professional development certificate focused on creating inclusive experiences. Those interested in receiving the certificate are required to participate in four events around topics such as implicit bias and inclusive pedagogy. Thus far, over 100 educators at the institution, including faculty and staff, have registered.
These initiatives have not been limited to introducing strategies for how students learn but also include new strategies for assessing inclusivity in teaching. More recently, the university has emphasized inclusive teaching in its annual program assessment planning and reporting process as well as in its seven-year self-study process. Each program is developing its own assessment plan that will ask explicitly what the program is doing to foster inclusive classroom environments.
Throughout this process, strong connections have been created between faculty and staff. Student affairs professionals have utilized their expertise to facilitate workshops and to assist faculty in understanding concepts related to diversity, equity, and inclusion. Although difficult conversations have occurred during the workshops, the facilitators have been able to expertly handle these situations and to help participants grow in their understanding of inclusivity in the classroom. These new initiatives have been helpful in creating strong relationships across student affairs and academic affairs.
The Teaching and Learning Committee and Marian University as a whole have worked diligently to respond to students’ feedback and to make concrete changes to the campus. Utilizing quantitative and qualitative NSSE results allowed for a more nuanced and intentional approach to teaching and learning—with a stronger focus on inclusivity.
Creating Readable, Meaningful and Actionable NSSE and FSSE Reports
Florida Agricultural and Mechanical University (FAMU), a Historically Black College and University (HBCU), found a way to disseminate their NSSE and FSSE results in an easy-to-read, meaningful format that engaged a range of campus entities. In fact, FAMU’s inclusive use of NSSE and FSSE results was a hallmark of the institution’s comprehensive approach to assessment that distinguished them as a 2020 Excellence in Assessment Designee.
FAMU’s approach to sharing NSSE and FSSE results exemplifies the importance of tailoring reports and using data to inform productive campus conversations. Specifically, their Office of University Assessment created two reports based on the NSSE 2017 and FSSE 2018 administrations: “Results of the 2017 National Survey of Student Engagement” and “Comparative Analysis of the 2017 NSSE and 2018 FSSE.” The reports include an introduction to NSSE and FSSE, a comparative summary of student engagement results, population demographics of the faculty, select item comparisons, and a thematic summary of open-ended items. The executive summary concisely identifies major highlights from the report and indicates how the responses of FAMU’s first-year and senior students compare to those of other institutions. Highlights include that FAMU first-year students and seniors perceived higher levels of engagement with faculty compared to students at institutions in the Peer and Aspirational category, the Carnegie Classification category, and the NSSE 2016 & 2017 aggregate. One combined NSSE & FSSE finding that resonated with several campus audiences: faculty perceived students to spend considerably less time preparing for class and more time in leisure activities, while students conversely reported spending considerable time preparing for class and less time in leisure activities. These selected findings quickly helped campus partners see how they compare to peer institutions, and the combined NSSE & FSSE results showed areas of congruence and misalignment between faculty and student perceptions.
The reports also explain the Engagement Indicators and High-Impact Practices and highlight the select item comparisons for the five questions on which first-year and senior students scored highest and the five on which they scored lowest relative to students in the comparison group. The comparison results contain a great deal of information; rather than analyzing and reporting all items, the combined NSSE & FSSE report intentionally highlighted key indicators of areas of growth or improvement as reported by both faculty and students.
These reports provide a strong summary of FAMU’s NSSE and FSSE results and a model for other campuses as they work to share their results and engage campus partners in using evidence to improve.
Playing in the Sandbox for Student Success
Ensuring student success is a top priority at the University of Rhode Island (URI), where NSSE has been used since 2003 to measure students’ perceptions of the institution. More recently, URI’s Vice President of Enrollment Management formed the Student Success Team, bringing together about 30 faculty, staff, and administrators from across multiple departments and disciplines to focus on removing structural impediments to student learning and improving graduation rates. Two initiatives the team has already implemented are curriculum maps for every major and the Take 5, Finish in 4 program, which encourages students to complete five three-credit courses every semester in order to graduate in four years. Measuring the effectiveness of these programs requires a comprehensive assessment plan that includes NSSE data.
Recognizing the contribution of people’s time and resources while meaningfully sharing and utilizing results is also a top priority at URI. The Student Success Team has been intentionally working to create a deeper understanding of data and action items among individual faculty and across the institution to improve student learning. A new team initiative, called the Sandbox, was implemented to bring together educators from across campus to dig into URI’s NSSE results.
The Sandbox, emulating the idea of playing with the data, brought together approximately 40 faculty and staff to learn more about NSSE and student assessment. The Sandbox took place in an “active classroom” within the institution’s library—a space equipped with a dozen electronic whiteboards connected to laptops in the room, allowing small groups of participants to play with the data, take notes, and present their own findings.
The Sandbox session began with an introduction to NSSE, including a description and explanation of the items, survey themes, Engagement Indicators, and additional survey questions. After familiarizing participants with the survey, members of the Student Success Team shared examples of results the team had already analyzed. To keep participants engaged, facilitators shared graphs of their analysis and asked participants to interpret the data, to share their conclusions, and to ask any further questions they may have had about the data.
One facilitator example focused on a survey item about how students use their time. Findings suggested class level was not a significant factor, with both first-year and senior students averaging 15 hours per week on coursework outside of class. To showcase opportunities for digging deeper into results, facilitators disaggregated the data further by major, finding nursing majors above average in hours spent on coursework. This walk-through helped faculty follow the data to deeper and deeper levels and gain confidence for their own analyses.
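The drill-down the facilitators demonstrated is, mechanically, an overall mean followed by grouped means. The short pandas sketch below illustrates that pattern; the data and column names are illustrative assumptions, not URI's actual NSSE file or analysis code.

```python
# Minimal sketch: average weekly coursework hours, overall and
# disaggregated by class level and major. Data are illustrative.
import pandas as pd

df = pd.DataFrame({
    "class_level":      ["FY", "FY", "SR", "SR", "SR", "FY"],
    "major":            ["Nursing", "History", "Nursing",
                         "Biology", "History", "Biology"],
    "hours_coursework": [19, 13, 21, 15, 14, 12],
})

overall = df["hours_coursework"].mean()
by_level = df.groupby("class_level")["hours_coursework"].mean()
by_major = (df.groupby("major")["hours_coursework"].mean()
              .sort_values(ascending=False))

print(f"overall mean: {overall:.1f} hrs/week")
print(by_level)             # first-year vs. senior averages
print(by_major - overall)   # majors above/below the overall average
```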
Once faculty and staff had a chance to familiarize themselves with the assessment process and the NSSE survey, they were provided an opportunity to complete an analysis of their own. The Student Success Team created a spreadsheet containing all the URI data from the NSSE core survey as well as two modules, First-Year and Senior Transitions and Academic Advising. Participants were able to copy and paste cells into a separate tab that updated result charts automatically. The goal was to have everyone, in small groups, browse the survey to find questions related to their particular area of interest, update the spreadsheet, and write conclusions on the whiteboards. At the end of the 90 minutes, participants felt more comfortable with their understanding of assessment data and more confident in their abilities to incorporate assessment into their own practice.
The Sandbox was a huge success. To maintain the assessment momentum, the Student Success team has offered additional incentives. One invited attendees to come up with a research question answerable by NSSE data and to present it to the assessment office to receive funding for their research. Individuals or groups who receive this funding are then asked to present their findings to other campus constituents.
The University of Rhode Island is a data-informed institution, where data are not seen as stagnant but as opportunities to better understand the institution and its students. By creating an active culture of assessment, URI is turning rich data about their students into actionable knowledge to improve the campus community.
*Acknowledgement: After contributing to this account of NSSE data use, Dr. Gary Boden passed away in July 2020. At the time, he was Senior Information Technologist in the Office of Institutional Research at the University of Rhode Island.
Using Data to Understand and Serve a Unique Student Population
Every institution is unique, with its own story to tell. While many schools may see NSSE as a comparative tool for measuring themselves against similar institutions, Indiana University Northwest recognizes the need to contextualize its data in ways that help tell its story and improve its campus. Located in Gary, Indiana, IU Northwest is a regional commuter campus within the IU system, serving approximately 3,800 students. Because it is a commuter campus, how IU Northwest promotes the survey, and what it looks for in the data, is distinct.
John Novak, assistant vice chancellor of institutional effectiveness and research, has found success partnering with others at IU Northwest, who then provide information and promote the survey to their deans, departments, faculty, and staff. Having moved away from paper flyers and closed-circuit television ads, their most effective forms of promotion have been online: the school’s learning management system (Canvas), the institutional website, and direct emails. In addition to receiving promotional messages about NSSE online, students who complete the survey receive $3 added to their Crimson Card (a student ID and debit card for spending around campus). Recognizing that students respond well to receiving feedback early and often, incentives are provided at the end of each week instead of when the survey closes.
IU Northwest faces challenges like other commuter campuses. The students tend to come to campus for classes only, and their NSSE responses reflect this. Students report lower co-curricular engagement and engagement outside of the classroom compared to peer institutions. But as John Novak has shared, rather than a deficiency, this indicates student engagement just looks different at IU Northwest. Instead of trying to change the behaviors of students whose lives are busy with competing priorities such as work and family, the need is about meeting students where they are—in the classroom.
Much of the focus at IU Northwest is on student-faculty interactions and course-based pedagogical initiatives to improve student engagement and performance. While co-curricular engagement may be low, IU Northwest students report positive interactions with faculty and fellow students in the classroom. Results like these assist in both acknowledging the work faculty put into creating an engaging classroom as well as finding areas for future curricular improvement. One area the institution is exploring is campus climate and cross-cultural development.
Recently classified as a Hispanic-Serving Institution, IU Northwest has a diverse student population. And while many may think a diverse student population automatically means inclusion, NSSE results show there is room for enhancing diverse interactions. First-year students report fewer interactions with diverse others compared to seniors. While IU Northwest is considered an urban campus, it also attracts students from rural and suburban locales. Many students come from segregated communities, and IU Northwest may be one of the first places they interact with students from another culture. It takes time for students to grow in this area and step outside their comfort zone.
Even with this contextualized understanding of IU Northwest student engagement, there is still room to explore and understand what IU Northwest students need. Moving forward, John has plans to use NSSE data to examine the campus climate. This includes combining NSSE data with other survey and assessment data already available on campus as well as with data from focus groups—to shed light not only on students’ needs but also on the needs of the surrounding community of Gary.
The Beloit College Office of Institutional Research, Assessment, and Planning (IRAP) reviews their NSSE student comments data for specific mentions of offices, services, and people. Compliments about the quality of services, the helpfulness of staff, or the encouragement of faculty are passed along to appropriate individuals. By sharing positive and sometimes constructive feedback from the Student Comments report, IRAP generates goodwill about NSSE and Beloit’s participation while also promoting the value of student voices in assessment.
NSSE Student Comments Report: At the end of the core survey, students are invited to express their opinions about their college learning experience in a space for up to 5,000 characters. Institutions can choose one of four open-ended prompts:
1. If you have any additional comments or feedback that you’d like to share on the quality of your educational experience, please enter them below.
2. What has been most satisfying about your experience so far at this institution, and what has been most disappointing?
3. Please describe the most significant learning experience you have had so far at this institution.
4. What one change would most improve the educational experience at this institution, and what one thing should not be changed?
Using NSSE Data in Strategic Decision Making for Advising
Dr. Elsa Núñez arrived as new president at Eastern Connecticut State University in 2006 with a reputation as a proponent of strategic planning and data-driven decision making. Within a year, more than 300 faculty and staff were hard at work crafting the first five-year strategic plan of her presidency. (As of this printing, the university is in the third planning cycle of the Núñez administration.)
One of the most important elements of Eastern’s 2008–2013 Strategic Plan was a multi-tiered advisement program, driven by results from the National Survey of Student Engagement (NSSE), that overcame the politics of change. This initiative created an advising program that Eastern depends on to serve students and to help them persist on their path to academic success.
When the strategic planning committee charged with supporting student success looked at Eastern’s NSSE 2010 data, they paid particular attention to students’ written qualitative responses to the survey’s open-ended questions. Student comments, such as the two below, clearly indicated that Eastern’s advising system was broken and that depending on faculty to advise students wasn’t working.
“Trying to figure out my major was hard, because I was not advised well at all.”
“I’ve seen my advisor only once all year; she causes me more stress than my schoolwork!”
The hard data from NSSE were also compelling. When asked if they talked with a faculty member about their career plans, only 46 percent of freshmen and 49 percent of seniors said yes. Asked to evaluate Eastern’s academic advising program on a scale of 1–5, students rated it only 2.9.
While the problem was clear enough, how to build a better advising system—one that the administration could sell to the faculty—was a challenge. Dr. Núñez felt she needed a faculty champion who could help to gain faculty buy-in and to ensure implementation at the academic department level. She found such a champion in an environmental earth science professor—a scientist and an award-winning teacher respected by his peers and loved by his students.
President Núñez asked this professor to work with her to convince the faculty that the new advising model—far from taking advising away from them—supported their natural role as mentors for their students. The plan was for a professional advising office to take over some aspects of advising outside of the faculty’s subject matter expertise so that professors could focus on providing students with program- and course-specific counseling and support. Faculty would also continue to have the critical role of advising students on career opportunities in collaboration with the Center for Internships and Career Development.
Dr. Núñez and her faculty champion went to each academic department to share NSSE data. They reminded the faculty that surveys such as NSSE are typically completed by self-motivated, higher-achieving students. If these students were having problems with the advising program, odds were good that the program needed to be stronger.
These discussions with faculty were not quiet conversations, Dr. Núñez recalls. The faculty challenged the model being presented and questioned the findings, but NSSE results were hard to ignore as they came directly from students. The fact that the faculty champion was a highly respected research scientist with student-centered classes was a major reason why the faculty were finally convinced to endorse the new advising model.
A student academic advising committee—also led by faculty members—was created to finalize the plan for a multi-tiered advising model. The new structure included a newly staffed office of professional advisors; clear roles for that office and for faculty; and programs to provide advising at four critical stages in a student’s life: (1) pre-enrollment, (2) first-year experience, (3) choosing a major, and (4) career planning. Eastern even brought advising into the residence halls so that students are “at home” when talking about their academic and career futures.
Using funds from a Title III grant, as well as other university resources, Eastern invested $4 million in the advising program. The year after the program was implemented, student satisfaction rose from 69 percent to 78 percent. NSSE data showed that from 2008 to 2012 student ratings increased 31 percentage points for faculty accessibility, 11 points for Eastern as a supportive campus, and 12 points for prompt feedback from faculty.
More recent data from NSSE 2017 compared Eastern to its peers in the Council of Public Liberal Arts Colleges (COPLAC) and found Eastern students outperforming their COPLAC peers when it comes to discussing careers and topics beyond the classroom with faculty.
Retention at Eastern has risen as well. Freshman-to-sophomore retention for the 2018 cohort was at an all-time high of 79.3 percent, up more than two percentage points from 2017 and almost six points from a decade ago. Because this measure affects graduation rates, the university continues to work on it, even though Eastern’s four-year graduation rate is already the highest in the Connecticut State University System. Most important, however, is the success of individual Eastern students.
Eastern’s improvement of student advising is a good example of how the university uses data in making strategic decisions, and this success has built confidence in using data in other critical decisions that require innovation and change. Only by listening to student voices can colleges and universities ensure that the changes we make improve educational outcomes. Sharing credible data—the hard quantitative data as well as the anecdotal, qualitative data found in NSSE results—is a powerful way to mobilize faculty in leading change efforts and in making decisions to enhance student learning.
Enhancing High-Impact Practices
In their 2015–2020 Quality Enhancement Plan (QEP) submitted to the Southern Association of Colleges and Schools Commission on Colleges titled “Experiential Learning@MGA,” Middle Georgia State University (MGA) planned to offer students an array of experiential learning opportunities including several High-Impact Practices (HIPs), with the goal of reinforcing the “student-centered focus of the university’s strategic plan.”
The experiential learning approach was selected after analysis of NSSE results and internal assessment data indicated MGA students were participating in some HIPs less frequently than their peers at comparison institutions. For example, NSSE data showed MGA seniors participated less often in undergraduate research, collaborative learning, and service-learning.
MGA’s QEP is designed to foster students’ progress through four tiers of experiential learning activities. Students are introduced to the QEP and experiential learning ideas at a “bronze level” event prior to their first experiential learning course or activity. They then have the opportunity to achieve “silver level,” “gold level,” or “platinum level” by completing additional qualified experiential learning courses and activities throughout their time at the university.
MGA developed a rubric with specific evaluative criteria that allows them to qualify courses and activities as experiential learning and to help ensure consistency across these experiences. As MGA carries out their phased implementation of this QEP, NSSE will serve as an important assessment tool.
Putting Student Comments to Use
San Francisco State University (SFSU) analyzed responses to the prompt “What one change would most improve the educational experience at this institution, and what one thing should not be changed?” Comments revealed three salient themes: class availability, graduation, and diversity. SFSU is using these results to reinforce the positives and address the negative issues identified in comments on these themes, with the aim of increasing student engagement. SFSU’s student comments are displayed in attractive, colorful infographics on the Institutional Research website.
Putting Student Comments to Use
At Southern New Hampshire University’s (SNHU) University College campus, NSSE results, and the questions that arise from them, serve as the content of a one-credit School of Education course, “Inquiry Scholars.” Each semester, students enrolled in this course are asked to take up an authentic problem related to improving student learning that can be illuminated with their campus data. After SNHU’s administration of NSSE 2017, eight Inquiry Scholars completed an analysis of the open-ended NSSE item, “What one change would you most like to see implemented that would improve the educational experience at this institution, and what one thing should not be changed?”
The Inquiry Scholars put each comment from the 270 respondents who answered this question on a strip of paper and sorted these into thematic affinity groups. After analyzing the results by gender and year, they shared their findings with more than 150 faculty and staff members. Faculty, in turn, were asked to answer the same prompt during this event, and the Inquiry Scholars analyzed those results as well.
It Takes a Committee: Improving Mizzou’s NSSE Response Rate
The University of Missouri resolved to use its NSSE results for strategic planning starting with the 2018–19 school year. To have more reliable data for planning and assessment, the university set the goal for its response rate at 30%—nearly double its previous response rate of 17%.
Mizzou’s Vice Provost for Institutional Research and Vice Provost for Undergraduate Studies, in partnership with the Interim Vice Chancellor for Student Affairs, formed a committee focused on boosting NSSE response rates and increasing uses of NSSE data. The committee membership represented the Division of Student Affairs; the Office of Diversity, Equity, and Inclusion; the Office of Institutional Research; the undergraduate deans; the Honors College; the Center for Academic Success and Excellence; and an assistant professor in the Department of Educational Leadership and Policy Analysis who uses NSSE data in his research and who explained in depth how NSSE data could be used by different campus areas.
The areas of the university’s strategic plan most relevant to the committee were student success and the student experience. While the committee did not approach the NSSE data with a specific question, they resolved during the year to develop a full plan for the use of NSSE data.
The NSSE Committee carried its message across the Mizzou community in an NSSE Campus Tour, meeting with advisors, undergraduate deans, social justice centers, and other campus groups that regularly interact with students. The committee made presentations about the value of NSSE data, explaining how and why NSSE is important and how each group can use NSSE results. They also discussed methods to increase survey response rates.
These discussions revealed the close connection between the various groups across the university and the needs and activities of institutional research and assessment. This served to increase investment among the various representatives of these groups to more vigorously promote the survey to their students.
The Undergraduate Dean advocated using Canvas, the university's learning management system, to deliver the survey this year. Digital display screens across campus advertised the survey, and social media posts gave students easy access to Canvas. To boost the response rate still higher, students were offered attractive incentives, an approach the administration supported because of the importance of NSSE in the strategic plan.
The incentives were a chance to win an Apple Watch Series 3, a $1,000 gift card for an Apple product, an MU parking pass, or a $100 Mizzou Store gift card. These incentives certainly helped, but the buy-in from academic units helped even more: fully 60% of first-year students' survey responses came via Canvas, a mode of responding that students likely considered trustworthy.
Mizzou’s campaign resulted in a final response rate of 44%—surpassing the university’s ambitious goal!
Looking ahead, the committee has noted that NSSE will be useful in continued implementation of the university’s strategic plan. The committee intends to meet again in the fall, when they will share this year’s NSSE results and develop methods to use them across campus. The work of the NSSE Committee at the University of Missouri demonstrates that campus partnerships are essential to the success of efforts to promote survey participation and to use survey results to improve undergraduate education.
Lessons From the Field—Volume 4: Digging Deeper to Focus and Extend Data Use
Multi-Year Findings Spark Efforts to Improve Feedback to Students (Lessons from the Field–Volume 4)
According to Andrews University's results from NSSE 2013, their students received feedback from faculty less frequently than students at comparison institutions, specifically in the extent to which their instructors provided (a) feedback on a draft or work in progress and (b) prompt and detailed feedback on tests or completed assignments. Examining responses to these two survey items, the Office of Institutional Effectiveness noticed that the university's average score was lower than those of the comparison group, the peer institutions, and NSSE overall that year. When these findings were presented to faculty, however, they were met with skepticism and comments such as "I give grades back in a week," motivating the presenters to investigate this aspect of education further and to expand campus understanding of what constitutes effective feedback.
To mitigate possible faculty apprehension about NSSE data, the office conducted a separate follow-up student survey focusing on feedback from faculty. Students were asked about the value of different types of feedback, such as opportunities outside of class to ask the instructor questions, rating scales with detailed descriptions of performance, rubrics for grading, and written comments. Students were also asked about the timeframe within which feedback should be given for different types of assignments (e.g., drafts of papers or projects; quizzes and short assignments; long assignments, papers, or projects; and major exams).
The results from this survey indicated that over 80% of students found most forms of feedback either “valuable” or “very valuable” and that they expected feedback in the next class period for quizzes and short assignments and within a week for larger assignments. These findings showed that Andrews University students found multiple types of feedback (beyond grades alone) valuable to their education and that the students had reasonable expectations regarding the timeframe for feedback. Presented at the general faculty meeting in April 2014, the findings informed faculty of the multiple ways they could provide feedback to students and deepened their understanding of students’ needs and expectations regarding feedback.
To evaluate the effects of this intervention, the same office compared the university’s NSSE 2013 and NSSE 2015 scores. Using their Multi-Year Report from NSSE, researchers were able to track the change in the Student-Faculty Interaction Engagement Indicator—a factor comprising four NSSE items, two of which (mentioned above) began this campus conversation. By using that report, the Assistant Provost of the Office of Institutional Effectiveness was able to see improvements in student engagement related to interaction with faculty by both first-year and senior students. In this effort, Andrews University used NSSE data to identify an area of concern; to explore it further on their campus; to provide faculty with actionable evidence on how to improve their teaching; and, by comparing old and new results in their Multi-Year Report, to measure the intervention’s effects.
Using Results to Incorporate Diversity on a Faith-Based Campus (Lessons from the Field–Volume 4)
Results from Biola University’s first administration of NSSE, in 2013, indicated lower scores than those of their peer groups on the Discussions with Diverse Others Engagement Indicator. For their administration of NSSE 2015, Biola intentionally customized their comparison groups and had similar findings—providing the basis for investigating further their students’ engagement with individuals different from themselves. In an effort to fully understand these data, Biola conducted additional analyses including individual item analysis, disaggregating by race/ethnicity, and reviewing open-ended responses for diversity-related themes. Among the findings that stood out, compared to their peers at other faith-based institutions, Biola students scored lower on items querying the frequency of discussions with “people with religious beliefs other than your own” and “people with political views other than your own.”
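NSSE delivers these data as a respondent-level file, so item-level disaggregation of this kind is a short script. Below is a minimal sketch in Python, assuming hypothetical file and column names rather than NSSE's actual codebook variables:

```python
import pandas as pd

# Hypothetical file and column names for illustration; an institution's
# actual NSSE data file follows NSSE's own codebook naming.
df = pd.read_csv("nsse_respondent_data.csv")

# Two Discussions with Diverse Others items: discussions with people holding
# other religious beliefs and other political views (coded 1 = Never ... 4 = Very often)
items = ["discuss_religion", "discuss_politics"]

# Institution-wide item means
print(df[items].mean())

# Disaggregate the same items by race/ethnicity to see which groups
# drive the overall score, keeping counts to flag small cells
print(df.groupby("race_ethnicity")[items].agg(["mean", "count"]))
```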
The NSSE findings were especially noteworthy given the responses of Biola students on the Taylor University Christian Life survey indicating that over 90% of them felt the institution had helped them connect their faith with culture and society. These potentially conflicting findings called for deeper probing, inspired new conversations on campus, and raised the question: What is Biola doing to prepare students to truly engage with culture and society, particularly with individuals who are different from them?
All of these findings were shared with the University Academic Council, which is chaired by the Provost and consists of academic deans and members of the Provost's cabinet, prompting a powerful campus discussion on how the institution was incorporating diversity into the curriculum. Using data from the various sources helped the council identify where students are exposed to diversity as well as opportunities to introduce diversity within the curriculum; for example, the council considered ways to incorporate diverse voices and texts in required theology courses. To encourage faculty to build a more diverse curriculum, a one-day training opportunity was offered as part of Faculty Investment Day, including breakout sessions and faculty panels with titles such as Teaching the Complex and Controversial: Practical Strategies for Engaging Students in Transformational Learning; The Black Lives Matter Movement, Evangelical Churches, and Biola Classrooms; Engaging Online Students in Cross-Cultural Learning; and Transforming the Classroom into a Real Life Experience: Engaging Students Cross-Culturally in the Community.
While Biola continues thinking about how to address the diversity-related NSSE findings on their campus, a staff member from the office of the Vice Provost of Inclusion and Cross-Cultural Engagement has been added to the undergraduate curriculum committee to help them critically examine how the curriculum addresses diversity. Biola also intends to continue the conversation about creating more opportunities for faculty training in pedagogy and inclusion in the classroom.
Student Learning Analysts Build Campus Interest and Investment in Assessment (Lessons from the Field–Volume 4)
Using assessment data innovatively at Bowling Green State University (BGSU) has become a priority in the last few years. To aid in this effort, in 2016, the Office of Academic Assessment created the Student Learning Analysts (SLA) position, “in which undergraduate students take an active role in gathering information on student learning experiences,” to help ensure student voices are truly represented in the assessment of student learning—including in the interpretation of the data and the recommendations for practice. Students in the SLA position learn to design assessment projects, collect and analyze data, and present findings to various members of the BGSU community. The Vice Provost of Institutional Effectiveness and Associate Director of Academic Assessment, who work with these students, believe the SLAs support the larger data-driven philosophy on campus and increase investment by campus units in institutional assessment work. Incorporating students’ interpretation of data and recommendations for practice can have a significant impact on campus unit decisions.
Students hired for the SLA program were drawn from a variety of majors, class standings, and experience levels, but all demonstrated an interest in assessment and student learning. After receiving training in assessment techniques, the SLAs started their projects. In their first semester, they conducted focus groups related to students' expectations about learning and engagement in the classroom (see Figure 9). In their second semester, as they developed assessment projects directly related to NSSE, they learned about engagement and the types of data NSSE provides, and they used these new skills to craft focus group questions related to three NSSE Engagement Indicators: Learning Strategies, Higher-Order Learning, and Reflective and Integrative Learning. One of the focus group questions related to Learning Strategies was "How do you study and review your notes?" Another question, related to Higher-Order Learning, was "How is critical thinking applicable in other aspects, such as internship, organizations, etc.?" For Reflective and Integrative Learning, the SLAs asked students to "Explain how your classes help you look at issues or topics with a new perspective." Following the focus groups, the SLAs analyzed and coded their data and began to identify findings to share with various groups on campus (e.g., the Teaching and Learning Fair, the General Education Committee, and faculty administrator groups). The SLAs are also committed to finding unique visual ways to share their findings to make them as accessible as possible.
Although still new, the SLA program has already seen some unintended—but positive—outcomes. Students who participated in the focus groups, for example, have expressed interest in the SLAs’ assessment work and are thinking about how they can use assessment to inform their own experiences (e.g., activities with student organizations). To build on this growing interest, the Office of Academic Assessment at BGSU is considering ways to expand the SLA program in the future after ensuring its short-term success.
Sharing and Using NSSE Data to Drive Sustainable Improvement (Lessons from the Field–Volume 4)
In 2014, during a board of trustees meeting, Bucknell University President John C. Bravman outlined five attributes critical to the institution's long-term sustainability: being forward looking, data driven, highly intentional, prudently bold, and student centered. Applying that vision, and aiming to be highly intentional about sharing data and making them as accessible as possible to administration, faculty, staff, students, and external constituents, Bucknell has developed a number of dashboards focused on specific topics.
One of these dashboards, dedicated solely to NSSE data, provides means for each NSSE Engagement Indicator (EI) and frequencies for its component items and, further, allows users to disaggregate data by race, gender, residential college, Greek life affiliation, Pell recipient status, and first-generation status. On a number of other dashboards, NSSE data supplement the institution's internal data as well as data from other instruments. Bucknell's Diversity Dashboard, for example, includes items from NSSE's Discussions with Diverse Others Engagement Indicator, among others, and allows for comparisons by various student demographic characteristics.
The university’s Student Learning Outcomes web page pairs NSSE results with data from the Hart Research Associates survey of employer priorities for college learning and success (see Figure 1; www.aacu.org/leap/public-opinion-research/2015-survey-results) and with the Higher Education Data Sharing Consortium’s HEDS Alumni Survey (see Figure 2; www.hedsconsortium.org/alumni-survey). Results are also displayed of students’ participation in High-Impact Practices (see Figure 3).
Two additional dashboards in development at Bucknell will combine data from multiple surveys. One, the Campus Climate Dashboard, will be an invaluable resource for numerous campus offices by providing a summary of findings related to campus climate issues from NSSE, the College Senior Survey (CSS), the Consortium on High Achievement and Success (CHAS) survey, and the National College Health Assessment (NCHA) survey. The other, the General Education Dashboard, will provide a mix of direct and indirect measures (from NSSE and alumni surveys) that will support the assessment efforts of faculty and administrators.
The purpose of making data more accessible is to encourage departments and units across campus to use this information more effectively to improve practice. Demonstrating this, Bucknell has used NSSE data to review the impact on student success of participating in the Residential Colleges—living-learning communities that have been a part of campus life at Bucknell for 30 years.
For this analysis, Bucknell researchers linked NSSE data with institutional retention and first-year GPA data, which served as a proxy for first-year student success. Also, Beginning College Survey of Student Engagement (BCSSE) data were used to compare the pre-college and first-year experiences of Residential College participants and nonparticipants, controlling for student background characteristics.
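Bucknell's exact model is not spelled out here, but a linked analysis along these lines can be sketched as follows; the file names, variable names, and covariates are illustrative assumptions, not the institution's actual code:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical files keyed by a shared student ID (names are assumptions)
nsse = pd.read_csv("nsse_first_year.csv")            # engagement responses
records = pd.read_csv("institutional_records.csv")   # retention flag, first-year GPA
bcsse = pd.read_csv("bcsse.csv")                     # pre-college engagement and expectations

df = nsse.merge(records, on="student_id").merge(bcsse, on="student_id")

# Logistic regression: retention as a function of Residential College
# participation, controlling for background characteristics and
# pre-college (BCSSE) engagement
model = smf.logit(
    "retained ~ res_college + first_gen + pell + C(race_ethnicity)"
    " + hs_gpa + bcsse_engagement",
    data=df,
).fit()
print(model.summary())
```

The same merged data frame could swap first-year GPA in as the outcome (with an ordinary least squares model) to examine the other success proxy mentioned above.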
The researchers found Residential College participation significantly linked to positive results for the Reflective and Integrative Learning and the Discussions with Diverse Others Engagement Indicators, participation in High-Impact Practices, and retention. Based on these findings, Bucknell has prioritized Residential College participation, achieving a nearly 40% increase in that participation over the last three years.
Bucknell has also used NSSE findings to enhance diversity initiatives on campus. Specifically, researchers looked at responses by racial and ethnic group to individual items in the Discussions with Diverse Others Engagement Indicator and in High-Impact Practices participation. Informed by these findings, changes were then made to the training for both Orientation Assistants (OAs) and Resident Advisors (RAs) to include new topics and offices focused on diversity and inclusion. The revised OA training includes a session on critically examining first-year students’ experiences through a diversity lens, specifically paying attention to the needs of students with disabilities and students with different religious and political views as well as those who have experienced exclusion or discrimination on campus. The revised RA training emphasizes diversity and cultural fluency as core themes and has sessions dealing with power and privilege, campus climate, identity development, and global and world events. Additionally, the revised RA selection and interview process incorporates considerations related to cultural competency and diversity.
Engaging Student Affairs in Student Engagement Improvements (Lessons from the Field–Volume 4)
The California State University (CSU) system has a clear goal: to increase graduation rates for all students across all 23 campuses to meet California workforce demands. Graduation Initiative 2025 outlines CSU’s key objectives for first-time first-year and transfer students. At California State University, San Bernardino (CSUSB), working toward these goals has meant ensuring all institutional divisions are involved in the process, including each student affairs unit. Cautious of too hastily developing and implementing new programs and initiatives, CSUSB’s approach has been to thoughtfully consider ways to increase intentionality and efficiency in work already being done on campus, identifying areas already improving as well as areas needing further improvement. Figures 4 and 5 show how these data have been made accessible to campus constituents.
The Department of Housing and Residential Education (DHRE) has used NSSE to assess the impact that living on campus has on student engagement, comparing effects not only of on- and off-campus living but also of specific housing programs (e.g., faculty-in-residence, sustainability programs). DHRE’s assessments of various initiatives have looked at the relationship of students’ living environment with NSSE Engagement Indicators and High-Impact Practices participation to determine which DHRE practices have the most impact on student learning and success. These findings are especially important as the institution moves toward increasing the number of students living on campus. Simply getting more students to live on campus is not enough; their experience must intentionally offer the resources and support to assist them toward graduation.
Other CSUSB offices find NSSE gives insight into populations of students who face unique challenges that other surveys may not capture. Services to Students with Disabilities (SSD), for example, searches NSSE data for trends among students with disabilities. These data, combined with resources from the Council for Learning Disabilities, inform the development and implementation of SSD's strategies. The Veterans Success Center (VSC), using NSSE data to inform programmatic decisions about how best to support student veterans, created a Veterans Learning Community where military-affiliated students receive support in transitioning to the university (e.g., selecting courses, choosing a major, understanding campus requirements), participate in a seminar series to enhance academic skills (e.g., study practices) and personal skills (e.g., social networking), engage in community service and family-based activities, explore career options, and prepare for life after CSUSB. To develop coping skills for life challenges on the path to graduation, student veterans also receive on-site academic support, personal development and academic skills workshops, a mental health support group, community enrichment projects, and family engagement activities. NSSE data were also instrumental in the development of a dedicated tutoring program and study space for military-affiliated students. VSC has partnered with the Communications Studies Department in a collaborative effort featuring military leaders; VA representatives; and CSUSB staff, faculty, and student veterans to expand faculty training for successfully instructing and interacting with military-affiliated students.
To support students of color from communities that have historically graduated at lower rates, CSUSB has recently opened three student success centers: the Pan-African Student Success Center, the LatinX Student Success Center, and the First People's Student Success Center. NSSE data, along with Lumina, AAC&U, and institutional and systemwide data, informed the need for these centers, as illustrated, for example, by the CSU system's very low graduation rates for First Nations students. Increasing resources and support for all students will be central to CSUSB's work toward meeting the Graduation Initiative 2025 goals.
Lastly, to improve the transition experience for both students coming from high schools and those transferring from community colleges, CSUSB’s orientation programs have become transition programs—and NSSE data have been embedded in this change. The new student convocation, for example, incorporates NSSE data on students’ engagement on campus and supplements this with student leaders sharing stories about their own engagement and encouraging new students to take advantage of campus support and resources. Given the many ways CSUSB is working to improve the student experience and increase engagement, its next NSSE administration will be important for assessing the impact of these strategies.
NSSE Data-Informed HIP Planning and Accreditation Reporting (Lessons from the Field–Volume 4)
California State University San Marcos (CSUSM) is a Hispanic-Serving Institution focused on the students it serves, an undergraduate population among whom 42% identify as Latino/a, 54% are first-generation, and 47% come from low-income backgrounds. Institutional researchers at CSUSM use NSSE data to learn more about their student population and to provide evidence that can be used to serve them best, for example, confirming that CSUSM students spend more hours working for pay than do students at similar institutions. Findings like this inform the collective understanding of the student population and guide campus conversations on how to best support these students.
NSSE data are also used to underpin efforts like the benchmarking of Co-Curricular Competencies conducted annually by the Division of Student Affairs. The division reorganizes data from NSSE and from the Cooperative Institutional Research Program (CIRP) College Senior Survey under domains such as Civic Engagement and Social Responsibility or Critical Thinking and Ethical Reasoning, using the data to inform conversations regarding student learning in these areas.
High-Impact Practices (HIPs) such as the first-year seminar, internships, and undergraduate research are a leading priority at CSUSM. Emblematic of this institutional emphasis, a HIP task force composed of faculty, staff, and administrators with interest or involvement in campus HIPs used NSSE data to disaggregate student participation in these practices by student major and demographic characteristics. These data can help educators identify student groups that are less likely to participate in HIPs and direct them to HIP opportunities, interventions that are especially impactful for students, such as first-generation or low-income students, who might not otherwise seek out these opportunities.
CSUSM stakeholders have used NSSE data to measure the overall effect of efforts to improve HIP participation, and the data suggest interventions like these are working. Encouragingly, results from the institution’s NSSE 2016 administration indicated that HIP participation has increased. Also, using common data reference points has facilitated cross-division collaboration at the university, as all entities work from the same data points and share a common framework for conversations to identify needs and plan interventions.
NSSE data have also played an important role in the CSUSM WASC Senior College and University Commission (WASC) accreditation process, providing evidence in their institutional report of achievements in university-wide Undergraduate Learning Outcomes (ULOs), mapped to WASC standards. For example, CSUSM found that their students were more likely than those at other California State University institutions to engage in behaviors associated with higher-order learning and used this information to articulate the ways in which their students had developed skills as Comprehensive and Critical Thinkers (one of the four ULOs). Similarly, for the Skilled Communicators ULO, the institutional report noted high scores for the frequency at which students give presentations in class and for crediting their university experience for the development of oral communication skills. Lastly, the CSUSM institutional report used NSSE data to illustrate high levels of satisfaction among students.
In concert with other data sources, such as CIRP's freshman and senior surveys and the American College Health Association–National College Health Assessment survey, CSUSM expertly aligned their own institution's ULOs with WASC standards and used evidence from NSSE to highlight achievements in student learning on their campus. CSUSM is making progress toward establishing a culture of data to inform action and to demonstrate student learning outcomes.
Mapping NSSE Items and Developing Faculty (Lessons from the Field–Volume 4)
The Office of Institutional Research, Effectiveness, and Planning at Carlow University maximizes information derived from NSSE results by using data from both the core survey and the Topical Modules. In 2014, Carlow administered NSSE and participated in two modules: Learning with Technology and Experiences with Information Literacy. Analysis of these data contributed to the development of explicit guidelines for a new core curriculum and the improvement of instruction by faculty.
Carlow mapped its NSSE results to specific action steps linked to the new core curriculum guidelines. For example, in one document the survey item "worked with a faculty member on activities other than coursework" was connected to actions such as educating students about co-curricular opportunities in the "Connecting to Carlow" course and the development of a co-curricular transcript. The Office of Institutional Research, Effectiveness, and Planning developed a graphically enhanced chart that (a) identified the NSSE survey item, (b) compared Carlow's performance with the national average (e.g., a "thumbs up/neutral sign/thumbs down" picture), and (c) listed all of the new core curriculum components intended to ameliorate the concerning findings.
By reimaging NSSE results in a single chart—or “crosswalk”—the office developed an easy-to-understand information display tool that clearly delineated connections between data and action. For example, low NSSE scores from seniors for faculty feedback on a draft or work in progress were addressed by creating various skill labs (i.e., academic support experiences to help students develop communication, writing, and quantitative reasoning skills), by implementing a writing-intensive curriculum in the critical exploration courses, and by embedding assessment checkpoints during junior year seminars.
Because NSSE results also indicated participation rates in some High-Impact Practices (HIPs) were lower at Carlow than at other institutions, an action step called for the inclusion of five HIPs in the core curriculum (writing intensive curriculum, capstone courses, service-learning experiences, internships, and research opportunities with faculty). Although several of these HIPs were already in the curriculum, a conscious decision was made to bolster and expand them in the new curriculum. Not only has the crosswalk streamlined conversations regarding interventions to enhance student engagement, the document also serves as an easy-to-reference guide for measuring the effectiveness of these interventions.
Carlow University plans to administer NSSE in 2018 and is excited to see if scores on the targeted items improve after implementing the new core curriculum, the Carlow Compass (see Figure 8). The Carlow Compass Curriculum, which went into effect for all incoming students in Fall 2016, is an innovative general education curriculum rooted in the liberal arts and the Catholic intellectual tradition. Serving as a navigational tool to guide students toward academic and professional goals, it is integrated with a student's major course of study and aligns with the university's mission, vision, and Mercy heritage.
Carlow University also used NSSE results to support and guide strategic priorities in other areas of the academic enterprise. While excelling in many aspects of student engagement and practice, Carlow obtained results from NSSE's Learning with Technology Topical Module indicating it lagged behind its peers in integrating technology into undergraduate education. In the module, students are asked about the degree to which technology contributed to their learning activities such as understanding course ideas and collaborating with other students. The module also includes questions regarding the types of technologies used in class and the degree to which the student's institution emphasized various types of technologies. The module results indicated that Carlow students were less likely to use certain technologies inside the classroom (e.g., electronic textbooks, e-portfolios, social networking) compared with students at other institutions participating in the module.
As a follow-up action, Carlow organized an internal professional development institute for all faculty and staff in spring 2016. The institute’s theme—Back to the Future: Carlow’s Journey of Innovative Technology—focused on sharing faculty successes at implementing technology as a way to inform and motivate late adopters. The institute included a plenary event, three hours of multiple training sessions (topics included Interactive Software Adobe Connect and Camtasia, and Engagement and Motivation through Digital Tools), and an open-mouse session during the reception where faculty showcased their technology skills.
In both of these uses of NSSE data (the crosswalk of NSSE items with action steps for the new core curriculum and the examination of student use of technology in the classroom), Carlow University has done outstanding work using specific data points in the survey results to guide interventions. Furthermore, the Director of Assessment for Institutional Research, Effectiveness, and Planning has diligently mapped these items to specific interventions (either in changes to the core or in the professional development opportunity). These steps have been essential to understanding how to improve student engagement and to linking this understanding to actual intervention. Going forward, to evaluate its interventions, Carlow can use data from future administrations either to confirm the effectiveness of these actions or to develop new strategies to improve these measured outcomes.
Longstanding Commitment to Use NSSE Data in Many Ways (Lessons from the Field–Volume 4)
Indiana University–Purdue University Indianapolis (IUPUI) has been administering NSSE since 2002. In its NSSE 2006 results, IUPUI's first-year students were less likely than students at peer institutions to report serious conversations with students different from themselves or to report including diverse perspectives in class discussions or writing assignments. These results informed curricular discussions on campus and led to the development of more Themed Learning Communities to create opportunities for students to discuss issues of diversity, inclusion, and equity.
At IUPUI, a Themed Learning Community (TLC) consists of a first-year seminar and two or more disciplinary courses in which a group of 25 freshmen co-enroll. Throughout a semester, the TLC group explores a theme, makes integrative connections between courses, and engages in out-of-class experiences guided by the TLC's faculty team. In 2016, IUPUI had 11 different TLCs focusing on diversity. Also in 2016, TLCs formed a partnership with IUPUI's Diversity, Enrichment, and Achievement Program, from which four new TLCs were created in 2017 to support the success of students from populations traditionally underrepresented in higher education. To monitor the effectiveness of TLCs in helping students achieve institutional learning goals, IUPUI researchers have used NSSE data. In one report, NSSE items mapped to the institution's Principles of Undergraduate Learning (PULs) showed that TLC participants had higher scores than nonparticipants on these learning outcomes (see Figure 6).
In addition to advancing diversity, inclusion, and equity at the institution, IUPUI stakeholders are interested in increasing participation in and measuring the quality of High-Impact Practices (HIPs). One key campus initiative targeting this goal is RISE—Research, International experiences, Service-learning, and Experiential learning—which provides maps for students to enroll in RISE courses and resources for faculty (e.g., taxonomies and funding for course development). To measure the quality of RISE, the Office of Institutional Research and Decision Support uses retention data, follow-up surveys, qualitative interviews, and NSSE data. Triangulated with data from the other sources, NSSE data are used to illuminate the relationship between HIP participation and desired student outcomes. NSSE results have indicated that, among first-year and senior students, RISE participation is related to increases in engagement behaviors associated with Higher-Order Learning and Discussions with Diverse Others.
NSSE data continue to be vital in shaping conversations at IUPUI regarding student engagement and learning. Dynamic reporting from the Office of Institutional Research and Decision Support via Tableau data visualization software allows users to examine student HIP participation by (a) the total number of HIPs completed or (b) participation in a specific HIP (e.g., service-learning, undergraduate research, internships). Users can disaggregate the data by student characteristic (e.g., gender, race/ethnicity, full-time or part-time status) and by school within the university; they can also compare participation rates between IUPUI and peer institutions and other public research universities (see Figure 7). This tool presents a data-rich way to inform educators skeptical of their department's contribution to low institutional participation numbers, those interested in how they "stack up" with peers, and those who want to ensure equitable HIP participation across different student groups.
Conversations about future initiatives at IUPUI have also drawn on NSSE data. For example, NSSE 2015 results informed discussions at the winter retreat of the nationally recognized IUPUI Center for Service and Learning (CSL). Specifically, discussing IUPUI's low scores (relative to those of peer institutions) on the Diverse Interactions Engagement Indicator, CSL staff used Design Thinking strategies to better conceptualize how diversity affects their work and how reflection strategies might be used to enhance student development around diversity. CSL staff have also used data from the Deep Approaches to Learning Scales in their scholarly work on the relationship between participation in service-learning and deep approaches to learning. Data like these were used in the 2015 application for the Carnegie Foundation's Community Engagement Classification, which identified IUPUI as one of 240 engaged campuses in the US. IUPUI has historically used data to inform the creation of educational interventions, and the institution's ongoing innovation keeps data alive in present-day conversations about the institution's future.
IUPUI continues to put NSSE results to good use. Over several NSSE administrations, both IUPUI senior and first-year respondents were more likely than similar students at peer institutions to indicate that they were working more than 20 hours per week off campus.
As a result, IUPUI plans to remain focused on several initiatives designed to encourage more students to work on campus. In the last 15 years, IUPUI has used NSSE data in comprehensive ways, from measuring achievements related to their PULs to informing needed conversations regarding campus diversity.
Keuka College, an institution with a distinctive emphasis on real-world experience, uses NSSE data to monitor student satisfaction and engagement in key educational experiences. Every year, every undergraduate student at Keuka College completes a Field Period®, a credit-bearing, off-campus learning opportunity that can resemble an internship or may take the form of community service, spiritual exploration, creative endeavor, cultural exploration, or international travel. These experiences are critical: 94% of the most recently graduated seniors described Field Period® as important in assisting with their career development, and 20% of Field Period® experiences resulted in full-time employment of graduates. A cornerstone of the Keuka College curriculum since 1942, Field Period® represents 10% of the degree requirements for every undergraduate major. For the college's first-year students, the First-Year Experience seminar is crucial because it is their first opportunity to learn about the Field Period® process.
While Keuka College has been intentional in supporting its first-year students through traditional methods like orientation and academic advising, institutional stakeholders noticed that NSSE results indicated first-year students reported low-quality interactions with other students, advisors, faculty, and staff. This finding led to numerous conversations on campus about how best to foster interaction between first-year students and other campus community members, and changes to the curriculum and campus culture were implemented. For example, the first-year experience course was revamped to allow more opportunities for students to interact with faculty on engaging topics such as Exploration of Multicultural Education, Adventure and Recreation, and Leadership.
In another implemented change, advising and course registration were incorporated into New Student Orientation to encourage engagement with faculty. Additional initiatives are being considered to actively engage students past the first semester through additional revisions of the first-year experience. Also, in the fall of 2016, every incoming student was assigned both a major advisor and a student success advisor, forming a team committed to collaborative and proactive advisement to support each student’s persistence and success. As a participant in NSSE every other year, Keuka College is excited to see if these implemented changes enhance their Quality of Interaction scores.
After several years of modest NSSE data use, North Central College realized a significant increase in interest in student engagement results by a number of campus constituents. Organic conversations about using NSSE data to inform campus practices followed a few transitions in campus leadership and faculty involvement: a new provost, who has encouraged a greater use of partnerships between academic and student affairs; a new director of the Center for the Advancement of Faculty Excellence, who had used engagement research in her own scholarship; a new director of undergraduate research, who has been eager to learn more about the kinds of students engaging in undergraduate research; and faculty who have been increasingly embracing the idea of greater participation in High-Impact Practices (HIPs). Capitalizing on the potential of these transitions, North Central gave new faculty an orientation session that encouraged them to think about their influence on students’ engagement. Facilitated by North Central’s vice president for student affairs and dean of students and a group of student leaders, this session sought to empower faculty to think about small adjustments they could make in their own classrooms and in their interactions with students outside the classroom to increase student engagement. North Central also hosted a similar conversation with student affairs staff—having them look at findings within subpopulations including HIP participation by race/ethnicity and gender and in the aggregate to get an idea of how students were experiencing the institution.
Meanwhile, a strategic planning process has been under way at North Central College, and the college has reflected intentionally on the measures important to this process. Instead of relying solely on college and university rankings for progress benchmarks, institutional leaders have asked to know more about what students actually do. In response, at a presentation to the college's board of trustees, the vice president for student affairs and dean of students used NSSE data to illustrate student engagement as an indicator of educational quality and to provide the board a view of the college's performance through its NSSE results and reports. A similar presentation using NSSE data was given to student affairs staff, and the attendees also discussed ways to improve student learning and development with NSSE indicators in mind. North Central continues its efforts to ensure that all campus units know how the construct of student engagement and the data from NSSE can help create successful educational environments for students.
Because advisors can direct students to multiple resources and support services to help them along their education pathways, The Ohio State University (Ohio State) believes improvements in academic advising are essential to ensuring that all students flourish and succeed. To establish a baseline from the student perspective for planning these improvements, Ohio State appended the Academic Advising Topical Module to their 2013 NSSE administration. Additionally, advisors’ perceptions about their training and professional development were collected in a survey administered in 2014 by Ohio State. Enhancing Academic Advising, Ohio State’s 2014 Higher Learning Commission quality initiative, defined the framework for the improvement effort: “Academic advising requires a collaborative relationship between advisors and students—an active, sustained, and intentional process, rather than passive, sporadic, and casual contacts.” This initiative implemented programs focused on advancing advising to the next level through the following ongoing activities and offerings:
Training and professional development for advisors
Assessment of academic advising learning outcomes
Increased advisor access to and engagement with information to guide and support students
Enhanced collaboration between advisors and other university offices
To assess the effectiveness of their academic advising quality initiative, Ohio State re-administered the NSSE advising module in 2016. Comparing data from both administrations, the university found a number of areas in which student responses in 2016 were significantly more positive than in 2013 and no areas in which responses were more negative. Both first-year and senior students responded more positively in 2016 when asked to what extent their advisors helped students understand academic rules and policies and informed students of academic support options (tutoring, study groups, help with writing, etc.). For seniors specifically, Ohio State saw increases in the number of students who said their advisors had been available when needed and listened closely to concerns and questions.
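NSSE's institutional reports provide their own significance tests, but as a rough illustration of what such a two-administration comparison involves, here is a sketch with hypothetical file and column names; a Mann-Whitney test is one reasonable choice for ordinal response scales:

```python
import pandas as pd
from scipy import stats

# Hypothetical pooled file of the 2013 and 2016 Academic Advising module responses
df = pd.read_csv("advising_module_2013_2016.csv")

item = "advisor_listened"  # e.g., "listened closely to your concerns and questions"
y13 = df.loc[df["year"] == 2013, item].dropna()
y16 = df.loc[df["year"] == 2016, item].dropna()

# One-sided test: are 2016 responses stochastically higher than 2013 responses?
u, p = stats.mannwhitneyu(y16, y13, alternative="greater")
print(f"2016 mean {y16.mean():.2f} vs. 2013 mean {y13.mean():.2f} (p = {p:.4f})")
```

Repeating this per item, separately for first-year and senior respondents, would reproduce the kind of item-by-item comparison described above.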
These findings indicate that the ongoing work of Ohio State’s quality initiative to enhance academic advising is having a positive impact—which supports the continuation and expansion of this work. Further, Ohio State intends to share these findings to boost advisors’ morale, to raise their campus profile, and to promote partnerships with them across campus.
Like many institutions, Oklahoma State University (OSU) is challenged by decentralization. This has complicated its efforts to disseminate NSSE data and reports and to implement change. In the past, although the university’s assessment office provided an executive summary report of NSSE results to various offices and academic colleges, this report was not consistently helpful because its broad findings were not specific to the units’ various needs and students.
The assessment office has since prioritized providing each unit with data pertinent to that unit’s work and the students it serves. The office has also developed resources to make data more accessible to faculty and staff across campus, including a new internal OSU website, dedicated to data and reports, that provides links to NSSE resources and information on accessing the NSSE Report Builder.
Getting faculty more invested in using NSSE results has also been a priority at OSU. In this effort, the assessment office has made it easier for faculty to access NSSE data for their own research endeavors. For example, two faculty members are comparing the engagement levels of in-state students who received need-based state-sponsored scholarships and those who did not.
Additionally, working with the Institute for Teaching and Learning Excellence (ITLE), the assessment office has helped inform faculty workshops on using NSSE results. In a meeting with the ITLE's support unit of instructional designers and various faculty members, for example, the assessment office provided a two-and-a-half-hour presentation on the implications of NSSE findings for faculty practice at OSU. The presentation included an overview of NSSE, information about the university's recent NSSE response rates and respondent demographics, details about OSU's selected comparison groups, and descriptions of areas of strength and areas for potential improvement. The presentation also included findings from Topical Modules and from BCSSE. The goal of the presentation was to identify what faculty were doing in their classrooms related to student engagement and what they could do to enhance it. One critical area of faculty practice identified in these discussions corresponds to NSSE's Higher-Order Learning Engagement Indicator.
Since that presentation to ITLE, enhancing students’ higher-order learning across campus has become a focus at OSU. For example, among the newly developed ITLE faculty courses, which are hybrid in-person and online workshops, one of the courses focuses on more thoughtfully matching student needs with teaching methods; more deeply engaging students in content through activities that highlight analysis, application, and evaluation skills; and more closely aligning content assessments to teaching practices so that evaluation is more relevant and reliable. As evidence of this ITLE course’s impact, a chemical engineering faculty member who completed the course has converted his lecture-based course into a course incorporating guided problem-solving tasks with embedded informal, formative assessments that allow him to gauge student learning immediately and to make adjustments where necessary.
OSU is committed to finding new uses of NSSE data and to reaching a broader range of faculty with college-specific resources and support.
In 2014, Rose-Hulman Institute of Technology received a grant from the Kern Family Foundation, as part of the Kern Entrepreneurial Engineering Network, to develop entrepreneurial minded learning (EML) opportunities for engineering students that foster an entrepreneurial mindset and enterprising attitudes. Rose-Hulman President James Conwell said that the grant, combined with the goals and mission of the institution, would play an important role in preparing graduates to contribute positively to the American workforce. This grant has supported a number of educational initiatives at Rose-Hulman, such as engaging faculty in multidisciplinary groups to create EML-infused courses in each academic discipline, including the humanities and social sciences. Rose-Hulman has also developed a new living-learning community, the Engineering Student Community Actively Learning Advanced Technical Entrepreneurship (ESCALATE), in which 50 first-year students who live and take courses together are connected to student and alumni mentors.
To assess the impact of their efforts to infuse EML initiatives throughout the institution both in and outside the classroom, in their NSSE 2015 administration, Rose-Hulman appended the First-Year Experiences and Senior Transitions Topical Module. A number of items in this module were identified as having the potential to measure progress toward EML goals—particularly, in the senior students’ section of the module, the items related to entrepreneurial skills, self-employment, and starting your own business. The module findings are serving as benchmarks as Rose-Hulman extends EML initiatives across the institution, with plans to readminister the module in 2018.
Rather than wait for the 2018 data for longitudinal comparisons, however, Rose-Hulman chose to use the existing data to examine what was already happening on their campus. Results from the module's first-year experience section, for example, gave insight into the impact of College and Life Skills, a course designed to help first-year students make a smooth transition from high school and to introduce them to important resources and individuals at Rose-Hulman. The results showed that, compared to first-year students at peer institutions, Rose-Hulman students were much more likely to seek additional information for course assignments when they didn't understand the material and to ask instructors for help when they struggled with course assignments.
Rose-Hulman has also been working to be more intentional in how data are shared across campus. For example, to address some challenges in using the Major Field Report as a small institution with most students in engineering programs, Rose-Hulman used the Report Builder–Institution Version to break down the findings by specific engineering majors. Each academic program received its own individualized report including institution-wide findings, departmental findings, departmental comparisons to other U.S. institutions, and data use resources. Supporting greater use of NSSE results at the program level and outlining a plan to employ student engagement results to monitor the infusion of EML have been effective approaches for making data use more widespread at Rose-Hulman.
Every year since its inauguration in 2007, the Beginning College Survey of Student Engagement (BCSSE) has been administered at Southern Connecticut State University during orientation, and the institution has been pushing the boundaries of how colleges and universities use BCSSE data. As part of the First-Year Experience (FYE) Program, all incoming students are enrolled in a seminar that promotes their academic habits of mind, research skills, and preparedness for more advanced coursework. This seminar extends students’ orientation into the future and guides them in developing action steps in the here-and-now to achieve their desired futures.
Prior to the first day of classes, the FYE seminar instructors receive a BCSSE Student Advising Report for each student, which provides individualized information regarding a student’s commitment to the institution, expected academic difficulty, and self-perception of academic preparation for college. When guiding faculty on how to use this information to gauge a student’s confidence and needs, the Office of Assessment and Planning emphasizes that, rather than spelling out a student’s destiny, BCSSE data provide a roadmap on how best to support the student during this crucial transition. At Southern Connecticut State University, the focus is on that which is amenable to change rather than unchangeable demographic characteristics and prior learning.
The Student Success Task Force, chaired by the Dean of the School of Arts and Sciences and the Vice President for Student Affairs, used BCSSE data along with other sources of data in predictive modeling to identify the most important predictors of student academic learning, persistence, and graduation outcomes. Of the information collected by BCSSE, the item “Do you expect to graduate from this institution?” was a significant predictor; not surprisingly, students who responded “Uncertain” were less likely to be retained compared with students who answered in the affirmative. Other important predictors included students’ expected difficulty with time management; preparedness to speak clearly and effectively; and frequency of talking with a counselor, teacher, or other staff member about university or career plans.
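The task force's modeling approach is not detailed here, but a predictive model built on BCSSE items like those named above might be sketched as follows; the file, variable names, and choice of logistic regression are assumptions for illustration:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical merged file of BCSSE responses and a one-year retention flag;
# variable names are illustrative, not BCSSE's actual codebook names.
df = pd.read_csv("bcsse_with_retention.csv")

predictors = [
    "expect_grad_here",      # "Do you expect to graduate from this institution?"
    "difficulty_time_mgmt",  # expected difficulty managing time
    "prep_speaking",         # preparedness to speak clearly and effectively
    "talked_plans_staff",    # talked with counselor/teacher about university or career plans
]

# One-hot encode the categorical expectation item; the others are ordinal codes
X = pd.get_dummies(df[predictors], columns=["expect_grad_here"], drop_first=True)
y = df["retained"]

model = LogisticRegression(max_iter=1000)
print("Cross-validated AUC:", cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean())

# Fit on the full sample to inspect which predictors carry the most weight
model.fit(X, y)
print(dict(zip(X.columns, model.coef_[0].round(3))))
```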
More than this, the results of the predictive models using BCSSE data indicated that relationships are central to student success. The Student Success Task Force's recommendations led to the creation of the Academic Success Center and the modification of academic programs, policies, and instruction as part of a drive to advance a culture of student-centeredness at the university. Specifically, to help students plan for the cost of education and manage their financial obligations, a new position was created: Coordinator of Student Financial Literacy and Advising.
BCSSE and NSSE data have been used at Southern Connecticut State University in numerous other ways as well. For example, using data from NSSE’s Academic Advising Topical Module (along with other sources of information) to identify issues with the campus’s advising practices, the institution implemented the Education Advisory Board’s Student Success Collaborative advising platform, and university staff continue to use data from the advising module to evaluate this initiative. Additionally, analysis of BCSSE and NSSE data trends conducted by the Office of Assessment and Planning underscored the importance of paying attention to the specific needs of students who are the first in their families to attend college.
One outcome from this analysis was the implementation of a special High-Impact Practice offering, First-Generation College Student Living and Learning Communities, whose students are enrolled together in focused FYE seminars and live together in dorms alongside staff members who themselves had been first-generation college students. This program has had real success: first-generation students who participated in this High-Impact Practice had the highest score on the NSSE item measuring students' overall evaluation of their entire educational experience at the institution, and they were almost 10% more likely than their nonparticipating counterparts to persist at the institution.
Southern Connecticut State University is currently considering the factors that promote and impede on-time graduation. The most important predictors of on-time graduation include the characteristics of the students’ incoming profile, the students’ goal-directed activities, their confidence that they would seek and identify additional resources to better understand course-related materials, and their expected difficulty in getting help if they are struggling with coursework. Results from BCSSE and NSSE can provide data illuminating these predictors.
Overall, BCSSE and NSSE results inform important conversations at Southern Connecticut State University about the most effective ways to promote students’ learning and development. Infographics depicting key survey findings and important predictors of student success are used to spark discussions during meetings. BCSSE and NSSE data highlight areas in which the university has scored higher than its peer institutions—particularly in the Discussions with Diverse Others and Student-Faculty Interaction Engagement Indicators—and the data also identify areas in need of improvement. BCSSE and NSSE results contribute to the university’s data-driven process of educational change and, in response, the university changes the way it works on behalf of students.
At St. Olaf College, NSSE data are woven into the decision making of the Board of Regents, the President's Leadership Team, the Academic Leadership Team, the Curriculum Committee, and the Provost. For example, NSSE items and Engagement Indicators are incorporated into the Board of Regents Community Life Committee metrics for campus diversity, student wellbeing, and student engagement. These data are triangulated with other sources of information such as the St. Olaf Student Information System, the National College Health Assessment, and the St. Olaf Learning Goals Questionnaire. Beyond establishing reliable metrics, mapping different sources of data to desired goals allows the committee to more strongly align these goals with the St. Olaf College president's vision and to identify important areas where data are not currently being collected. Additionally, NSSE data have recently been used by St. Olaf staff (a) to inform a particular line of decision making within the institution and (b) to analyze data collected previously to answer constituents' questions.
NSSE results are also used to communicate institutional achievements to the public. For example, on St. Olaf's institutional learning outcomes website, StOGoals, NSSE data are used to show evidence for Insightful Integration and Application of Learning and Intentional and Holistic Self-Development.
After a St. Olaf College NSSE administration a number of years ago, the college's Institutional Research and Effectiveness office conducted student focus groups to examine the institution's survey responses. Among the concerning issues that emerged from these focus groups was students' uncertainty about formal and informal advising and the different types of encounters with each. A task force was convened to evaluate the academic advising received by St. Olaf's students.
Expanding the institution’s data collection on this issue, in its next NSSE administration, St. Olaf used the Academic Advising Topical Module, enabling the comparison of St. Olaf’s academic advising efforts with those of participating peer institutions. The resulting information gathered through the focus groups and NSSE, as well as other surveys conducted on campus and with alumni, guided the restructuring of St. Olaf’s new academic advising office and also informed the hiring process for a new director of that office. In summary, to address an emergent issue in its academic advising, St. Olaf took a specific course of action—reworking academic advising—and gathered high-quality information to carry out that action successfully.
The Institutional Research and Effectiveness office has also used NSSE data to answer questions posed by the Board of Regents about the quality of the St. Olaf student experience. In one instance, board members were curious about how St. Olaf students would score in areas measured on the Gallup-Purdue Index, a national study linking college student success to high-impact experiences such as internships and extracurricular activities. Although St. Olaf had not participated in this study, the Institutional Research and Effectiveness office was able to answer the board’s question by leveraging data already collected through the NSSE survey and the Higher Education Data Sharing (HEDS) Consortium alumni survey—for example, to provide the percentage of St. Olaf seniors who participated in an internship, co-op, field experience, student teaching, or clinical placement, and, further, to contextualize this percentage by providing comparison group data. It is not uncommon for members of St. Olaf’s Board of Regents to read about trends in higher education and to wonder, “How well are we doing?” With extensive high-quality data on hand—along with the knowledge of how to weave these data into decision making—St. Olaf’s institutional research office is prepared to answer.
The University of Hawai'i at Mānoa exemplifies how investing in student buy-in to raise response rates and creating innovative tools to inform and engage users enable an institution to get the most out of its National Survey of Student Engagement (NSSE) data. The Mānoa Institutional Research Office (MIRO) shifted from its past supporting role in producing NSSE reports to a proactive role in leading campus efforts to improve the NSSE response rate. Its focus now is on using NSSE data to support improvements in key areas. As part of MIRO’s follow-up research and the creation of an action plan, a cross-functional team attended the second annual National Institute for Teaching and Learning, where participating campus teams developed evidence-based action plans aimed at improving instructional practices, student engagement, and student learning and success. To get NSSE data into the hands of those who can use the information to inform decision making, MIRO has reached out to campus units through strategies such as customized reports, online interactive data reporting tools, video tutorials, and face-to-face discussions and training.
For NSSE 2015, MIRO carried out a comprehensive marketing strategy that included several key steps to promote survey participation among first-year and senior students. First, the office coordinated campus-wide advertisements for the survey on dozens of banners and boards as well as hundreds of flyers in first-year and senior residence halls. Second, based on research on the relationship between the use of incentives and increases in response rates, rewards for participants were offered through a drawing for prizes such as an iPad Air 2 and 20 bookstore gift cards. Third, student resources with information about the survey were provided, including a landing page on the MIRO website featuring frequently asked questions such as “What is the National Survey of Student Engagement (NSSE)?” and “Why should I take part?” The office also organized visits to some of the largest first-year classes to present information about the survey and to encourage participation.
On days when the survey was being administered, information tables were staffed around campus—a service coordinated by student members of the American Marketing Association. With the currency of social exchange as the guiding principle, students were offered snacks and pens printed with NSSE information as they learned about the survey. Prior to the survey administration, MIRO presented its marketing plans to the academic deans, who in turn supported the effort by advertising the survey in their buildings, hosting survey administration parties, doing their own tabling for the survey, and encouraging faculty to promote the survey. Finally, during administration, advertisements were updated to include the end date of the survey period, thereby reminding students to complete the survey before the deadline. To gauge the effectiveness of these promotion strategies, MIRO recorded response data daily and used the NSSE interface to track changes in response rates.
It was clear that the efforts put forth by MIRO paid off. Compared to the 2011 administration of NSSE at the University of Hawai'i at Mānoa, the response rate for NSSE 2015 doubled from 16% to 32%. Closing the loop on this project, MIRO posted an online video showing the steps taken to improve survey participation and the university’s favorable response rate compared with those of other institutions. MIRO also compared NSSE responses with enrollment data to demonstrate that the survey sample adequately represented the overall student population across class standing, gender, and race. This final comparison can (a) persuade skeptics of the representativeness of information derived from NSSE and (b) provide strong evidence of the success of campus partners in promoting the survey. These efforts complement MIRO’s other work to expand access to NSSE data.
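A representativeness check like MIRO’s is a standard survey-quality exercise. The report does not describe MIRO’s actual tooling, so the following is only a minimal sketch, assuming hypothetical CSV files of respondent and enrollment records; a chi-square goodness-of-fit test compares the respondent mix to the enrolled population on each characteristic.

```python
# Minimal sketch of a representativeness check: compare the demographic mix
# of survey respondents against overall enrollment. File and column names
# are hypothetical; this is not MIRO's actual implementation.
import pandas as pd
from scipy.stats import chisquare

respondents = pd.read_csv("nsse_respondents.csv")  # one row per respondent
enrollment = pd.read_csv("enrollment.csv")         # one row per enrolled student

for trait in ["class_standing", "gender", "race"]:
    observed = respondents[trait].value_counts()
    # Expected counts if respondents exactly mirrored the student population,
    # restricted to categories seen among respondents and renormalized
    shares = enrollment[trait].value_counts(normalize=True)
    shares = shares.reindex(observed.index).fillna(0)
    expected = shares / shares.sum() * observed.sum()
    stat, p = chisquare(f_obs=observed, f_exp=expected)
    print(f"{trait}: chi-square = {stat:.2f}, p = {p:.3f}")
```

A large p-value is consistent with (though does not prove) a sample that mirrors enrollment on that characteristic.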
MIRO has also created innovative ways to disseminate NSSE findings to different academic units and offices on campus to enhance their capacity for data-based decision making. Outreach efforts include developing interactive data tools to help departments and academic units access NSSE data and conduct data mining in ways that answer specific questions about student engagement. Central to the design and functionality of the web apps that MIRO developed for NSSE data is the ability to “slice and dice” the data based on one or more variables (e.g., gender, race/ethnicity, college, department, and many others). The visually appealing report designs enable users to quickly identify data trends. The office also created customized presentations and video reports for student affairs and academic affairs units to focus on three aspects of student engagement: supportive environment, diverse perspectives, and student accountability. In addition, MIRO hosted face-to-face training sessions on how to use NSSE data (eight sessions in one semester) and developed virtual tools that include video tutorials, scenarios for use, and follow-up surveys. These tools and data sharing strategies have garnered positive feedback from various offices on campus. By placing data into users’ hands, MIRO has made creative, data-driven decision making possible across campus.
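The report does not detail the internals of these web apps, but the core “slice and dice” operation it describes is a grouped summary. A minimal sketch in pandas, with invented file, column, and item names:

```python
# Sketch of the "slice and dice" operation behind an interactive NSSE tool:
# summarize one survey item across any combination of student attributes.
# File, column, and item names are invented for illustration.
import pandas as pd

nsse = pd.read_csv("nsse_item_responses.csv")

def slice_and_dice(df, by, item):
    """Mean and respondent count for one item, broken out by `by` variables."""
    return df.groupby(by)[item].agg(["mean", "count"]).round(2)

# Example: a supportive-environment item by college and gender
print(slice_and_dice(nsse, ["college", "gender"], "supportive_env_item"))
```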
To gain a better understanding of one of the areas identified for improvement, MIRO administered a follow-up survey in July 2016 consisting of five open-ended questions exploring different perspectives on the University of Hawai'i at Mānoa’s supportive environment. Nearly 1,800 students responded, generating close to 9,000 total responses. MIRO created an interactive online reporting tool allowing decision makers to quickly locate responses from different student populations on specific issues and campus services. These qualitative results provided critical and meaningful information from student voices. To generate real campus change using NSSE results, in August 2016, MIRO’s director led a cross-functional Mānoa team at the National Institute for Teaching and Learning, where they used data from NSSE and the supportive environment survey to develop an action plan to enhance the university’s supportive environment for student success.
All of these efforts to put NSSE data into users’ hands and to link data with program improvements provide the Mānoa community with a better understanding and appreciation of the importance and usefulness of NSSE results. With increased awareness, the University of Hawai'i at Mānoa is likely to enjoy an even stronger NSSE response rate in the next administration period, bringing campus decision makers still more NSSE data to use. This healthy and sustainable process works and can be replicated at other institutions.
While recognizing that individual units are in charge of making changes in their educational practice and policy, the institutional research office at the University of Hawai'i at Mānoa serves as an excellent resource in ensuring these decisions are informed by NSSE data. Its investment in both the participation and the data use aspects of survey research provides a blueprint for how users can maximize NSSE data to better serve their students.
Dr. Sharon M. Bailey, the Director of Institutional Research and Effectiveness at the University of Houston–Victoria, created an undergraduate research project with the goal of disseminating NSSE results to the greater campus. For this project, two work-study students devised a creative, interactive approach to sharing results. They designed, printed, and folded paper fortune tellers, a form of origami used in children’s games, featuring the institution’s BCSSE, NSSE, and FSSE results, making several different versions of the fortune tellers based on different research questions. One showed the average number of hours students work for pay (from NSSE) and faculty perceptions of student time working for pay (from FSSE), providing a conversation starter about the differences between actual student behavior and faculty perceptions of student behavior. Another compared students’ expectations about how often they would go to class unprepared (BCSSE) with the percentage of freshmen and seniors who reported going to class unprepared (NSSE).
Using the University of Houston–Victoria’s NSSE and FSSE reports, the students assigned to this project developed key research skills, such as collaborating with peers to coordinate the project and interpret the data, and effective reporting skills, such as identifying important information, reviewing data for accuracy, and tailoring data to a particular audience.
In the guidebook these student researchers developed to help others create similar paper fortune tellers, they wrote, “Look for pieces of data from each survey that would go good together. Make sure the data used would appeal to targeted audience.” The experiences of these students engaged them in communication, quantitative reasoning, and teamwork. Developing this advertisement for NSSE data required creative energy, and the students were proud of their results.
In the end, these two students designed 14 distinct paper fortune tellers with facts from survey results, and they printed and folded more than 300. The fortune tellers were placed on tables in the student cafeteria and at the faculty and staff appreciation lunch, and the extras were used by the alumni office.
The project was successful in getting campus partners who otherwise might not be familiar with NSSE data to actually see some of the results in their hands. Even a year later, Dr. Bailey ran into faculty and staff who remembered the paper fortune tellers and, in spring 2017, she reprised the project, this time with a message encouraging faculty to participate in FSSE.
University of Mary Washington’s (UMW) 2013 Quality Enhancement Plan (QEP)—“UMW’s First-Year Seminar: Research, Write, Speak”—was developed to enhance the first-year seminar experience. UMW had established the First-Year Seminar (FSEM) requirement in 2008, based on NSSE results indicating lower levels of engagement among first-year students. Designed as a three-credit course with a student-faculty ratio of 15:1, FSEM focused on building a skill set for success in a rigorous academic environment, taught in a content-driven context of mutual interest to students and faculty. Topics of these FSEM courses have included Game Theory, Making a Difference, and Race and Revolution.
Since the creation of the required course, student learning at UMW has been monitored via institutional surveys and data, along with NSSE findings. Continued evidence indicated that FSEM could be improved, and this became the focus of the Quality Enhancement Plan advanced as part of UMW’s 2013 reaffirmation of accreditation by the Southern Association of Colleges and Schools Commission on Colleges. For example, results from NSSE 2010 and 2012 indicated that, in most cases, UMW first-year students perceived that their institution contributed less in the areas of writing and integration of ideas compared with first-year students at peer institutions. Other institutional data, such as surveys of admitted students and graduating seniors, corroborated these findings.
In response to the concerns about student writing, the faculty-authored QEP established uniform and measurable learning outcomes for all FSEM courses including, “Improve development and organization of written arguments” and “Demonstrate the ability to edit and revise in the writing process.” Also, under the direction of the QEP office, staff in UMW’s academic learning centers (writing, speaking, and library) developed online learning modules to support student development in the areas of information literacy, writing proficiency, and oral communication. Instruments to measure student learning across FSEM courses included embedded assessments of core learning modules (information literacy, writing, and speaking) and standardized rubrics. Lastly, UMW identified resources to support faculty development in adopting course learning outcomes, incorporating online learning modules, and implementing assessment tools such as rubrics to evaluate student work. To further support this ambitious initiative, the institution made FSEM a premier experience for first-year students by moving all seminars to the fall semester, having the first-year seminar instructor serve as the student’s first-year academic advisor, and attaching a learning community based on the student’s FSEM course assignment. As a result, almost all first-year students live in a residence hall community built around the FSEM course. Results following these changes show increases in student GPA and retention.
University of Mary Washington has a culture of positive restlessness—continually looking for ways to improve the student experience and monitor interventions. As a demonstration of this culture, UMW participated in the NSSE Academic Advising Topical Module in 2014 and 2016, and results indicate strong improvement in advising experiences since involving faculty as advisors for first-year students. Future UMW improvement efforts will include examining changes in behaviors related to the Academic Challenge Engagement Indicator and increasing opportunities for faculty to speak with students regarding career plans.
Every summer and January, in preparation for the upcoming term, the University of Minnesota Duluth (UMD) Division of Student Life holds a retreat for the division directors on a topic both related to the goals of the institution and applicable to the work of the division’s departments and programs—student activities, recreational outdoor sports, student conduct, housing and residence life, diversity and inclusion, and others. In 2016, the retreat included a common reading of Diverse Millennial Students in College: Implications for Faculty and Student Affairs (Bonner, Marbley, & Howard-Hamilton, 2011) and conversations about how these implications related to students at UMD, identifying areas where UMD was successfully meeting the needs of its various student populations and where it might be having some difficulty. Infused into these conversations were UMD’s NSSE 2014 results—with a focus on data related to retention and student success, particularly for students of color. Important findings included the following: UMD’s first-year students of color rated their interaction with staff lower than did their peers at other institutions; first-year female students were more likely to utilize academic support resources than were their male peers; and senior students of color had more outside responsibilities (work, family, etc.) potentially impacting their ability to manage academic commitments than did their peers in other racial-ethnic groups.
These data elicited a number of questions: Is the Division of Student Life doing all it can to proactively meet the needs of diverse populations of students? How should the division retool its approach to recruiting and retaining students of color? Further, how is the division creating a positive environment for all students? At UMD, these questions are being actively considered as the campus’s student population grows increasingly diverse.
With a recent plateau in enrollment after a decade of steady enrollment growth, UMD recognizes that it cannot wait for students to tell the institution what they need. Instead, UMD must adjust its practices to provide a high-quality equitable experience for all students on campus. Moving forward, UMD plans to continue infusing NSSE data into its campus conversations, with the intention of making evidence-based decisions to improve practice.
In an effort to reimagine how NSSE data are shared by distilling actionable and tangible findings from the survey, the Office of Academic Affairs at the University of Nebraska–Lincoln (UNL) reorganized the institution’s NSSE data into four UNL Brief Reports: one for instructors; one for student support services; and one each with results from the Global Learning and the Experiences with Diverse Perspectives Topical Modules. Each report contains an overview of NSSE, a description of strengths within the institution, identified areas for improvement, and a conclusion. What makes this effort so innovative is how these reports are specialized for each audience.
The report for instructors includes findings related to teaching such as (a) Engagement Indicators regarding student-faculty interaction, effective teaching, and quality of interactions; (b) student behaviors related to reading and writing; and (c) the degree to which students engage in discussions with diverse others or perceive the campus environment as supportive. Among the highlighted strengths in results for the three Engagement Indicators are the higher means for UNL seniors compared to seniors from other Big Ten and Regents institutions.
The instructors’ report also includes “A Closer Look,” a section detailing UNL’s item-level successes (for example, the rate of first-year students who reported “talking about career plans with a faculty member” was 6% higher than among first-year students at peer institutions). The report provides a similarly granular look at the areas for improvement: first-year students gave significantly lower ratings on whether instructors (a) clearly explained course goals and requirements, (b) taught course sessions in an organized way, and (c) used examples or illustrations to explain difficult points.
This specialized report is important for instructors, for whom assessment of student learning is only part of their job. For them, relating NSSE information to their work can be overwhelming due to the massive amount of data presented in the institutional report. By detailing the ways faculty are succeeding or could better enhance the students’ educational experience, the report presents data in a digestible format featuring only the most useful information. Furthermore, this strategy creates specialized tools for the academic affairs staff to use when working with either staff or faculty. The report for instructors and the other UNL specialized reports are excellent demonstrations of assessment experts lowering barriers between data and those who can act on data.
Information literacy has become a growing priority and a new core competency for the University of San Diego (USD), where it is recognized as a student learning outcome spanning all disciplines and critical to the success of all USD graduates. Information literacy is also emphasized in the Western Association of Schools and Colleges Senior College and University Commission (WSCUC) accreditation standards as a core competency that prepares students for future careers and life-long learning.
Therefore, in 2011, USD began core revision work for assessing students’ progress in this area. The first step consisted of assessing the baseline level of students’ information literacy skills. Subsequent pilot interventions sought faculty volunteers who worked closely with a librarian in an effort to demonstrate how various disciplines could incorporate information literacy into their courses. To raise faculty awareness of the need for information literacy training, these faculty-librarian teams assessed the strategies employed during the pilot stage.
In 2015, USD appended the Experiences with Information Literacy Topical Module to their NSSE administration. The module’s findings served two main purposes. First, they represented a baseline for how students perceived information literacy and responded to the institution’s prioritizing of information literacy. Second, the findings could be used to encourage faculty and staff across the institution to recognize the importance of focusing on this area. For example, one module finding was that many first-year students did not perceive that key information literacy outcomes or skills were embedded in their courses. USD considered this problematic, as students should be developing these skills in all of their courses.
As follow-up interventions, USD librarians developed a set of curricular offerings to help faculty and their students acquire information literacy skills; USD core curriculum faculty incorporated the teaching of information literacy skills into the historical inquiry requirement; and, specifically to address the development of these skills in the first year, USD hired a writing director to work closely with librarians to ensure information literacy becomes a core piece of the first-year experience.
USD is also working with faculty to explicitly deepen students’ awareness of the importance of gaining information literacy skills. For example, a faculty member in engineering identifies and describes information literacy skills to students as a part of the course and carves out time to articulate to students what to focus on to gain these skills by completing course assignments (e.g., research paper). When these connections are made explicit, students appear to be more engaged in the learning process. It is important to USD that faculty members as well as staff across the institution are involved—making increasing students’ skills truly an institutional effort.
USD plans to implement the Experiences with Information Literacy Topical Module again to monitor progress in student awareness since designating information literacy an institutional core value. These results will also be incorporated into future accreditation reports in the discussion of WSCUC’s five core competencies. The evidence gleaned from this NSSE module will strengthen USD’s goal to equip students with the knowledge and skills foundational to 21st-century higher education.
Communications, assessment, and senior leadership from the Division of Student Life at the University of Toronto (U of T) seek to share information on the success and influence of the university’s educational programs. Although increasing student participation in High-Impact Practices (HIPs) is a major goal for the institution, presenting data in a way that inspires interest and change among educational units has been challenging. Through new, compelling data visualization techniques, however, NSSE data have been used to show the relationship between participation in HIPs and student satisfaction and engagement and to generate interest in and conversation about HIPs across campus.
Figure 10 links HIP participation to responses to the survey question, “If you could start over again, would you go to the same institution you are now attending?” Results indicated a small increase (5%) in affirmative responses among first-year students who participated in one HIP compared with those who participated in none. However, the increase among seniors who participated in at least two HIPs was substantial (18%) compared to seniors who participated in none. Simply put, seniors who participated in at least two HIPs were more satisfied with their university than those who did not participate at all in HIPs. Reimagining these data in a new, succinct display allows educators to clearly understand this relationship and creates an enticing narrative for stakeholders to articulate the value of these educational programs.
Figure 11 displays more detailed differences in engagement between seniors who participated in at least two HIPs and those who did not participate in a HIP. The results of this analysis indicate higher scores on each of the ten NSSE Engagement Indicators for seniors who participated in HIPs, particularly in the areas of student-faculty interaction, collaborative learning, and quantitative reasoning. The layout of this display is easy to grasp and clearly communicates the message that students who participate in HIPs are more engaged than those who do not. The image also allows the viewer to easily see the degree to which HIP participation is associated with higher student engagement on each indicator. This neat and simple graphic of a seemingly complex relationship clarifies a key point: Students who participate in HIPs are more engaged.
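The report does not include the chart code behind Figure 11, but a display in this spirit is easy to sketch. The following matplotlib example uses the ten NSSE Engagement Indicator names with invented placeholder scores; the actual U of T values are not reproduced here.

```python
# Figure 11-style comparison of Engagement Indicator scores for seniors with
# two or more HIPs versus none. The scores below are invented placeholders.
import matplotlib.pyplot as plt
import numpy as np

indicators = ["Higher-Order Learning", "Reflective & Integrative Learning",
              "Learning Strategies", "Quantitative Reasoning",
              "Collaborative Learning", "Discussions with Diverse Others",
              "Student-Faculty Interaction", "Effective Teaching Practices",
              "Quality of Interactions", "Supportive Environment"]
no_hips = np.array([35, 33, 36, 25, 30, 38, 18, 37, 40, 33])
two_plus = np.array([39, 38, 40, 30, 37, 41, 26, 40, 43, 37])

y = np.arange(len(indicators))
fig, ax = plt.subplots(figsize=(8, 5))
ax.barh(y - 0.2, no_hips, height=0.4, label="No HIPs")
ax.barh(y + 0.2, two_plus, height=0.4, label="2+ HIPs")
ax.set_yticks(y, labels=indicators)
ax.invert_yaxis()  # first indicator on top
ax.set_xlabel("Engagement Indicator score (0-60 scale)")
ax.legend()
fig.tight_layout()
fig.savefig("hip_engagement.png", dpi=150)
```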
With design support from their communications team, senior leaders in the Division of Student Life at the University of Toronto have shared these visualizations and data across the university—with individual departments, faculty members, registrar staff, librarians, and student life staff. As a result of these new data formats, campus conversations about the implementation of HIPs have grown broad and deep. The visualization of these data presents a robust case for the importance of HIPs, moving educators past “Why do them?” to “How can we best do them?”
William Jewell College is an intimate college in Liberty, Missouri, and a longtime participant in NSSE and BCSSE. Because of this long-standing commitment to the surveys, components of these instruments have been embedded in discussions about the curriculum, in efforts to improve instructional practice, and in advising conversations. For example, stagnant senior scores on NSSE’s Academic Challenge Engagement Indicator led to an initiative to disaggregate these data by major and to have conversations with academic departments to raise their awareness of the survey results. These conversations stimulated course-level adjustments within departments that resulted, in subsequent surveys, in seniors in most programs reporting higher scores on the Academic Challenge indicator.
The institution also uses BCSSE scores to facilitate relationship building between the academic advisor and first-year advisee by asking them to discuss differences between the first-year student’s expectations of the college experience and their high school experiences and behaviors, drawing on the corresponding BCSSE items. At William Jewell College, as at many similar institutions, faculty serve as advisors for students. BCSSE information provides guidance for faculty on how to best support students. Advisors are asked to pay particular attention to students who plan to spend less than 15 hours a week studying, more than 10 hours a week working, or more than 10 hours a week participating in co-curricular activities. They are also asked to pay attention to low self-ratings within the sections of Expected Transition Difficulty or Academic Perseverance.
William Jewell College also exemplifies how to use NSSE Topical Module data to guide curriculum development and resource allocation. Stakeholders at the institution leveraged data from the Experiences with Diverse Perspectives module to enhance a ten-year plan to increase campus structural diversity, foster interactions around diverse topics, and become more inclusive. Results from years of collecting data from this module indicated that students at the institution were less likely than the institution desired to engage in activities or to participate in conversations regarding societal differences. Although comparisons showed that the institution’s data were similar to those of peer institutions, the college aimed for even better performance. As a result, the faculty approved adding a required common course on identity and society for all new students (starting Fall 2017) and requiring those students by the time of graduation to complete two approved diversity and inclusion courses (at least six credits), one on diversity in the US and the other on global diversity.
The college also administered the Learning with Technology module and found that first-year students’ reported levels of technology use in high school have shifted over the last few years. This information informs ongoing changes in how the institution integrates digital resources, leverages a digital commons, and maximizes its one-to-one mobile initiative that provides all students an iPad. Educationally effective technology use has become so ubiquitous at the institution that Apple has designated William Jewell College a Distinguished School for its “innovation, leadership, and educational excellence.”
Lessons from the Field—Volume 3: Using Data to Catalyze Change on Campus
Creative Survey Promotion
Anderson University (AU), a private university in Anderson, Indiana, has participated in six NSSE administrations, including the 2015 administration. In previous NSSE administrations, the campus offered incentives for NSSE participation, such as a drawing for iPods and gift certificates. The response rate was satisfactory, but not at the level the campus would have liked. So, for the 2012 NSSE administration, the campus decided to take a new approach to survey incentives.
As they prepared for their 2012 administration, like many campuses, AU had already experienced budget cuts and so did not have much funding available for NSSE incentives. After some creative thinking, the campus decided to draw on its values as a smaller institution. With just over 2,000 undergraduate students, the campus embraces personal connections and relationships, so it decided to take a personalized approach to its NSSE incentive prizes. The revamped, personalized incentives were a huge success in 2012, with AU achieving a 62% response rate.
In preparation for their 2015 administration, the director of campus assessment decided to take the same approach. She reached out across the entire campus to solicit donations for incentives. The goal was to have every campus department and office participate and donate. Many offices committed to donate baskets of baked goods, breads, or cookies, which could be offered as incentives to the students. A faculty member who is a black belt in karate donated complimentary karate lessons as a prize. A gourmet chef on campus donated a custom-prepared meal at the chef’s home for the winning student and a number of friends. Parking services donated a parking pass for the next term. Another prize was a personal “cake” day with an administrative department. Prizes were promoted on posters (see Figure 10) and were awarded weekly, at the required chapel on campus, via a drawing of students who had completed the survey. In addition to being less expensive than gift certificates and technology devices, these more personalized prizes emphasized the value of relationships ingrained in the AU culture. Anderson’s quirky, customized incentives contributed to the institution’s high response rate.
Using Results to Improve Educational Practice
Bethel University, an evangelical Christian university in St. Paul, Minnesota, has participated in 10 NSSE administrations. NSSE results have been used in various ways at Bethel, including to provide evidence of students’ active learning and senior students’ satisfaction with their educational experience and to promote innovative instructional practice.
After Bethel’s participation in NSSE 2013, the campus prepared two reports. The first report contextualized NSSE by explaining what the results mean, comparing the 2013 results with results from Bethel’s 2010 administration and identifying areas of strength. The second report, using first-year and senior NSSE responses, identified four themes for campus improvement and listed specific actions that faculty could take to improve their results, specifically targeting the existing campus initiative to improve student writing and integrative thinking. These two reports were shared with campus faculty and administrators at various department, committee, and staff meetings. Examples from both follow.
NSSE 2010 Bethel student responses to “received feedback from an instructor on a draft or a work in progress before submitting the final product” caught faculty attention, and they flagged this item as a target area for improvement. Informal follow-up revealed that students expected feedback from faculty as quickly on a five-question quiz as they did on a 20-page research paper. Recognizing this as an opportunity to improve, the Program Assessment Committee connected with the Faculty Development Committee to create workshops addressing the issue of timely feedback. The committees collaborated to send an email to the all-faculty listserv with steps for discussing feedback expectations for different assignments with students. Going further, the Faculty Development Committee developed a workshop on “how to give helpful feedback to students.” For both first-year and senior students, Bethel’s 2013 NSSE results on this item improved over the 2010 results, and in 2013 their students’ responses on this item were more similar to those of comparison groups.
Bethel’s NSSE 2013 results revealed that both first-year students and seniors were significantly less likely to say they worked harder than they thought they could to meet an instructor’s expectations compared to students at colleges and universities across the country. Additionally, seniors were significantly less likely to say that their courses during the current school year challenged them to do their best work compared with seniors nationwide. The NSSE results further revealed that both first-year and senior students were much less likely to say they carefully revised the content or organization of a paper or assignment before turning it in compared to students nationwide. In response to these results, faculty were encouraged to consider their current expectations for students and how challenge could play a greater role in the classroom experience.
High-Impact Practices (HIPs) were highlighted as an area of great strength in Bethel’s NSSE 2013 results, with 89% of Bethel seniors participating in at least two HIPs, including service-learning, research with faculty, internship or field experience, learning communities, and culminating senior experiences. The importance of HIPs reinforced Bethel’s commitment to building a culminating senior experience into almost all majors. Moreover, Bethel was pleased to find that, compared to seniors at other institutions, its seniors write more during a school year, spend more time studying per week, and participate more in co-curricular activities. Bethel’s data use example illustrates how NSSE results can inform faculty development agendas and, more important, can influence improvements in teaching and learning. Through future NSSE results, Bethel plans to continue monitoring progress in addressing its areas for improvement and sustaining its strengths.
Creative Survey Promotion
With approximately 15,000 undergraduate students, Boston University’s overall response rate for NSSE 2014 was 59%, exceptionally high for an institution of its size (the average response rate among similar size institutions in 2014 was 22%). Boston University (BU) attributes its high response rate to (1) marketing and communication efforts, (2) a convenient and guaranteed incentive, (3) collaborative efforts across campus, and (4) BU students’ desire to provide constructive feedback.
To plan its NSSE administration, BU formed a collaborative, interdepartmental committee with members from student affairs, residence life, student life, the provost’s office, institutional research, marketing and communications, and the faculty. Based on both previous survey experience and recommendations from the committee, BU personnel decided to promote NSSE extensively through multiple channels, including posters, table tents, mailbox stuffers, signs on shuttle buses, newspaper articles, tweets from the office of the dean of students, and in-class announcements. Marketing efforts began prior to the survey launch and were sustained throughout the administration. Student leaders were also a part of the promotion, as resident assistants kept their respective communities updated with current response rate information. Additionally, all students who completed the NSSE survey were provided a $5 credit on their campus card. BU faculty and staff invested time, effort, and resources in their NSSE administration, and it clearly paid dividends in a high response rate.
With the support of a new president, BU administrators sought to push the institution into the next tier of student engagement. A team of professionals from marketing and communication, residence life, the dean of students office, individual colleges, and the institutional research office reviewed BU’s retention rates and found them lower than desired. This team, the campus’s Student Engagement and Retention Group, identified NSSE as a way to benchmark student engagement within individual colleges, particularly around advising. As the main sponsor behind BU’s first NSSE administration, the team reviewed the results first and immediately created a plan to share the data widely with the provost’s cabinet and the deans and faculty within BU’s nine undergraduate colleges.
Administrators and faculty at BU found that data presented in the NSSE reports were intuitive, helpful, visually attractive, and easy to reproduce. The Student Engagement and Retention Group used BU’s NSSE Snapshot as a primer for the university. However, with over 33,000 students at the campus, the team identified college- and department-level data as most important to improving outcomes. Thus, for more precise information regarding student advising experiences, BU disaggregates their data by college and major.
Communicating NSSE Results with Data Visualization
With 47 items making up the NSSE Engagement Indicators, communicating results to a university-wide audience during an agenda-filled meeting can be a challenge. Chaminade University of Honolulu uses color to communicate NSSE results in lieu of text, tables, or multiple charts and to visibly highlight trends and display differences (e.g., effect sizes). For the purposes of illustration, sample data from the fictional NSSEville University are used in displays in place of Chaminade results.
The Chaminade faculty core competency assessment committee was interested in reviewing trends within the NSSE Engagement Indicators as part of their assessment cycle. With simple spreadsheet conditional formatting, they displayed multiple years of all 47 Engagement Indicator items as a single color index, or “heat map.” The colors purple, green, and yellow were used because they stand out; however, any color scheme works. For trend analysis they used only two colors: green where the university is above a comparison group and yellow where it is below. By restricting the display to just two distinctive colors, a single image makes it possible for viewers to rapidly answer the question: “From year to year, where is the university consistently higher and consistently lower than the comparison groups?” (See Figure 3 for a sample color index using NSSEville data.)
For comparison of differences, they used heat maps—matrix representations of data that display individual values and the magnitude of differences in color. Purple denoted the largest favorable differences, with smaller differences fading into green and then yellow, which marked the largest unfavorable differences. The range of colors enables all viewers to quickly identify where the university has strengths (brightest purple) or weaknesses (brightest yellow) with respect to the NSSE Engagement Indicator items. (See Figure 4 for a sample heat map representing NSSEville data.)
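Chaminade built these displays with spreadsheet conditional formatting rather than code, but the same two views can be sketched with pandas styling. In the sketch below, the values are randomly generated placeholders, the row labels are a handful of Engagement Indicator names used as stand-ins, and the colors only approximate the purple-green-yellow scheme described above.

```python
# Sketch of the two-color trend index and the graded heat map described above.
# Values are random placeholders; Chaminade used spreadsheet conditional
# formatting rather than Python. Requires pandas >= 2.1 (Styler.map) and
# matplotlib (for the background_gradient colormap).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
diffs = pd.DataFrame(
    rng.normal(0, 0.2, size=(5, 3)).round(2),  # institution minus comparison
    index=["Higher-Order Learning", "Reflective & Integrative Learning",
           "Collaborative Learning", "Student-Faculty Interaction",
           "Quality of Interactions"],
    columns=["2011", "2013", "2015"],
)

# Trend index: two colors only -- green above the comparison group, yellow below
trend = diffs.style.map(
    lambda v: "background-color: #7bc77b" if v >= 0 else "background-color: #f2e15c"
)

# Heat map: a continuous scale; viridis_r runs yellow (low) to purple (high),
# so the largest favorable differences show as the deepest purple
heat = diffs.style.background_gradient(cmap="viridis_r", axis=None)

trend.to_html("trend_index.html")
heat.to_html("heat_map.html")
```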
This color display and communication method has been used at Chaminade University of Honolulu to facilitate campuswide discussion and rapid interpretation of the vast amount of information contained in the NSSE reports.
Transparency and Sharing Outcomes
In October 2014, Denison University, in Granville, Ohio, launched on its website a new page, Student Outcomes: The Denison Difference, devoted to assisting internal and external audiences in understanding the value of a Denison education. The page displays results from Denison’s NSSE 2014 administration in an innovative and interactive format. Combining the use of internal survey data, acceptance rates, and alumni narratives with NSSE results, the page is a comprehensive marketing web resource that captures positive student outcomes at Denison.
Denison has identified 13 core undergraduate student outcomes, derived from the institutional mission and general education requirements and developed by a committee of faculty and student affairs staff (Figure 5). These are Civic Life, Global Perspective, Difference Among Persons, Issues of Power & Justice, Agency, Identity Awareness, Quantitative Reasoning, Oral Communication, Written Communication, Analytical Thinking, Critical Thinking, Creative Thinking, and Ethical Thinking. On the new webpage, these outcomes are arrayed around a colorful wheel and, when selected, reveal corresponding NSSE data that demonstrate how students at Denison reach that core student learning outcome. For example, analysts combined response proportions from items associated with Quantitative Reasoning to demonstrate students’ experiences using quantitative skills (Figure 6). According to their analysis, 63% of seniors at Denison reported that their experience has “contributed ‘very much’ or ‘quite a bit’ to their knowledge, skills, and personal development in analyzing numerical and statistical information.” Denison also aggregated responses to NSSE items about how students spend their time, including the number of hours spent participating in co-curricular activities and volunteering, to create a profile of civic life among seniors. Additionally, Denison presented NSSE results indicating that 93% of seniors spent at least one hour a week participating in co-curricular activities (student organizations, campus publications, student government, fraternity/sorority, intercollegiate or intramural sports) and 59% of seniors completed at least one hour per week of community service or volunteer work, with the average senior completing two and a half hours of community service each week, demonstrating Denison students’ high levels of co-curricular and community engagement.
NSSE data help Denison’s administrators assess the achievement of their core student learning outcomes and align their work to the institutional mission and commitment to liberal arts education. Also, as this example shows, NSSE data help the university communicate their accomplishments to the external community.
Custom Reports to Shape Expectations for Student Engagement
Drake University, in Des Moines, Iowa, has reviewed its 2013 NSSE results, raised critical questions about how those results align with expectations for student engagement, and used the findings to prompt discussions about where it wants its students to be. To better communicate NSSE data and to translate results into something that would encourage change and action at the college and school level, the university disaggregated the data and created reports for individual colleges and schools. In addition to recreating many of the NSSE aggregate reports, they also recreated their NSSE Snapshot, highlighting the top five items in which each department or college excelled and the five areas in which it fell below its peer comparison groups.
For initial analysis, they used the NSSE Report Builder to do some benchmarking and to create a structured report for each college and school. With the Report Builder, the Office of Institutional Research and Assessment first ran analyses by undergraduate major and exported the data to Excel. From there they created a template so that the reports could be easily replicated across all of Drake’s many colleges and schools. They programmed each spreadsheet to automatically highlight the cells containing the five highest and five lowest items for first-year and senior students. These custom reports focused on comparative data in two ways—national benchmarking based on major field and internal benchmarking with Drake students in different colleges. For some of the colleges and schools, they also disaggregated the data to align with the outcomes areas for the college or with disciplinary accreditation standards.
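Drake implemented this flagging with programmed Excel templates. To illustrate the same top-five/bottom-five logic under invented file and column names, a pandas version might look like this:

```python
# Sketch of top-five/bottom-five flagging in the spirit of Drake's Excel
# templates. File and column names are invented; "difference" stands for the
# college mean minus the comparison-group mean on each NSSE item.
import pandas as pd

report = pd.read_csv("college_item_report.csv", index_col="item")

top_five = report["difference"].nlargest(5).index      # strongest items
bottom_five = report["difference"].nsmallest(5).index  # weakest items

def flag(row):
    """Return a CSS fill for every cell in the row, keyed on the item name."""
    if row.name in top_five:
        return ["background-color: #c6efce"] * len(row)  # green, as in Excel
    if row.name in bottom_five:
        return ["background-color: #ffc7ce"] * len(row)  # red
    return [""] * len(row)

styled = report.style.apply(flag, axis=1)
styled.to_excel("college_report.xlsx")  # writing styles requires openpyxl
```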
To share the reports, staff from the Office of Institutional Research and Assessment set up face-to-face meetings with deans and associate deans to discuss and present the data. Disaggregated results by major were shared side by side with comparative data. The comparative data helped administrators see how the results aligned with areas of need or current priorities and informed discussions about ways to both use and share the data. In response to these reports and meetings, several colleges and schools have taken action. For example, the College of Business and Public Administration has undertaken a review of its curriculum in order to enhance students’ development of writing skills. The School of Education and School of Journalism and Mass Communication have started reviews of their curricula to look for opportunities to enhance students’ development of quantitative skills. Additionally, these conversations led to support for the faculty senate to review the core curriculum and contributed to a recommendation to examine the developmental sequence for greater focus on integrative learning. Reports were also prepared at the institution level to share with several campus groups, including the faculty senate, deans’ council, and academic affairs council. Looking at their Collegiate Learning Assessment (CLA) and NSSE results together, these groups observed an integrative learning gap for Drake students.
Drake’s NSSE Snapshot revealed that Drake students’ ability to take on diverse perspectives was low in comparison to that of their peers. Specifically, while Drake students were comfortable experiencing different cultures, they were less comfortable discussing values with others and trying to understand somebody else’s perspective.
These results were shared with the Academic Affairs Leadership Group, which includes representatives from student life and the assessment committee. To address this issue, a group was tasked with looking at how discussions with diverse others could be addressed systematically in Drake’s core curriculum. Additionally, they are looking across other internal surveys and existing data to contextualize this discussion. Overall, in response to this issue, campus conversations have shifted to finding ways to better emphasize critical thinking and integrative learning.
Drake is also using NSSE data to examine effective practices that support student engagement and learning. The graphic display in Figure 2 was created to illustrate the positive gains in engagement at Drake related to participation in High-Impact Practices (HIPs) and mirrors an approach used in an AAC&U publication, Assessing Underserved Students’ Engagement in High-Impact Practices. The display offers an accessible way to discuss HIPs that influence student-faculty interaction with the goal of determining strategies for increased impact across the university. Using a combination of data triangulation, customized report creation, and sharing results with relevant campus committees, Drake has developed a clear picture of the student experience and where it can be enhanced.
Informing Strategic Action with Student Engagement Results
In its most recent reaccreditation review with the Middle States Commission on Higher Education, Gettysburg College, a four-year residential arts and sciences college in Gettysburg, Pennsylvania, was commended by the visiting team for its exemplary and innovative practices of effective, systematic use of indirect assessments, including NSSE, and for improving student learning. The visiting team commented, “Assessment data were among the motivators for considering improvements and change; multiple constituencies engaged in the discussions and established new goals and benchmarks; resources were allocated, despite unexpected constraints after the 2008 financial crisis; and new data demonstrate some significant achievements in each area.”
Gettysburg’s strong, systematic use of NSSE results and, in particular, its use of data to inform change, is fostered by the regular review and consideration of NSSE and other survey results by a wide range of groups and committees, including the President’s Council, the Committee on Institutional Effectiveness, the Committee on Learning Assessment, and task forces (such as the Academic Advising and Mentoring Task Force and the Task Force on the Intellectual Life of First-Year Students), as well as at faculty meetings and at divisional meetings and retreats.
The comprehensiveness of sharing NSSE results helped inform the development and refinement of the college’s strategic action goal on engagement, which states, “Gettysburg College will promote intellectual development, engagement, and leadership through active and innovative learning experiences.” Within this goal the college detailed several subgoals and implementation strategies, including academic rigor/level of academic challenge and high-impact learning opportunities. Under these approaches, many initiatives have been expanded or created to address areas of concern identified in NSSE and other assessments.
Two examples of NSSE data use related to the engagement strategic action goal highlight Gettysburg’s practice. Results for Student-Faculty Interaction suggested the potential to enhance student participation in faculty-mentored research. In response, the college has prioritized increasing support for student summer research and senior projects, expanding opportunities for student travel to academic and professional conferences, and providing a place to showcase student research and creative work. In recent years, the college has expanded student participation in research through increased financial support and new initiatives.
For example, the college launched Celebration—The Annual Colloquium on Undergraduate Research, Creative Activity, and Community Engagement to provide ongoing opportunities for students to present the results of their undergraduate research and creative projects to their faculty, peers, and others. Celebration brings together a wide range of engaged and energized students as they showcase their work from capstone research, independent study, coursework-related research, study abroad, service learning, and the arts. The investment in faculty-mentored research has paid off, with more faculty and students reporting their participation in this high-impact practice.
A second effort related to addressing Gettysburg’s engagement goal relied on NSSE results for student participation in internships, which were lower than expected. The college used these results as a call to action and has greatly expanded career-preparation programs for students through an initiative called “The Career Connector Challenge” and through closer collaboration between the development and alumni office and the Center for Career Development. The Career Connector Challenge was launched in 2010 by Gettysburg’s Center for Career Development to create new career-related opportunities for students, including networking dinners, summer internships, shadowing opportunities, and informational interviews, through an intensive campaign among alumni, parents, and friends. Since this initiative was launched, student-reported participation in career internships, externships, and job-shadowing experiences has increased, and Gettysburg’s NSSE 2014 results affirmed that student internship participation as reported by seniors now exceeds that of their Carnegie Classification peers (earlier results showed that Gettysburg was similar to its peers on this measure).
Informed by NSSE results and other assessments, the college has greatly increased its support for faculty-mentored research, internship, and other high-impact learning experiences through resource allocation and new initiatives. Gettysburg College will continue to monitor its progress through various benchmarking assessments, including their next NSSE administration.
Staff from the Office of Institutional Research and Assessment at Holy Family University (HFU), in Philadelphia, Pennsylvania, coordinated two lunch-and-learn sessions on campus to introduce NSSE and the Faculty Survey of Student Engagement (FSSE), to share 2013 survey results, and to encourage faculty and staff to use the results in campus assessment and improvement projects. The first session, focusing on NSSE, began with a presentation about what NSSE is, why the campus participates, how the NSSE instrument has changed, and HFU’s participation history. Staff shared their gains from NSSE participation, highlighting the reports and resources from their latest administration along with results demonstrating the link between NSSE’s themes and HFU’s mission. The opening presentation concluded with examples of other institutions’ uses of NSSE results (from Lessons from the Field, Volumes 1 and 2). For the interactive portion of the session, the staff split the audience into two groups—one taking the role of first-year students and the other the role of seniors. Each group was tasked with predicting HFU student responses on Engagement Indicator items and how these would compare to comparison-group responses. As actual results were revealed, attendees discussed how they differed from the predicted results, why that might be, and how the campus could work together to improve student engagement. For the final portion of the session, the whole audience, taking the role of seniors, predicted senior responses on the High-Impact Practice items.

HFU’s second lunch-and-learn session introduced FSSE and detailed why HFU participates, presented results in HFU’s NSSE–FSSE Combined Report, discussed differences between faculty and student responses, and generated suggestions from the results for improving instructional strategies. Following up on these sessions, institutional research and assessment staff created an internal Blackboard webpage for faculty and staff displaying both NSSE and FSSE reports.
Sharing and Responding to NSSE Data
Mills College is a small, selective liberal arts college for women located in Oakland, California. The college has participated in several NSSE administrations, including 2014, and has also participated in the Development of Transferable Skills Topical Module.
Mills creates infographic displays of data results as standard procedure for all of the surveys in which it participates. Its NSSE infographic was created after receiving the 2014 results and highlights NSSE items with Mills student results side by side with those of their peer institutions. In providing snippets of data, via text or a small table, the infographic communicates NSSE results to help all members of the Mills community better understand “who they are.” The infographic also demonstrates to the students that the administration is doing something with the data and using the results.
Copies of the infographic were printed and displayed all over campus, and digital copies were shared directly with all members of the Mills executive cabinet. Further, the infographic was sent directly to offices and individuals who would find the information particularly relevant to their work. For example, the infographic and Development of Transferable Skills Topical Module results were shared with the admissions and career services offices. Admissions has used the infographic when speaking with potential students. Career services plans to discuss transferable skills results with students, pairing them with individual experiences and coursework at Mills and exploring how skills such as verbal and written fluency and analytic inquiry relate to the job market. Additionally, the alumnae office has found the infographic to be very useful for their work and communication with Mills alumnae.
After the infographic was released, a full report on NSSE results was prepared and shared with the cabinet. The report was also posted on the campus intranet for all faculty and staff to access, generating more interest and campus activity around this report than ever before. The NSSE results led to a campus discussion around student social interaction and oral presentations. Specifically, the faculty were interested in the student responses to the questions about giving a course presentation. Responses revealed scores that were lower than Mills faculty would have liked, and that first-year students gave more class presentations than seniors did.
Faculty enthusiastically became involved in follow-up assessments to NSSE results. The campus is now engaging in an assessment to better understand how and where change might occur to increase the occurrences of oral presentations embedded within the general education curriculum. A committee of faculty, staff, and student representatives is collecting evidence to inform recommendations for action, including about 25 hours of recordings of culminating senior seminar presentations. Faculty will use a rubric (informed by the rubrics of MIT and the Association of American Colleges and Universities) to evaluate the oral presentations; their feedback will be shared with the committee to shape recommendations for refining the general education curriculum. The curriculum reform will reach across general education and into the disciplinary areas.
The next steps for the committee will then be to map where oral presentations are happening currently and to host faculty workshops on developing student oral presentation skills. The committee plans to map how oral presentation is first introduced in general education and how its development and practice continue within each degree program.
In addition to responding to the student oral presentation concern, Mills College was interested in responding to concerns about service-learning and deep-level processing. Mills 2014 NSSE results revealed that there was room to increase the number of opportunities for students to participate in service-learning. During spring 2015, a handful of designs to better integrate service-learning into general education were piloted. The curriculum transformation committee is also looking at NSSE results to shed light on deep-level processing, specifically, responses to “During the current school year, about how much of your coursework emphasized the following: analyzing an idea, experience, or line of reasoning in depth by examining its parts?”
The most significant factor in helping faculty become receptive to NSSE results was the insight gained through examining them, which informed existing campus conversations on general education reform. With the current strategic plan calling for an overhaul of general education, the curriculum transformation taskforce was very interested in what could be learned from NSSE results, especially the questions on oral presentations, quantitative reasoning, and social interaction. Faculty found the Engagement Indicators to be actionable because they provided insight into what the faculty could do better. Mills has always done well on the NSSE construct of Level of Academic Challenge, but the updated survey provided additional insights into how students are being challenged. Additionally, the release of the college’s NSSE infographic, followed by the detailed report, may have helped gain attention and build interest in the results. Individuals at Mills are paying more attention to the NSSE reports and are interested in discussing the results. The combination of the campus already prioritizing curriculum transformation, via the strategic plan, and the release of their NSSE 2014 results seemed to be perfectly timed to focus attention on using results.
Introducing the Campus Community to NSSE
Nazareth College, a religiously independent college with 2,000 undergraduates, located in a suburb of Rochester, New York, has participated in five NSSE administrations, including the updated survey in 2013. Institutional research (IR) staff at Nazareth implemented a comprehensive approach to introducing NSSE to the campus. To increase awareness of the upcoming NSSE administration and to stimulate student participation, they alerted faculty who taught primarily first-year or senior-level courses and encouraged them to mention the survey in their classes. Timing this email alert with the survey invitation schedule added relevance to the in-class reminders.
Following Nazareth’s successful administration, and before the college received NSSE results, IR staff distributed copies of the NSSE instrument to faculty and staff, inviting them to consider first what they wanted to learn from the results. In addition, an IR staff member brought copies of the survey instrument to a meeting of all campus directors to get the attention of campus leaders and to spark their anticipation of the arrival of results. The IR staff goal was to create widespread understanding of what NSSE data could tell them.
The NSSE Snapshot from the college’s Institutional Report was shared with two campus groups at Nazareth. The first group to consider the results was the President’s Council, composed of individuals representing each academic division as well as staff and administrative offices across campus. The Snapshot was then shared with a wider campus audience including all directors of programs and units. The goal was to create a campus-wide understanding of how the data could help them learn about the undergraduate experience. Different aspects of reporting on the Engagement Indicators (EIs) were discussed in these meetings, including the box-and-whisker charts, which in addition to demonstrating an admirable mean score also displayed a range of experiences among students. Drawing from the faculty and staff discussions of the Snapshot, institutional researchers reviewed their data to look at students who stayed and those who left. Narrowing this examination to students who left with a 3.0 GPA or better, IR staff found that these students scored very low on survey items related to effective teaching and, in particular, the organization of instruction. Faculty examined these findings more fully and considered ways of responding.
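For IR offices that script this kind of filter-and-compare analysis rather than running it in a statistical package, a minimal sketch follows. Every file and column name is a hypothetical stand-in (the ET* item names mimic NSSE codebook conventions but should be verified against the institution’s actual data file), and the persistence flag is assumed to come from merged registrar records:

```python
import pandas as pd

# Hypothetical merged file: NSSE responses joined to registrar records.
# 'retained' is assumed to be a 0/1 persistence flag; 'gpa' a float.
df = pd.read_csv("nsse_with_registrar.csv")

# Effective-teaching items (names follow NSSE codebook style; verify
# against the institution's data file before use).
teaching_items = ["ETgoals", "ETorganize", "ETexample", "ETfeedback"]

# The group Nazareth examined: students who left despite strong grades.
left_high_gpa = df[(df["retained"] == 0) & (df["gpa"] >= 3.0)]
stayed = df[df["retained"] == 1]

comparison = pd.DataFrame({
    "left_gpa_3plus": left_high_gpa[teaching_items].mean(),
    "stayed": stayed[teaching_items].mean(),
})
comparison["difference"] = comparison["left_gpa_3plus"] - comparison["stayed"]
print(comparison.round(2))
```

Consistently negative differences for the high-GPA leavers would point, as Nazareth found, to instruction-related items worth deeper faculty examination.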
To create more actionable and tailored student engagement reports for departments, the IR staff generated customized reports using the campus’s production reporting tool, SAP Crystal Reports. The first page of these reports displayed each department’s student response rate alongside the rates for the whole campus and for peers in a comparison group. The department reports similarly broke out the EIs, showing the responses for all items in each EI, again with a side-by-side display of results for the department, the institution, and the comparison group. In total, IR created 20-page reports (including results for certain questions by residency status, athletes, etc.) for about 22 different departments across campus. To follow up after the reports were distributed, IR staff conducted individual meetings with department faculty to clarify findings and to examine specific differences for students in these programs. In particular, the IR staff helped faculty make sense of data from small departments.
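Campuses without a production reporting tool such as SAP Crystal Reports could approximate these department reports with a short script. The sketch below is illustrative only; it assumes a respondent-level file with a department column, a cohort flag separating institutional from comparison-group records, and Engagement Indicator scores already computed:

```python
import pandas as pd

# Hypothetical respondent-level file; all names are illustrative.
df = pd.read_csv("nsse_respondents.csv")
ei_columns = ["HO", "RI", "LS", "QR"]  # a subset of EI scores

campus_means = df.loc[df["cohort"] == "institution", ei_columns].mean()
peer_means = df.loc[df["cohort"] == "comparison", ei_columns].mean()

# One side-by-side table per department: department, institution, peers.
for dept, rows in df[df["cohort"] == "institution"].groupby("department"):
    report = pd.DataFrame({
        "department": rows[ei_columns].mean(),
        "institution": campus_means,
        "comparison_group": peer_means,
    })
    report.round(2).to_csv(f"report_{dept}.csv")
```

A loop like this scales easily to the roughly 22 departments Nazareth served, with additional breakouts (residency status, athletes, and so on) added as further groupings.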
Nazareth used results from the core survey and also from the Academic Advising Topical Module to explore the relationship between instructional and advising practice and patterns in student persistence. Students with high GPAs who left the college had lower scores on certain advising items. These results were shared with the academic advisement department and the academic departments. Nazareth adopted a personal approach to introducing the campus to NSSE and created many opportunities for discussion of results. Plans are to continue to have campus conversations to examine results.
The Ohio State University (OSU), a public research university in Columbus, Ohio, has participated in NSSE six times and has launched an initiative to enhance the campus’s use of NSSE data and to create systems that more effectively support assessment, reaccreditation, and institutional improvement across campus. The university is interested in merging their NSSE results with other data sets on campus. To do this, they initially organized a group to analyze trends in NSSE comparative data. Next, they plan to combine the NSSE results with other data sets and then create division- and department-level reports. While waiting for their custom reports, departments have been provided raw NSSE data for their own analysis. When asked what they were most eager to learn from the NSSE results, departments reported being particularly interested in the behaviors of graduating seniors.
In an effort to systematically share NSSE data across campus, the Office of Institutional Research and Planning (OIRP) designed an innovative approach to connect with the rest of campus. First, each person in IR made two lists: (a) an informal list of the campus partners they currently work with; and (b) a list of those they would like to partner with. Next, the IR department looked across all of the lists to see who could be NSSE users, who would be strong candidates to become users, and which parts of campus were not connected to NSSE data. To better connect NSSE results with departments that have their own analysis or assessment specialist, the OIRP invited department-level analysts to meet with them about how they can use NSSE within their departments and in their work (for example, internal surveys or department-level surveys). OSU has a large number of colleges and divisions on campus, so many—but not all—offices have their own assessment expert. Using the lists generated internally, the OIRP will make sure the offices that need support or training on NSSE data receive it. Over time, the OIRP will work with the parts of campus that are less connected to NSSE data to better incorporate them. Through this intentional partnership effort, OSU is working to make NSSE results salient to more campus units.
Pace University, a multi-campus research institution in the New York metropolitan area, administered NSSE every year from 2002 through 2012 and the updated survey in 2013. While initially saddened to bring closure to several multi-year studies, campus leaders realized that NSSE 2013 would open a new chapter of NSSE studies providing different perspectives on institutional questions. To celebrate all they had learned and the action they had taken on their institutional assessment results, Pace published a NSSE Retrospective recounting the many ways NSSE has made a difference for teaching and learning, and, especially, for students at Pace.
To investigate institutional concerns such as retention, Pace matches the most recent NSSE data to each fall semester’s roster of first-year students who stayed and those who left. Analysis of these results provides valuable clues to student behavior and suggests actions that faculty and student success professionals might take. A study of sophomore retention at Pace used the NSSE responses of second-semester first-year students who would soon be sophomores to provide insight into how to address “sophomore slump” and the resulting attrition. Results from the early years of NSSE administration at Pace highlighted the need to pay more attention to student-faculty interaction. To address this need, Pace’s Center for Teaching, Learning, and Technology, along with the University Assessment Committee, developed a series of faculty development workshops using NSSE results. These workshops included breakout sessions in which faculty discussed NSSE results and shared best practices. Results from subsequent NSSE administrations showed upward trends in the student-faculty interaction scores. With NSSE 2013, Pace opens a new chapter in its increasingly sophisticated efforts for improvement. The updated survey’s potential for deeper examination of student-faculty interaction through the Engagement Indicators, its expansion of the quality of relationship questions, and new quantitative reasoning items invite fresh insights and fuller understanding of important educational issues.
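The roster-matching step Pace describes amounts to a merge on student ID followed by a group comparison. A minimal sketch, with all file, column, and item names as hypothetical placeholders (the SF* names mimic NSSE’s student-faculty interaction items):

```python
import pandas as pd

# Hypothetical inputs: NSSE first-year respondents keyed by student ID,
# and the fall roster with a next-fall enrollment flag from the registrar.
nsse = pd.read_csv("nsse_firstyear.csv")    # includes 'student_id'
roster = pd.read_csv("fall_roster.csv")     # 'student_id', 'enrolled_next_fall'

matched = nsse.merge(roster, on="student_id", how="inner")

# Compare student-faculty interaction items for stayers vs. leavers.
sf_items = ["SFcareer", "SFotherwork", "SFdiscuss", "SFperform"]
print(matched.groupby("enrolled_next_fall")[sf_items].mean().round(2))
```

Lower means among non-returners on items like these are the kind of valuable clues that pointed Pace toward its student-faculty interaction workshops.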
Rhode Island College (RIC), the state’s first public institution of higher education, now serving approximately 9,000 students in courses and programs on and off campus, has participated in NSSE five times, including in 2013. When sharing their 2013 NSSE results with the RIC community, the assessment/institutional research team prepared customized presentations that highlighted RIC’s results in relation to those of carefully selected comparison institutions. In addition, identical NSSE items were compared directly, over time, between 2013 and previous years’ administrations. Presentations were made to RIC’s executive team, student affairs personnel, and faculty involved and interested in assessment.
To further encourage reflection on and improvements in student learning and engagement, RIC created a webpage providing a broader set of resources to faculty and staff. Through this public resource with NSSE results, the college sought to foster the use of assessment data across campus. The webpage features a comprehensive report highlighting NSSE data and longitudinal changes in RIC results alongside results from RIC’s three comparison groups, as well as a short report focusing on the data most relevant to faculty. By updating benchmarks for current campus initiatives with NSSE 2013 item-level results, this short report can facilitate faculty and staff discussions of how initiatives are affecting student engagement and learning outcomes.
Improving Academic Advising
In reviewing their Academic Advising Topical Module results, SUNY Oswego administrators identified an opportunity to improve their advising activities to better meet student needs. To gather more details on where improvement was specifically needed, the academic interventions group, a subcommittee of the campus retention committee, invited a sample of students from all class years as well as the entire faculty and staff populations to complete a survey on their advising experiences. Students noted that some advisors lacked the time or knowledge to suggest relevant experiences like internships, study abroad, and career opportunities. Faculty advisors indicated that more training and better access to resources were needed to support students experiencing academic difficulty. Further, the faculty also reported wanting a lighter advising load in order to provide more individualized advising.
The academic interventions group used the NSSE advising module results paired with the campus follow-up survey as leverage for the creation of two new student academic support specialist positions on campus. These two professionals meet with “at-risk” students to help ensure they are staying on track by providing them information about resources like tutoring, counseling, and study skills workshops.
Members of the academic interventions group also developed a set of five “advisement boot camp” sessions—two “basic training” sessions and three sessions on advanced topics. Members reached out to some of the campus’s “super advisors” for suggestions of topics to be covered in each session. The provost’s office incentivized attendance at three of the events with a dinner and tickets to a campus performance following the training sessions. To encourage attendance, the academic interventions group used a flier with a catchy logo, emails, campus announcements, and digital signage, with materials distributed to faculty, advisement coordinators, and department chairs.
The two basic boot camp sessions, offered at two different times to fit attendees’ schedules, introduced attendees to the campus resources available to help them and provided a list of hyperlinks they could bookmark in their browsers to get directly to the issue at hand. The sessions were informal, with some lecture-style presentations, but participants reported that the most useful part of the boot camp was the interaction among colleagues, who had good tips and excellent questions for each other.
Similar to the basic sessions, the advanced topics session was offered multiple times to provide scheduling flexibility. This session gave advisors a great deal of information on calculating GPAs, advice to share with students on “undoing a bad semester,” transfer student issues, graduation problems, campus resources for career services, international education, major exploration, and student support services. Facilitators ended these sessions with case studies that advisors worked on in small groups and on which they then presented their perspectives. The exchange among advisors sharing how they dealt with difficult situations and worked through the issues was one of the more valuable aspects of the sessions.
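One of the mechanical topics in such sessions, GPA calculation, reduces to a credit-weighted average of grade points. A small illustration on a standard 4.0 scale (campuses with plus/minus grading would extend the table):

```python
# Grade-point values on a plain 4.0 scale; extend for +/- grades as needed.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def gpa(courses):
    """courses: list of (letter_grade, credit_hours) tuples."""
    quality_points = sum(GRADE_POINTS[grade] * hours for grade, hours in courses)
    credits = sum(hours for _, hours in courses)
    return quality_points / credits if credits else 0.0

# "Undoing a bad semester": one F in a 13-credit term yields a 2.23,
# which helps an advisor show concretely what retaking a course changes.
print(round(gpa([("A", 3), ("B", 3), ("C", 4), ("F", 3)]), 2))  # 2.23
```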
The academic intervention group members undertook an assessment of these sessions, which showed advisors were ready to employ more effective practices. SUNY Oswego plans to continue the boot camp series and will repeat both the NSSE advising module and the internal survey as measures of growth and as ways to continue improving their advising efforts.
Truman State University, a public liberal arts and sciences university in Kirksville, Missouri, established a committee to evaluate frameworks and rubrics associated with the university’s commitment to enhancing the following characteristics in its graduates: (a) understanding and articulating well-reasoned arguments; (b) demonstrating courageous, visionary, and service-oriented leadership; and (c) living emotionally and physically healthy lives. The committee looked to Truman’s NSSE results on higher- and lower-order learning skills to learn more about their students’ experiences. NSSE results revealed, for example, that first-year students and seniors reported a much greater emphasis on the lower-order task of memorization than Truman faculty reported in the Faculty Survey of Student Engagement (FSSE), suggesting a significant difference between the perceptions of faculty and students. More broadly, NSSE data suggested that in areas related to higher-order learning, Truman students were performing near or slightly above the level of students at comparison institutions. Truman’s findings on higher-order learning influenced their Higher Learning Commission Pathways Project to assure quality and demonstrate continuous improvement. Moving forward, the university will craft rubrics for higher-order thinking to help students and faculty recognize connections across courses and among disciplines. The rubrics are intended to create an integrated understanding of the curriculum while helping faculty be more efficient and intentional in their teaching and letting students know more clearly what is expected of them.
Creative Survey Promotion
In the fall of 2013, the University of Massachusetts Dartmouth’s Office of Institutional Research and Assessment spearheaded a campus-wide campaign called “You Spoke. We Listened” in partnership with the offices of student affairs and academic affairs. The ongoing publicity campaign advertises feedback that students provide through discussion groups and surveys like NSSE and highlights programmatic and curricular changes that are implemented as a result. The Office of Institutional Research and Assessment garnered support for the campaign from the highest levels of the university by discussing the campaign at meetings with assistant and associate deans, the faculty senate, student government leaders, and others on campus. The campaign is delivered through a wide variety of formats (see Figure 11), including large format posters, flyers, campus TV advertisements, advertisements in the student newspaper, and table-tents. Additionally, a page on the campus intranet was developed that is devoted specifically to telling students about NSSE.
Through longitudinal analysis of NSSE data and other campus surveys, university administrators identified supportiveness of the campus environment as an area in need of improvement. Trend analysis of means for the Supportive Campus Environment Benchmark across the NSSE 2005, 2008, and 2011 administrations (a computation of the kind sketched below) indicated a consistent pattern of significantly lower mean scores for freshmen and seniors at UMass Dartmouth compared to the university’s peers. To further investigate these findings, focus groups were conducted with freshmen and seniors in March 2013 to gather in-depth, qualitative data about overall student satisfaction and, more specifically, student satisfaction with the supportiveness of the campus environment at UMass Dartmouth. Focus group findings that informed university changes were in the areas of academic support from advisors and administrative offices, the transition from high school to college, and seniors’ comments on academic facilities. The following initiatives were informed by NSSE data analysis and were publicized in the campaign:
a. creation of an Office of Undergraduate Research “to promote undergraduate research, support student researchers, and disseminate the products of student research”—a formal space devoted to better supporting undergraduate students in their research endeavors;
b. development of The Student Transition and Achievement Resource (STAR) Center in the College of Arts and Sciences: “Professional academic advisors, peer mentors, and faculty advisors from most Arts & Sciences majors and minors help students plan their academic careers thoroughly and thoughtfully”;
c. development of an engineering freshman experience course;
d. Making Achievement Possible (MAP-Works): “In MAP-Works, faculty and staff connect and communicate with students and each other in a first-year community dedicated to Making Achievement Possible in the academic arena”; and
e. implementation of college student success plans.
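The benchmark trend computation referenced above can be run with a few lines of scripting once the administrations are stacked into one file. The sketch is illustrative only; the file layout, column names, and use of a Welch t-test (a simple stand-in for NSSE’s reported significance tests) are all assumptions:

```python
import pandas as pd
from scipy import stats

# Hypothetical stacked file of the 2005, 2008, and 2011 administrations.
# Assumed columns: 'year', 'class' ('FY'/'SR'), 'school' ('own'/'peer'),
# and 'SCE' (Supportive Campus Environment benchmark score).
df = pd.read_csv("nsse_trend.csv")

for (year, cls), grp in df.groupby(["year", "class"]):
    own = grp.loc[grp["school"] == "own", "SCE"].dropna()
    peer = grp.loc[grp["school"] == "peer", "SCE"].dropna()
    t, p = stats.ttest_ind(own, peer, equal_var=False)  # Welch's t-test
    print(f"{year} {cls}: own={own.mean():.1f} peer={peer.mean():.1f} "
          f"diff={own.mean() - peer.mean():+.1f} (p={p:.3f})")
```

A consistent pattern of negative, significant differences across all six year-by-class cells is the signature UMass Dartmouth saw, and what justified the follow-up focus groups.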
Beyond the value of communicating important changes implemented as a result of student feedback, “You Spoke. We Listened.” was used strategically as a recruitment tool to prompt students to participate in the upcoming NSSE 2014 administration. The Office of Institutional Research and Assessment coordinated with the housing and residential education office to effectively promote NSSE 2014 (for example, by sliding handouts under students’ dorm-room doors) and, as a result, the institution observed an uptick in first-year responses.
Reporting and Tracking Performance
The University of Massachusetts Lowell has administered NSSE numerous times since first participating in 2000, and the university triangulates findings from NSSE with their institutional research and registrar data. UMass Lowell constructed a new comprehensive strategic plan titled “UMass Lowell 2020,” organized around five pillars: (1) transformational education, (2) global engagement and inclusive culture, (3) innovative research and entrepreneurship, (4) leverage our legacy and our place, and (5) entrepreneurial stewardship.
To monitor progress toward their goals, UMass Lowell created a Report Card (see Figure 8) to track and evaluate performance, guide decision-making on campus, and inform the Strategic Planning Commission. Mapped around the five pillars, quantifiable items are listed and tracked. UMass Lowell’s 2013 and 2014 NSSE results, specifically the overall student satisfaction item and High-Impact Practice results, serve as indicators for the transformational education pillar (see Figure 9). The institutional goal for 2020 is to increase overall student satisfaction and to have 70% of first-year students and 80% of seniors engaged in High-Impact Practices.
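Tracking progress toward a participation target like this is a simple aggregation once respondent-level HIP flags are in hand. A minimal sketch, with all column names hypothetical (NSSE reports HIP items such as service-learning, research with faculty, internships, study abroad, and culminating senior experiences):

```python
import pandas as pd

# Hypothetical respondent file: one 0/1 column per High-Impact Practice
# and a class-level column ('FY' or 'SR'). Names are placeholders.
df = pd.read_csv("nsse_hips.csv")
hip_cols = ["servcourse", "research", "intern", "abroad", "capstone"]

# Share of respondents reporting at least one HIP, by class level.
df["any_hip"] = df[hip_cols].max(axis=1) == 1
rates = df.groupby("class_level")["any_hip"].mean().mul(100).round(1)

goals = {"FY": 70, "SR": 80}  # UMass Lowell's stated 2020 targets
for level, rate in rates.items():
    print(f"{level}: {rate}% engaged (goal: {goals.get(level, '?')}%)")
```

The same aggregation, run against each new administration, would feed the Report Card’s transformational education indicators.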
The University of North Dakota (UND), a national public research university located in Grand Forks, ND, featured NSSE results in their 2013 Higher Learning Commission (HLC) self-study for reaccreditation. NSSE results were discussed in their presentation of evidence for several dimensions specified in HLC Criterion 3, Teaching and Learning: Quality, Resources, and Support. For example, HLC Criterion 3.B.3 states, “Every degree program offered by the institution engages students in collecting, analyzing, and communicating information; in mastering modes of inquiry or creative work; and in developing skills adaptable to changing environments.” UND discussed how they address learning goals at the program level and undergraduate core curriculum learning outcomes, and then incorporated results from the following NSSE items in response to the criterion:
a. During the current school year, about how often have you examined the strengths and weaknesses of your own views on a topic or issue?
b. During the current school year, about how often have you tried to better understand someone else’s views by imagining how an issue looks from his or her perspective?
c. To what extent has your experience at this institution contributed to your knowledge, skills, and personal development in learning effectively on your own?
d. To what extent has your experience at this institution contributed to your knowledge, skills, and personal development in solving complex real-world problems?
Senior student results on several NSSE items were discussed in response to HLC Criterion 3.B.4, which states, “The education offered by the institution recognizes the human and cultural diversity of the world in which students live and work.” Senior student scores on the educational gains item “understanding people of other racial and ethnic backgrounds” were lower than UND desired, and these results were candidly discussed in the institution’s reflection on its effectiveness on this criterion. The full narrative contextualized results in several ways. First, they stepped back and reflected on the students’ responses to the diversity-oriented questions and the relative lack of racial and ethnic diversity found within the student population. Second, they recognized that student responses to these NSSE questions were collected before the changes to their general education diversity requirements were fully implemented. The university expects improved results once required courses and curricular adaptations are fully in place.
In response to the HLC Criterion 3.C.5, “Instructors are accessible for student inquiry,” UND paired findings from other campus assessment instruments with NSSE results to demonstrate the accessibility of faculty. Overall, their NSSE results demonstrated that the percentage of first-year students interacting with faculty inside and outside of the classroom had increased over time, while the percentage of seniors reporting interaction with faculty remained consistent since 2005. Again, the university thoughtfully resolved that although their NSSE results related to student-faculty interaction were consistent with peer institutions and other assessment results, their efforts to improve the quality of student-faculty interaction would continue.
NSSE results also informed UND’s response to HLC Criterion 5.C.3: “The planning process encompasses the institution as a whole and considers the perspectives of internal and external constituent groups.” In preparation for their self-study, UND launched an Undergraduate Learning Working Group to reflect on and review local data and national best practices, and created seven recommendations for action. One of the recommendations was to create a first-year experience (FYE) pilot course. This course was launched in 2011–2012, and information obtained from the pilot was used to plan for a long-term FYE program at UND. The pilot course program was assessed using a variety of data: student inputs such as ACT scores; GPAs; retention outcomes; results from the College Student Inventory (CSI), the Student Information Questionnaire (SIQ), and the Beginning College Survey of Student Engagement (BCSSE); end-of-semester course evaluation results; information from a reflective assignment completed by all students and scored by a faculty team; and more. NSSE results were then used to assess the impact of the FYE pilot course. By comparing NSSE scores for first-year students in the pilot FYE course to NSSE scores of both first-year students who did not experience the pilot and several past years of first-year students, UND concluded that the FYE pilot course made a positive contribution to student engagement in the first year.
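The pilot-versus-comparison step at the end of that assessment is, at its core, a grouped comparison of engagement scores. A minimal sketch under assumed names (the cohort labels and EI columns are placeholders, and a real analysis would add the significance testing and covariates UND’s fuller design implies):

```python
import pandas as pd

# Hypothetical combined file of first-year NSSE records tagged by cohort:
# 'pilot' (FYE pilot), 'non_pilot' (same year, not in pilot), and 'prior'
# (earlier first-year cohorts). EI column names are placeholders.
df = pd.read_csv("firstyear_nsse_cohorts.csv")
eis = ["HO", "RI", "SF", "SE"]

print(df.groupby("cohort")[eis].agg(["mean", "count"]).round(2))

# The gap of interest: pilot minus same-year non-pilot students.
diff = df.loc[df["cohort"] == "pilot", eis].mean() \
     - df.loc[df["cohort"] == "non_pilot", eis].mean()
print(diff.round(2))
```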
In addition to employing NSSE results in the university’s successful reaffirmation and their quality improvement project, UND has also prepared brief reports to communicate key messages about student engagement for particular constituencies. For instance, UND presented results of questions related to the Effective Teaching Practices Engagement Indicator to demonstrate a need for clarity of communication between students and faculty, especially regarding feedback on assignments. As much as possible, UND drilled down into their data by creating dashboard reports for programs and comparing program-level data to the overall UND results. UND also mined students’ comments on NSSE to provide qualitative feedback to departments. By presenting NSSE data graphically and clearly, UND provided relevant results to academic programs with comparisons to its public peers and Carnegie classification group, uncovering opportunities for improving instruction. UND transformed their NSSE results into digestible, usable formats—exemplifying how an institution can selectively present NSSE results important to specific groups within their institution.
Custom Reports to Shape Expectations for Student Engagement
The University of Northern Iowa (UNI), in Cedar Falls, IA, has participated in NSSE annually since 2006. The campus has shared their NSSE results in a number of ways; here, we highlight the three most recent examples.
In 2010, UNI completed a Foundations of Excellence® self-study of their first-year students’ experience as part of their accreditation review with the Higher Learning Commission. UNI reviewed nine dimensions of the first-year experience, using NSSE results along with over 200 other sources of data when evaluating key performance indicators. Within the self-study they identified areas where they could improve and measures that would be used to track progress; these later informed the development of an action plan to improve first-year student learning.
One response to the self-study was to develop and measure first-year learning outcomes. For example, FY Outcome #1 states, “By the end of the first year, students will be able to articulate and examine their personal values.” NSSE items used to track progress toward this outcome include “During the current school year, how often have you examined the strengths and weaknesses of your own views on a topic or issue?” and “During the current school year, how often have you learned something that changed the way you understand an issue or concept?”
Another response to the self-study was to develop a first-year requirement called First-Year Cornerstone—a two-semester, six-credit course sequence that focuses on integrated communication, student success, and civility. First-Year Cornerstone credit counts toward general education requirements, and the sequence is both writing and speaking intensive. It is taught by faculty, with purposeful and systematic assistance from student affairs professionals, academic learning center staff, and library professionals. NSSE results are shared with new instructors training to teach Cornerstone to help them better understand who the first-year students are. During training, they are asked, “How many of our first-year students live on campus?” and “How many of our students work for pay off campus?” NSSE results for first-year students include residential status, employment status, and enrollment status. Using student responses to these and other NSSE items as part of the training helps to emphasize meaningful, frequent student-faculty interaction as key to an integrated learning environment in the course. Because FY outcomes are mapped to First-Year Cornerstone, UNI is able to use NSSE and MAP-Works data to compare students who take the course with those who do not. For students in the course, NSSE results are also used to track these NSSE Engagement Indicators: Discussions with Diverse Others and Student-Faculty Interaction.
A second area for use of NSSE results at UNI is to provide information related to the campus diversity goals. Results for the NSSE Engagement Indicator, Discussions with Diverse Others, are used to map progress on key performance indicators (KPIs) such as “Educate all students to ensure that they are prepared to live and work successfully in a diverse world.” To track progress on this KPI, campus data are paired with results for the following NSSE item:
During the current school year, how often have you had discussions with people from the following groups: people from a race or ethnicity other than your own, people with religious beliefs other than your own, and people with political views other than your own?
The KPIs are posted on an interactive page of the campus website where faculty and staff can view each indicator, select progress benchmarks, and review data and progress. The results are also shared at the annual town hall meeting on diversity and inclusion, displayed at a booth for others to see.
Finally, UNI is incorporating NSSE results into other initiatives across campus. UNI has used NSSE results as indicators to track progress on their strategic plan. Academic advisors use NSSE item-level results as part of the assessment plan for their office. The campus has used NSSE’s Report Builder to dig deeper into the data, looking at results for specific student populations such as first-generation compared to non-first-generation students. Information from NSSE and the writing Topical Module has been shared with the University Writing Committee. NSSE results have also been shared with the president’s cabinet and his Executive Management Team. UNI’s NSSE data use demonstrates the intentional integration of relevant NSSE results into the work of specific campus audiences and committees to inform initiatives, assess outcomes, and demonstrate educational strengths.
An infographic (see Figure 12) summarizing Beginning College Survey of Student Engagement (BCSSE) 2013 results at the University of Puget Sound in Tacoma, Washington, was distributed on postcards to new students and posted on electronic screens around campus to share the results and shape campus norms. This promotional campaign also generated interest in the spring 2014 NSSE administration, resulting in a higher response rate and allowing the university to study combined results from BCSSE and NSSE.
When faculty reviewed results from Puget Sound’s past NSSE administrations, they noted, among other findings, lower-than-expected levels in students’ responses to questions about experiential learning. Partly due to these findings, a task force was set up to review experiential learning at Puget Sound, with action in 2014–2015 to include more prominent web-based information about experiential learning opportunities.
Inspiring Greater Use of Assessment Results
The University of Saint Mary (USM), in Leavenworth, Kansas, participates in various external and internal surveys to gather direct and indirect evidence of educational effectiveness at many levels of the university. About three years ago, the campus revamped its assessment efforts in response to feedback from its accreditor, the Higher Learning Commission (HLC), which encouraged the campus to make its assessment efforts more data-driven and wanted to see more data-informed decision making.
In response to the feedback for expanded assessment efforts, USM faculty and staff examined the measures they had in place and solicited further feedback. The faculty unanimously echoed the desire to create a combination of internal and external assessment measures, so they aligned their University Learning Outcomes (ULOs) with three assessments. First, each semester faculty report student achievement of ULOs. Second, first-year students and seniors complete a direct assessment of achievement by participating in the CLA+ (CAE’s Collegiate Learning Assessment). Third, first-year students and seniors report on their behaviors and time on task through annual participation in NSSE. Combined, the campus is able to look across faculty reports of student learning, students’ performance, and students’ reports about their behavior and engagement. Additionally, for comparisons, the campus can look at national data, usually in percentile ranks.
Upon receiving their 2014 NSSE results, USM distributed copies of their NSSE Snapshot to senior administration, vice presidents, and the campuswide assessment committee. Additionally, a series of presentations focusing on specific NSSE items and groups of students was delivered to different interest groups across campus, including faculty and student life.
NSSE results and other assessment data are also regularly discussed at USM’s semi-annual faculty institute, an all-faculty meeting held at the start of each semester. During one such institute, faculty reflected on the results from the assessment metrics in place. Looking at USM’s results from the CLA+, faculty saw that students did not perform as well as they would have expected or wanted. To dig deeper, they looked at their NSSE 2013 results on students’ reports of time on task, specifically, how much time students spent studying and preparing for class and the number of papers they wrote over the course of the year. The faculty were very concerned by the NSSE results: students reported less time studying and preparing for class than students in USM’s NSSE comparison groups, and they reported having written an average of 30 pages over the course of the year, far below the faculty expectation for first-year students and seniors.
In response to these results, the faculty had conversations about how to increase the time students spend studying and preparing for class and the amount of writing they do. Several efforts were also implemented across the curriculum that focused on strengthening students’ critical thinking skills, including critiquing and making an argument. For their NSSE 2015 administration, after debating which Topical Module to select—either Learning with Technology or Experiences with Information Literacy—the campus decided to administer Learning with Technology for two reasons. First, the campus was in the process of applying for an external grant relevant to increasing technology in the classroom for students at risk of dropping out, thus creating the need for a baseline measure of student competency with technology. Second, the questions in the technology module captured more points of interest to current USM initiatives than did the other Topical Modules.
In addition to the NSSE core survey and the technology Topical Module, USM also participates in the Catholic Colleges Consortium, which administers its own customized question set, appended to the core survey. Student involvement in USM’s campus ministry program is high, and the consortium questions allow USM to see how they compare to other Catholic institutions. To promote campus conversations about outcomes from their first-year experience course and campus ministry program, USM has shared these results with faculty and student life.
Sharing Results and Using Data for Quality Enhancement
The University of Texas at Tyler (UT Tyler) has made use of its NSSE data in a number of ways. During the 2014 faculty and staff convocation, the president highlighted NSSE results to show that the student-faculty interaction scores among first-year students at UT Tyler were significantly higher than those for any other UT System university, Carnegie peers, and all NSSE schools—supporting the university’s commitment to maintaining a 16:1 student-to-faculty ratio and its emphasis on student-faculty interaction. The president’s fall newsletter, distributed on campus and to the community-at-large, featured information from UT Tyler’s Snapshot report, the NSSE Institutional Report’s easily digested summary of key findings. Notably, the results reflected improvement in senior student scores over time. The newsletter reminded faculty and staff that student engagement increases a student’s likelihood of success and congratulated those whose efforts contributed to the institution’s improved results.
NSSE’s ten Engagement Indicators were included in program-level conversations at UT Tyler about assessment for ongoing improvements based on student feedback. The university also launched an initiative to fully document the use of high-impact practices (HIPs) in undergraduate education. Using assessment rubrics drawn from NSSE reports and HIP criteria and curriculum-mapping templates, the institution has been documenting course-related HIPs in each academic program. NSSE results have also been used in the campus’s strategic planning to increase levels of student engagement overall.
NSSE data were used to develop UT Tyler’s Quality Enhancement Plan (QEP) for regional accreditation through the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC). During this process, NSSE results on diversity experiences captured the attention of faculty members and demonstrated a need for more cross-cultural exposure at UT Tyler. The data provided evidence for the rationale in the QEP proposal, and the institution made the case that significant work could be done to develop students’ understanding of others’ opinions and beliefs as well as to focus on global and cultural education. To address these needs, the university developed the Global Awareness Through Education (GATE) program. After this initial use of NSSE data for the QEP rationale, GATE has continued to rely on NSSE results for ongoing assessment of the QEP, including a range of items focusing on discussions with diverse others, understanding someone else’s views by imagining how an issue looks from his or her perspective, and students’ level of interest in study abroad. For the specified NSSE items, GATE staff track responses and set goals for encouraging these educational practices. The results allow GATE staff to gauge the program’s potential to significantly impact the UT Tyler student population going forward. Preliminary results show an increase for all students in Discussions with Diverse Others, with senior responses higher than the average of all three peer groups.
NSSE results have provided UT Tyler evidence of educational effectiveness as well as needed indicators to plan, implement, and track improvement efforts.
Assessing Competencies and Improving Educational Quality
Winthrop University, a comprehensive public university in Rock Hill, South Carolina, has participated in 12 NSSE administrations, including the new survey pilot and NSSE 2014. While engaged in many NSSE data use projects on campus, two examples demonstrate Winthrop’s thoughtful and extensive approach to making use of student engagement results.
Winthrop has been working to update its undergraduate core curriculum, beginning in 2009 with the design of university-level undergraduate competencies. This effort was informed by several sets of relevant information. Accreditation standards from the Southern Association of Colleges and Schools (SACSCOC), specifically Comprehensive Standards 3.3.1.1 on institutional effectiveness in educational programs and 3.5.1 on general education competencies, and the Association of American Colleges and Universities (AAC&U) Essential Learning Outcomes provided initial guidance. Winthrop’s NSSE results, in particular the educational gains items (ten items that invite students to report how much their experience at this institution contributed to knowledge, skills, and personal development in areas such as writing clearly and effectively, speaking clearly and effectively, and thinking critically and analytically), offered more straightforward expressions of undergraduate learning goals. In 2010, the faculty voted unanimously to adopt four undergraduate university-level competencies (ULCs): (1) Winthrop graduates think critically and solve problems, (2) Winthrop graduates are personally and socially responsible, (3) Winthrop graduates understand the interconnected nature of the world and the time in which they live, and (4) Winthrop graduates communicate effectively.
To gauge the influence of ULCs on the student experience, Winthrop identified relevant NSSE measures, including Engagement Indicator (EI) items, several High-Impact Practice (HIP) items, and two Topical Modules (Civic Engagement and Experiences with Diverse Perspectives) as metrics. For example, the NSSE 2014 EIs Higher-Order Learning and Reflective Learning are mapped to Winthrop’s ULC on thinking critically. Additionally, the EI Discussions with Diverse Others is a metric for the ULC on interconnectedness. NSSE results are featured on Winthrop’s website, with a specific page dedicated to showcasing how NSSE items map to the ULCs.
NSSE results also influenced the development of Winthrop’s Quality Enhancement Plan (QEP) Proposal for SACSCOC. Faculty and staff gathered for a campus-wide conversation to choose the project’s focus, and the idea of global competence emerged. Reviewing NSSE data to see how Winthrop students fared against comparison groups over time on diversity measures and study abroad, faculty and staff found that Winthrop students’ expectations for and rates of participation in study abroad were on par with their comparison groups. However, they saw an opportunity to increase both interest and participation. Informed by these results, Winthrop designed its QEP Proposal for a Global Learning Initiative. Two NSSE Topical Modules, Experiences with Diverse Perspectives and Global Perspectives–Cognitive and Social, have provided Winthrop additional information about their students’ exposure to and experiences with diversity and global perspectives, and these data serve as indirect measures to assess Winthrop’s QEP, the Global Learning Initiative. Winthrop also used results from selected NSSE 2014 core instrument items (in addition to items from the Global Perspectives module) as an institutional-level metric to assess its QEP.
Winthrop has also used its NSSE Major Field Report and produced customized reports from the NSSE Report Builder–Institutional Version to provide NSSE results to specific disciplines. Faculty were particularly interested in program-, discipline-, and field-level results as metrics for demonstrating senior achievement of Winthrop’s ULCs. These are important for cyclical academic program review self-studies and are a data source for SACSCOC Comprehensive Standards (CS) 3.3.1.1 and CS 3.5.1 (college-level general education competencies).
After examining NSSE’s webpage of institutional website examples, staff members at Winthrop’s Office of University Relations adapted Utah State University’s NSSE infographic concept to develop Winthrop’s own NSSE infographic. Winthrop’s videographer used the infographic to combine text, video, and pictures into a customized Storehouse Story (a technology for presenting visual images as an essay) using Winthrop students and featuring NSSE results. Winthrop’s university relations and admissions staff worked together to create and send an email blast using the Winthrop Storehouse Story and NSSE results.
Creating NSSE Champions
York University is a public research university with two campuses in Toronto, Canada. With 55,000 students, it is the third-largest university in Canada and one of the largest in North America. York has participated in seven NSSE administrations, including in 2014. York’s Office of Institutional Planning and Analysis (OIPA) led a carefully planned campaign to engage the larger campus community in a successful NSSE administration. In partnership with the Division of Students, an initial steering committee of four was formed to guide the NSSE administration and promotional efforts.
After brainstorming initial ideas, the steering committee sought one representative from every faculty (equivalent to academic department or program in U.S. colleges and universities), from every front-facing administrative department, and from the student union to serve as a “NSSE Champion” in a larger working group and to lead promotional efforts in their own units. Committee members were recruited via a call to action from the provost and through presentations to various groups on campus. The presentations aimed to raise awareness of the value of NSSE and the importance of improving response rates. They also expressed a commitment to share the results more widely than before. The working group met every two weeks to help develop and test ideas, share techniques, and to maintain promotional momentum.
After rethinking York’s previous practice of not offering incentives to recruit participants, the working group created two kinds of incentives. At the end of the campaign, five $500 tuition waivers and twenty $25 bookstore coupons or student account top-ups were awarded by lottery. During the campaign, every student who completed the survey was awarded, on their student card, an electronic “virtual coupon” redeemable for a coffee at the campus Starbucks franchise or for $5.00 off any purchase at York Lanes—the campus retail center. The coupons were donated by the retailers. York’s information technology office developed software to make the process from completing the survey to transmission of the coupon as seamless as possible.
York designed the campaign to be ubiquitous on campus, so that when the initial NSSE invitation email arrived, every student would know what it was for. To promote the survey, the working group used several strategies, including the following:
a. hired a student to make a teaser promotional video called “What Is NSSE?” that was shared on the York webpages, played on LCD screens around campus, and posted on social media;
b. designed an extensive set of webpages with detailed information about what NSSE is and why it matters, what the incentives are, how faculty and staff can get involved, and how to promote NSSE ethically;
c. used student-centric social media channels to generate awareness and discussion of NSSE and to encourage participation in the survey;
d. displayed professionally designed and branded promotional messages on computer monitors in labs and on screens in classrooms; and
e. created a digital Communications Toolkit with information and material to help promote NSSE in faculties (e.g., departments) and classrooms.
The toolkit aimed to support conversations among faculty, staff, and student groups and included downloadable promotional postcards and posters, as well as an online form for requesting print copies. Posters were displayed in high-traffic areas, and postcards were used as information pieces and conversation starters.
One innovative idea for the York campaign was to create an internal competition in which the academic program with the highest participation rate would receive the “NSSE Champion Cup.” During the administration of the survey, the standings were updated every Thursday and displayed on York’s NSSE webpage and on every LCD screen across campus at 2:00 p.m. (see Figure 7).
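Computing the weekly standings is a one-step aggregation over the invitation list. A minimal sketch, with file and field names as placeholders rather than York’s actual administration data:

```python
import pandas as pd

# Hypothetical weekly extract: one row per invited student, with the
# student's faculty (academic unit) and a 0/1 completion flag.
df = pd.read_csv("nsse_invites.csv")  # columns: 'faculty', 'completed'

standings = (
    df.groupby("faculty")["completed"]
      .mean()
      .mul(100)
      .round(1)
      .sort_values(ascending=False)
      .rename("response_rate_pct")
)
print(standings.to_string())  # the top unit leads for the Champion Cup
```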
Results were also disseminated to the community via social media. At the end of the NSSE administration, the cup was awarded to the Schulich School of Business, which won with a final overall response rate of 52%. York’s president awarded the trophy to the dean and his team of NSSE champions, who will keep the cup and bragging rights until the next administration of NSSE, in three years.
The working group continued to meet throughout the NSSE administration to share updates and best practices among the champions. One unit’s faculty found that the best promotional ambassadors were other students and advisors, so they encouraged students rather than administration to talk up NSSE. Another unit’s faculty found beginning-of-class announcements and distributing promotional postcards after class to be effective. Although central oversight of the campaign was critical, it was important for individual units to tailor their own campaigns to fit their culture.
Improving student participation in the NSSE 2014 administration was important to York because the results would be used to help set priorities in campus planning exercises. York replicated all of its institution-level NSSE reports as faculty-level reports with internal benchmarks. York also created item-level trend analyses using NSSE’s Frequencies and Statistical Comparisons report. Because the 2014 survey instrument differed from previous versions, items were grouped into “no change,” “minor change,” and “other,” according to NSSE’s guidelines. In total, about 70 reports were produced.
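Grouping items by degree of change before trending is essentially a lookup against a crosswalk. The sketch below is a placeholder illustration, not NSSE’s official item crosswalk, which institutions should consult directly:

```python
from collections import defaultdict

# Hypothetical crosswalk of item names to change categories; the entries
# are illustrative only. NSSE's published guidelines define the mapping.
ITEM_CHANGE = {
    "askquest": "no change",
    "attendart": "minor change",
    "servcourse": "no change",
    "oralexam": "other",
}

by_category = defaultdict(list)
for item, category in ITEM_CHANGE.items():
    by_category[category].append(item)

# 'no change' and 'minor change' items can be trended across years;
# 'other' items are reported for the updated survey only.
for category in ("no change", "minor change", "other"):
    print(category, "->", sorted(by_category[category]))
```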
After looking at their results, one unit observed that their senior students were under-using their academic advisors relative to benchmarks, so they began offering academic advising via Skype or smartphone to give busy students more flexibility. Another unit observed that their students had longer commute times than the rest of the institution. As a result, in-person workshops are being replaced with live and recorded webinars, and a student was hired to work between classes to encourage other students to complete their financial aid applications. Yet another unit is using their results to encourage departments to include more experiential learning and high-impact practices in their curricula.
When asked what advice they would give to other campuses looking to boost their response rates, York recommended involving as much of the community as possible in the campaign. For large universities in particular, it is important not only to guide the campaign centrally, but also to allow the different academic units and other groups to promote the survey according to their unique cultures. Make sure that students know NSSE is coming well before the invitation letter is sent. Tell everyone how important NSSE is for identifying areas of academic strength and challenge; then associate NSSE results with improvement initiatives once the results are known. Be creative; campaigns that catch the imagination and rally the community can become engagement tools in their own right.
Widely Sharing Results Through Customized Reports
Youngstown State University (YSU), in Youngstown, Ohio, has participated in five NSSE administrations. To introduce NSSE measures and overall results and to prepare campus administrators for more detailed program-level reports, the YSU assessment office edited YSU’s NSSE Snapshot template of results and shared it via a webpage, with a cover sheet introducing the survey, highlighting results, and enticingly describing upcoming opportunities to learn more about results and “dive into the data.” For campus members less familiar with statistical information, the introductory sheet also defined common statistical vocabulary.
Upon receiving their NSSE 2013 results, the assessment office created a dissemination timeline outlining how as well as when NSSE results would be shared and communicated across campus. This timeline included such activities as producing and presenting NSSE overview workshops, providing “Lunch and Learn” workshops, and sharing how to access NSSE data. The dissemination timeline was distributed to campus administrators for accountability and support.
Building from resources on NSSE’s website and NSSE’s standard reports, the first phase of sharing NSSE results was an overview presentation. This presentation covered the history of NSSE, NSSE’s Engagement Indicators, the survey instrument, the alignment of NSSE with the YSU mission, the process of selecting comparison groups for NSSE results, and some preliminary results. Some of the charts and graphs from NSSE reports were placed directly into the PowerPoint presentation. This overview appealed to individuals who were less familiar with NSSE and statistical data, and a similar overview was presented to department chairs.

At Lunch and Learn sessions, which began at noon, individuals were invited to bring their lunch and dive into Youngstown State’s data. The Lunch and Learns covered six topics: four were designed in alignment with the four NSSE themes (Academic Challenge, Learning with Peers, Experiences with Faculty, and Campus Environments), and two additional sessions looked within other NSSE question sets on High-Impact Practices, Advising, and Transferable Skills. The goal of these sessions was both to share data and to gather feedback on how to improve practice and promote a high-quality learning environment. During the topical Lunch and Learns, the presenter first explained how to read NSSE results, then distributed copies of NSSE results and gave attendees 5 to 10 minutes to look at the data. The group would then discuss what jumped out as interesting, which items merited concern, and which items stood out as exceptional. For items with results lower than desired or lower than peers, the group would discuss why they thought the responses were lower and what could be done about it.
In an effort to disseminate NSSE data and results to as much of the campus as possible, including faculty, department chairs, and student affairs professionals, Office of Assessment staff conducted several training sessions on accessing NSSE data via the online Report Builder tool. These sessions, held in a campus computer resource room, included an overview and walk-through of the tool, tutorial resources, and time for attendees to generate their own reports. Attendees left the workshop with an annotated guide on how to use the Report Builder, a link to access the tool later, and a sheet with comparison results for the Engagement Indicators. The faculty, staff, and administrators who attended these interactive sessions left eager to work with the Report Builder.
Altogether, over 15 Lunch and Learns and Report Builder sessions were held. Afterwards, attendees were asked to share how they had used the NSSE data or their own analyses of NSSE results in their work. For example, the Reading and Study Skills Center, an academic as well as a service-oriented division of the Department of Counseling, Special Education, and School Psychology in the College of Education, employed results from four items in the Supportive Environment Engagement Indicator, alongside support service use statistics and retention rates of students who completed the reading and study skills course, to highlight three views of YSU’s supportive campus environment. The Center for Student Progress, which houses peer tutoring and peer mentoring on campus, reflected on NSSE results related to students’ perceived educational gains. As one of the larger student employers on campus, the Center for Student Progress wanted to better understand what its student employees were gaining from their experience. This reflection prompted the center to add focused student assessment to the role of mentors and tutors who work there.
A common observation in discussions of YSU’s NSSE results was that YSU’s student demographics differ from the traditional student profile. YSU’s results indicated that their first-year students spent a lot of time working off campus, that many of their full-time students worked 30+ hours, and that many students also cared for dependents. All of these points became moments for campus discussion around what student engagement looks like at a commuter campus. Follow-up discussions included reviews of pertinent literature about commuter students, including The Disengaged Commuter Student (Kuh, Gonyea, & Palmer, 2001), and considered student support and activities for commuter students.
Overall, YSU designed its NSSE dissemination plan with the intent to share data widely and reduce gatekeeping of the data. YSU’s account demonstrates that pairing broad dissemination of NSSE results with some training and guidance can help facilitate data use and action on results.
Lessons from the Field—Volume 2: Moving from Data to Action
Allegheny College
Assessment for Retention and Persistence
In 2003, a team comprising the dean of the college, dean of students, associate dean for faculty development, representatives from admissions and financial aid, and the director of institutional research reviewed retention rates at Allegheny College and found them lower than desired. In addition, NSSE results revealed low first-year scores on the Supportive Campus Environment (SCE) benchmark and, in particular, demonstrated that first-year students did not feel well supported at the institution. These results were identified as a possible contributing factor to the low retention rates.
Concerns about retention and interest in providing students the support they need for academic success motivated the creation of the Learning Commons. The campus library was transformed into a learning center to house numerous support services such as tutoring by peer consultants in writing, public speaking, effective use of technology, and study in a variety of academic subjects. The Commons’ professional staff also arranges accommodations for students with disabilities; consults with students on practical study skills such as time management, effective reading, and test taking; coordinates new student orientation; and supports the academic advising program. Since dedicating attention to creating a more supportive learning environment, Allegheny has seen gains on several items in the SCE benchmark for first-year students.
Earlier NSSE results regarding students’ limited experiences with diversity were shared with deans and faculty and informed the goal-setting process for Allegheny’s former strategic plan. The need to increase the diversity of Allegheny’s faculty and students emerged as an important part of strategic planning in 2009; NSSE results directly influenced this inclusion in the plan and were used to drive decision making. The strategic focus on diversity helped Allegheny increase the diversity of faculty, staff, and students, and advance diverse experiences in the curriculum. A continued focus on diversity is evident in the “Global & Local Diversity” initiative, one of the four goals of Allegheny’s new strategic plan, Combinations 2020. Other initiatives to increase diversity on campus provide international students with more opportunities to join the Allegheny community; require students to participate in study “away,” either traditional study abroad or domestic internships; and offer an increased number of scholarship awards to students with limited financial resources to expand the socioeconomic background of Allegheny’s student population.
The college’s Assessment Committee has been developing an action plan to standardize and regularize academic assessment. The plan incorporates three elements: student survey self-reports of learning, faculty assessment of student achievement on the senior capstone project, and alumni outcomes. The student survey self-reports draw on three instruments: NSSE, the Higher Education Research Institute’s Your First College Year survey, and the College Senior Survey. Survey results are to be reported in six areas: communication skills; critical thinking skills; integrative thinking skills; academic engagement/challenge; experiences with diversity; and overall satisfaction with the educational experience.
Findings will be published on the Allegheny website for current students, prospective students, parents, and faculty. The new reporting plan will standardize the process of using survey data and allow academic and administrative departments to make better use of NSSE results.
Improving Writing Across Disciplines
Auburn University has participated in eight NSSE administrations since 2002. While it reviews NSSE results at the institution level to provide a general view of the student experience, Auburn also drills down to specific department data. When comparing its students’ scores to those of students at peer institutions, Auburn identified areas of concern with student writing skills. Coupled with similar results from the Collegiate Learning Assessment (CLA), these findings led the institution to target writing for improvement and launch an initiative that established an Office of University Writing and a formal University Writing Committee. The new committee and newly hired Director of University Writing outlined specific practices to help departments improve the writing skills of their students. These included common program-level practices, such as identifying competencies expected of graduates in the department, and common course-level practices, such as providing students with the opportunity to revise their writing after receiving feedback from peers and the instructor.
To further assist departments, the committee and the director facilitated workshops and discussions with faculty on how to better incorporate writing into the curriculum. The workshops covered various topics, including strategies for providing effective feedback and developing an assessment plan. Faculty who participated in the 2010–11 workshops explained how they had revised course assignments to include writing, revision opportunities, and rubrics to evaluate writing in disciplinary courses. Faculty members agreed that including writing in their courses reinforced the learning experience they wanted for their students. “Writing promotes ‘deep learning’—the kind of learning that demands both remembering and understanding of relationships, causes, effects, and implications for new or different situations,” said a graduate student in the Department of Kinesiology. A professor of electrical and computer engineering agreed: “I wouldn’t have thought to do some of these things if I hadn’t attended the symposium.” He developed a writing assignment that asked students to create a written tutorial on information they got wrong on an exam. His poster included data from an end-of-term survey in which students strongly agreed that the writing assignment had helped them learn the material and improved their writing skills.
Additionally, Auburn created a writing-in-the-majors policy, which requires each department to develop its own plan to meet certain standards of writing in the curriculum. Although plans vary based on the department, all plans are required to: (1) provide more than one opportunity for students to practice writing; (2) provide opportunities for students to produce more than one kind of writing; (3) provide opportunities for students to write for different purposes and audiences; (4) provide opportunities for students to revise their written work based on feedback from peers and instructors; and (5) include an assessment plan that uses gathered assessment data to improve writing experiences. One program that significantly revised its writing plan was civil engineering. Although the program has always emphasized writing, the new writing initiative provided an opportunity to further departmental efforts to become more intentional in developing the writing skills of students. In its plan, the department details seven different kinds of writing, five different purposes of writing, and four forms of feedback it includes in its courses. Every required course, specialty elective, technical elective, and senior design project is reviewed to detail what kind of writing occurs in each course, the purpose of the writing, whether the writing is assessed, and what type of feedback is provided to students. Civil engineering’s plan and all other approved plans are posted on the Office of University Writing Web site to assist other departments as they work on developing and revising their plans (see Appendix A, Auburn University).
Auburn University monitors progress on the student writing plans through its participation in NSSE and the NSSE Consortium for the Study of Writing in College. By reviewing results on the consortium items and surveying faculty to gain a better understanding of how faculty approach writing in the classroom, Auburn continues to assess and foster improvement in the writing skills of its students. In addition, the University Writing Committee is charged with regularly reviewing the plans developed by programs, and the Office of University Writing supports faculty as they make decisions about how to continue to improve student writing and the writing instruction provided in the majors. The Office of University Writing has also launched a longitudinal study of faculty conceptions of writing and their practices in teaching writing in upper-level courses. The study includes analysis of teaching documents, interviews with faculty, classroom observations of writing instruction, and focus groups with students in those classes.
Focusing on Engagement at the Department Level
Brigham Young University (BYU) participates in NSSE annually to gain a better understanding of student engagement across various departments and the extent to which BYU’s educational goals are being realized. Survey items align closely with the Aims of a BYU Education: (1) spiritual strengthening, (2) intellectually enlarging, and (3) character building, leading to (4) lifelong learning and service. When an academic department comes up for review, the Office of Institutional Assessment and Analysis prepares custom reports focused on engagement at the academic unit/degree level, when sample size permits, along with comparisons to the scores of other students at BYU and at peer institutions. This allows each department to assess its progress on associated learning outcomes in relation to student engagement.
Many departments share their custom reports during retreats where they discuss what the results reveal about their students, curriculum, and associated learning goals. For example, upon reflecting on the data, one academic unit felt its students’ use of technology was lower than desired. To address this finding, the department placed greater emphasis on integrating technology into the courses it offered and the area degree requirements. Many units have made good use of NSSE data specific to critical thinking, writing, communication skills (written and oral), technology use, and satisfaction. Additionally, items specific to student interactions with faculty (specifically, working with a faculty member doing research) have been examined.
Annual participation in NSSE has allowed BYU to effectively identify emerging trends in the data over time. Additionally, multi-year participation makes possible the mapping of NSSE data to the university’s annual senior survey and alumni questionnaire on many items in selected content areas. Having a repository of multi-year data provides a rich resource for some academic units at BYU that use the NSSE accreditation toolkits to align their NSSE results with accreditation standards and for future campus planning and initiatives.
Developing a Culture of Evidence
California Lutheran University (CLU) participates in numerous external and internal surveys to gather direct and indirect evidence of educational effectiveness at many levels of the university. CLU’s Assessment Committee, comprising senior administrators, faculty, and professional staff, reviews, analyzes, and integrates survey results into reports that inform decision-making. Internal assessment survey results are also actively used for program review. Department chairs and faculty complete review templates and attach survey results and demographics as appendices (see Figure 1).
First-year programs are assessed using BCSSE and NSSE results as part of the Foundations of Excellence® process. BCSSE results and BCSSE-NSSE combined results are used by the Assessment Committee to evaluate the first-year experience and are presented at faculty meetings.
The Office of Student Life is also involved in assessment activities. Given that about 40% of students at CLU are commuter students and 33% are transfer students, with the majority coming from two-year institutions in Ventura County (CA), the Office of Student Life was curious about the level of engagement of commuter and transfer students compared to residential students and those who started at CLU. They reviewed NSSE results and saw a gap in the co-curricular engagement of transfer and commuter students. This finding generated an increase in programs focused on the needs of commuter students and the creation of a peer-mentoring program for transfer students.
NSSE results are widely shared at CLU. The provost and Office of Educational Effectiveness, along with the vice president of student affairs, disseminate NSSE results to CLU’s campus constituents. The provost also presents results to the California Board of Regents. The Office of Educational Effectiveness makes assessment information available on the institution’s Web site. During an annual summer retreat on student leadership, the Office of Student Life brings in institutional research staff, retention staff, and others to share data and help participants work this information into programming.
Dalhousie University’s 2008 NSSE results indicated a need to help first-year students become more engaged academically and form stronger connections to the Dalhousie community. A new position was established in the Centre for Learning & Teaching, through the Office of the Vice-President Academic and Provost, specifically to nurture and develop high-impact student engagement initiatives.
Dalhousie values its overall NSSE results, but breaking results down by program and department helped each faculty review strengths and areas needing improvement. For example, in computer science, NSSE results revealed a need for more active and collaborative learning, so more hands-on, project-driven first-year classes were implemented to help students link theory with everyday applications. Student response to these classes was so enthusiastic that additional sections had to be added. The department subsequently saw improvement in second-year retention rates.
The First-Year Experience
Franklin Pierce University (Pierce) has conducted four NSSE administrations and, more recently, administered FSSE to assess quality in undergraduate education. Pierce began with an emphasis on assessing the impact of the required first-year seminar, Individual and Community. The institution revised the seminar in 2008 to provide incoming students with more choices, build greater faculty enthusiasm for the course, and increase curricular commonality via common summer readings, advising, and community service projects. Two of the major common learning goals for the seminar are the development of collaborative learning skills and active involvement in the community. The seminar requires a set number of hours of civic and community engagement activities, determined by each professor, introducing students to the university mission of preparing active, engaged citizens and leaders of conscience.
NSSE results showed that first-year and senior involvement in community service and volunteer work at Pierce far exceeded participation at comparison institutions, confirming the learning goal of active involvement in the community and supporting efforts to strengthen students’ responsibility toward and contribution to the community. Student feedback suggested that entering students who had participated in community service in high school did not necessarily expect to continue their efforts in college due to academic demands. However, the first-year seminar requirement created time for community service and positively influenced their continued involvement in service throughout their years at Pierce. Additional efforts to combine NSSE results with a full inventory of student involvement in other high-impact educational practices, including active and collaborative learning, common reading, undergraduate research, and capstone experiences, are part of the university’s program review process.
Applying NSSE Results in Assessment, Accountability, and Accreditation
Georgia State University (GSU) first participated in NSSE seeking an assessment instrument that would go beyond student satisfaction and help measure student engagement in curricular and co-curricular activities. GSU has administered NSSE six times to date and triangulates findings from NSSE with other assessment instruments including BCSSE, FSSE, and the institution’s Survey of Recent Graduates. As a member of the Voluntary System of Accountability (VSA), GSU uses NSSE data for its College Portrait. NSSE results are also used to inform GSU’s internal assessment of critical thinking and writing.
These assessment efforts provide GSU faculty, staff, and administration with a much broader understanding of student engagement—one that includes the perspectives of incoming students, first-year students, seniors who are graduating, and faculty. NSSE results are shared with and used by a variety of stakeholders. For example, the Office of Undergraduate Studies explores retention by comparing NSSE responses of those students who left the institution with those who are still enrolled. This comparison is part of an important initiative at GSU to develop a retention model based on both direct and indirect data.
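GSU’s actual retention model is not described in detail here, but the basic comparison, NSSE responses of students who left versus those still enrolled, can be sketched as a simple two-group test. The scores below are simulated for illustration; a real analysis would merge NSSE responses with enrollment records by student ID.

```python
# Illustrative sketch only: comparing an engagement score for students who
# left versus students still enrolled, in the spirit of GSU's retention
# exploration. All values are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
enrolled = rng.normal(loc=40.0, scale=10.0, size=300)  # simulated engagement scores
left = rng.normal(loc=36.0, scale=10.0, size=80)

# Welch's t-test (unequal variances) for the group difference
t, p = stats.ttest_ind(enrolled, left, equal_var=False)
print(f"enrolled mean={enrolled.mean():.1f}, left mean={left.mean():.1f}, "
      f"t={t:.2f}, p={p:.4f}")
```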
GSU is also crafting a new comprehensive strategic plan focused on the advancement of undergraduate student success and seeks to become a national model for undergraduate education. NSSE data have informed the way the university has positioned itself as an institution whose students value diversity, academic achievement, and community and global engagement.
NSSE results were used in the preparation of GSU’s Quality Enhancement Plan (QEP) for reaccreditation by the Southern Association of Colleges and Schools (SACS) in 2008. The focus of the QEP was to increase undergraduate students’ critical thinking and writing skills in their major field of study. Upon review by the QEP Leadership Committee, NSSE data revealed that, compared to their Carnegie peers, GSU seniors wrote fewer short papers and felt their undergraduate experience did not contribute much to their critical thinking abilities. The committee found similar results from an internal survey administered each semester to recent graduates that measures learning outcomes and academic program satisfaction. These findings informed the final QEP, Critical Thinking Through Writing, which proposed targeted efforts to improve students’ critical thinking and writing skills in their major field of study.
Encouraging Student-Faculty Interaction
The Center for Excellence in Teaching and Learning at Grand View University (Grand View) was launched in 2005 with Title III grant funds. The Title III activity director/learning specialist was charged with directing programs to improve the retention and achievement of Grand View students. One of the assessment tools funded in the Title III grant was NSSE. The Title III grant allowed Grand View to increase awareness of the uses of the data for assessment as well as to promote NSSE results to senior administration for use in strategic planning and benchmarking. Grand View also administers the Noel-Levitz College Student Inventory (CSI), and results from this survey are well embedded in its assessment protocols.
NSSE results have been great conversation starters across campus constituencies, resulting in the formation of a team to move beyond simply reviewing the assessment data. Using findings from focus groups with students, the team discovered that first-year students felt Grand View provided a very supportive campus environment, whereas seniors felt that the institution was not providing enough help for them to succeed academically. Another disappointing NSSE result was the lower-than-desired percentage of seniors who indicated they would choose Grand View again if they could start over.
A major initiative motivated by Grand View’s NSSE results is the Faculty-Student Engagement program that encourages faculty and staff members to engage with their students in educationally purposeful activities outside of the classroom such as field trips, cultural activities, academic support sessions, and attendance at conferences. Mini-grants are available to faculty for special programs that are based on specific learning outcomes. For example, an English professor hosted a dinner in her home for students in her cross-cultural communications class, featuring foods from a variety of ethnic traditions, while education students were funded to attend the Iowa Teachers Conference.
Looking ahead, Grand View is determining how to continue to use NSSE results in its assessment plan and has decided on a three-year participation cycle. Grand View is implementing a new core in 2012 and plans to analyze NSSE results over time to assess the impact of the curricular changes on students. The institution is also reviewing the best ways to incorporate NSSE results into its 2014 reaccreditation self-study for the Higher Learning Commission (HLC).
Increasing Faculty Use of NSSE Data
Juniata College can be described as a “data rich” institution. Senior administrators are firm believers in gathering as much data as possible to inform their planning efforts. NSSE results feed into Juniata’s planning efforts and were used in the reaccreditation process, beginning with Juniata’s 2001 self-study for the Middle States Commission on Higher Education (MSCHE), and will be used for its upcoming review in 2012–13. NSSE benchmarks and high-impact practices are integrated into the strategic plan, and results on survey items such as study abroad, internships, and critical and analytical skills will be monitored in long-range planning.
Faculty members at Juniata have shown increasing interest in NSSE results, and the International Learning Assessment Committee has been charged with reviewing the impact of study abroad. Because a large student cohort participated in study abroad in 2010, the committee plans to examine NSSE results for correlations between study abroad and levels of engagement.
Faculty members have also used NSSE items related to attendance at cultural events—some are mandatory for Juniata students—to study their impact on student engagement. A number of faculty members have expressed interest in pursuing research on NSSE to find new ways to use the data. The faculty Academic Planning and Assessment Committee (APAC) works with the director of institutional research to interpret and disseminate NSSE results to the faculty at large. One expected use of NSSE results is in the periodic review of academic departments.
Results from NSSE and other national learning assessments were also used to evaluate the writing program at Juniata. When compared with their peers, Juniata students were not as effective as desired in their critical thinking and analytical writing skills. In addition, faculty members expressed a lack of confidence in the efficacy of the first-year writing program and concern about student writing competencies across the curriculum. NSSE results revealing that Juniata students wrote fewer long papers and more short papers than their counterparts at peer institutions informed a large part of the revision of the program.
Kalamazoo College’s NSSE results reveal consistently high results on items that reflect the hallmarks of the institution’s academic and experiential programs. However, when a downward trend was noticed on a particular NSSE benchmark, the institution planned specific action and sought more information through campuswide discussions. For example, in response to Supportive Campus Environment (SCE) results that were lower than desired, Kalamazoo conducted in-depth student focus groups to better understand student perceptions of aspects of the supportive campus environment. Findings from both NSSE and the focus groups informed several policy changes and influenced how student space is designed on campus, including a major renovation of the student center. One of the most effective uses of NSSE data has been to shine a light on the experiences of students.
Norfolk State University (NSU) has participated in several administrations of NSSE, BCSSE, and FSSE. Results from all three surveys were used in their Walmart Minority Student Success Grant. Specifically, NSU featured BCSSE, NSSE, and FSSE results to demonstrate the gap between student expectations, student experiences, and faculty perceptions (see Appendix B, Norfolk State University). They paid special attention to in-class engagement and followed up on the topics with the largest gaps, including class presentations and group work, by conducting interviews with faculty and students. Results from these efforts helped the institution realize that attention from faculty was needed to improve the student experience. The grant focused on a faculty-led mentoring program for first-generation students who participate in Summer Bridge. Mentoring clusters of five to seven students, one faculty member, and peer leaders were established to promote collaboration and student success. NSSE has helped to encourage faculty interest in student learning processes, in effective ways to contribute to student learning, and in how faculty can further measure student engagement in the classroom.
Southern Connecticut State University (SCSU) has participated in BCSSE and NSSE since 2004 and is following cohorts of students who completed both surveys to learn more about their college experiences and persistence toward a degree. SCSU also utilizes the National Student Clearinghouse to track students in the cohort who have left the institution. Their analyses indicate that non-returning students reported different levels of relationship with faculty members, peers, and administrative personnel and offices than did returning students. At SCSU, one of the two most important predictors of whether students in the cohort persisted to their junior year was the Supportive Campus Environment (SCE) benchmark. The importance of this factor in student persistence is emphasized with faculty and staff who work with students in the first-year experience.
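SCSU’s persistence analysis is summarized only at a high level, so the sketch below simply illustrates one plausible form it could take: a logistic regression of persistence to the junior year on the SCE benchmark. The data, coefficients, and single-predictor setup are all assumptions made for the example, not SCSU’s actual model.

```python
# Illustrative sketch only: modeling persistence to the junior year from the
# Supportive Campus Environment (SCE) benchmark, in the spirit of SCSU's
# cohort analyses. All data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
sce = rng.normal(50, 10, n)                # simulated SCE benchmark scores
logit_p = -4.0 + 0.08 * sce                # assumed positive SCE effect
persisted = rng.random(n) < 1 / (1 + np.exp(-logit_p))

# Fit the logistic regression: persistence ~ intercept + SCE
X = sm.add_constant(sce)
model = sm.Logit(persisted.astype(int), X).fit(disp=False)
print(model.params)  # positive SCE coefficient -> higher SCE, higher odds of persisting
```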
Response Rate Award
In 2010, NSSE wanted to learn more about and document successful efforts by institutions to encourage or increase student participation in the survey. We identified eight institutions with high response rates based on categories of size and control. In addition, we identified Spelman College as having the largest improvement in response rate between a recent administration and 2010.
Spelman College, a private, liberal arts, historically Black college for women, has participated in four NSSE administrations. After experiencing an unexpected decline in its 2007 response rate, Spelman launched a plan to increase this rate by 50% in its 2010 administration. The Office of Institutional Research, Assessment, and Planning implemented a multi-faceted approach to engage the entire campus community, which included the following strategies:
Coordinated joint efforts with the Dean’s Office of Undergraduate Studies to provide incentives for participation
Disseminated campuswide emails on the importance of NSSE participation and weekly updates on response rates
Solicited involvement from the entire campus, particularly atypical areas, such as Alumnae Affairs, Career Placement, and Web Design
Provided visual reminders for students by placing flyers in high-traffic areas, including residence halls and dining areas
Enlisted the support of faculty members
Spelman’s improved response rate is a result of the coordinated efforts of the Office of Undergraduate Studies, including the First-Year Experience (FYE) instructors and senior advisors. Instructors encouraged students to voluntarily participate and emphasized NSSE’s importance to the college’s assessment activities. In addition, several departments promoted NSSE among their senior majors. For instance, sociology, anthropology, biology, dual-degree engineering, and educational studies highlighted the value of student input on the quality of their experience in their classes. These initiatives yielded greater participation and led to a higher response rate.
Increased student participation in the NSSE 2010 administration was important to Spelman because it was completing a 10-year span of assessment that included four NSSE administrations, which allowed Spelman to use multi-year results to: (1) support the college’s reaffirmation of accreditation; (2) strengthen the Sophomore Experience by identifying gaps in FYE; and (3) assess trends in student engagement to improve services and programs. By challenging the entire campus community to improve student participation in NSSE, Spelman significantly improved its response rate from 28% in 2007 to 70% in 2010, far surpassing its original goal of a 50% increase.
In fall 2009, a task force composed of faculty, administrative staff, and one student was charged with establishing a plan to highlight the “distinctiveness” of the State University of New York Oneonta (SUNY Oneonta) from other comparable institutions. To derive “important attributes” and “distinguishing strengths,” the task force reviewed numerous resources and internal and external survey results, including the Student Opinion Survey (SOS), NSSE, Collegiate Learning Assessment (CLA), strategic planning documents, and enrollment data. Additional information was collected through an email survey of academic department heads and an open forum held for the campus community. Four themes of “distinctiveness” emerged: reputation, engagement, service, and environment. Scores from the 2009 SOS, admissions data, a rigorous assessment program, and participation as an early adopter in the VSA program were used as evidence of SUNY Oneonta’s reputation for excellence in teaching and learning.
NSSE benchmark scores from 2008 provided evidence that SUNY Oneonta fostered high levels of student engagement inside and outside of the classroom. In addition, NSSE results for seniors on survey items related to technology demonstrated that students were using computer and information technologies more frequently than their SUNY system counterparts.
The State University of New York Potsdam (SUNY Potsdam) used its results from nine NSSE administrations to support its 2010 Self-Study for reaffirmation from the Middle States Commission on Higher Education (MSCHE). Specific NSSE items were aligned with MSCHE standards to report levels of student participation in undergraduate research and service-learning, as well as to measure the degree of student interaction with faculty, administrators, and student affairs personnel. NSSE results were also used to review general education and academic advising at the institution.
SUNY Potsdam has made great efforts to encourage data use at the department level. NSSE results are featured on the institution’s Web site and use of NSSE data has been promoted across campus. Department chairs disseminate disaggregated results in breakout reports and facilitate getting the data into the hands of faculty to help improve pedagogical practice.
The First-Year Experience
Tarleton State University (Tarleton) has administered NSSE on a biennial basis since 2001 as a member of the Texas A&M University system. An ad hoc group of campus leaders holds ongoing discussions to review Tarleton’s NSSE results and compare its scores with other Texas A&M University institutions, institutions within its Carnegie classification, and the annual NSSE cohort.
In 2010, Tarleton administered BCSSE during new student orientation sessions, then opted for a local NSSE administration in the spring 2011 semester. Combined results from the surveys are being used to continue assessing the effectiveness of Duck Camp, a three-day, off-campus orientation program for first-year students designed to assist in the transition from high school to college and promote engagement. The initiative was created in 1995 to help first-year students develop friendships with their peers prior to the start of the academic year and learn about the opportunities and activities available at Tarleton. In 2010, approximately half of the incoming first-year class participated in the camp. A committee of student affairs, academic affairs, and enrollment management administrative staff has been examining BCSSE and NSSE data, along with other information about first-year student retention and satisfaction, to better understand the effects of Duck Camp and other orientation experiences on first-year student engagement. This effort to bring stakeholders from across campus together to review assessment data has served as a model for increasing collaboration across the institution. Tarleton staff also hope that disseminating information about the effectiveness of Duck Camp will promote more partnerships among campus departments and groups.
Teaching and Learning for Educational Effectiveness
Tulane University used NSSE results related to students’ expectations for and involvement in service-learning, undergraduate research, and internships, plus other indicators of students’ interest in public service and research, to establish the warrant for the Center for Engaged Learning and Teaching (CELT). Developed as part of its Quality Enhancement Plan (QEP) for the Southern Association of Colleges and Schools (SACS) reaffirmation, the CELT will be the hub for fostering engagement in four core areas: (1) research engagement; (2) social innovation engagement; (3) classroom engagement; and (4) experiential engagement. Growing out of Tulane’s recognized strength in public service and service-learning, as well as students’ keen interest in engaging in public service programs, the project will expand opportunities for more students and faculty to participate in meaningful, high-impact practices and learning experiences that complement their academic and career goals.
NSSE data related to the activities of CELT will be used as baseline indicators, and future results will be used to monitor student participation and educational effectiveness. For example, NSSE items related to working with other students on projects during class will serve as a proxy for engaged classroom activity, and participation in undergraduate research and service-learning will provide feedback on participation in high-impact activities. Highlights of Tulane’s assessment plan include the mapping of learning outcomes to assessment activities and the use of multiple measures and methods. To assess the extent to which involvement in the CELT activities relates to the learning outcome of “effectively live and work in a culturally complex society,” Tulane will collect evidence using the Association of American Colleges and Universities’ Intercultural Knowledge and Competence rubric and review NSSE results on diverse interactions and gains in understanding people of other racial and ethnic backgrounds. Tulane’s plan promises to create an enriched environment for student learning and promote innovative approaches to teaching.
The University of North Carolina Wilmington (UNCW) has used five administrations of NSSE and one administration of the Collegiate Learning Assessment (CLA) as indirect and direct measures, respectively, to assess and guide revision of its general education core curriculum, the Basic Studies Program. UNCW is an Association of American Colleges and Universities (AAC&U) VALUE (Valid Assessment of Learning in Undergraduate Education) Partner campus, part of a multi-year, national project to develop rubrics for assessing general education learning outcomes. In 2008, UNCW’s efforts focused on developing 37 common learning outcomes that were used to select departments and courses from which student work would be assessed.
CLA scores were used to assess critical thinking and written communications skills. NSSE results were used to establish trends and to plan for longitudinal disaggregation of data by department and school. Concern over less than desirable results on NSSE items relating to integrating ideas or information from various sources also generated a rubric-based plan for assessing information literacy.
The Importance of Advising
To accomplish its goal of improving the effectiveness of advising programs, the administration and advising community at the University of Tennessee (UT Knoxville) examined a number of indicators, such as the ratio of students per advisor, information from student focus groups regarding their advising experiences, and a comprehensive program review by external consultants. They also examined student responses on NSSE items that align with the university’s advising program goals and learning outcomes, which include guiding students toward academic support services and programs in service-learning and undergraduate research. The items examined covered use of academic support programs; frequency of discussions about career plans with advisors or faculty; perceptions of the academic experience; participation in service-learning and undergraduate research; and frequency of diverse interactions.
A comprehensive campus initiative, Ready for the World, is designed to enhance students’ understanding of intercultural diversity and global affairs. As a result of a two-year assessment process, UT Knoxville has increased the number of full-time academic advisors, restructured orientation advising for first-year students—which includes extended contact with college academic advisors and individual advising sessions—and implemented a new advising policy that targets at-risk students, such as new transfers, students on probation, and those without declared majors.
The University of Texas at Tyler (UT Tyler) participates in NSSE to gather evidence for strategic planning and accreditation. UT Tyler’s 2009–2015 strategic plan, Inspiring Excellence, incorporates assessment of study abroad and global citizenship using NSSE results. Along similar lines, UT Tyler’s Quality Enhancement Plan (QEP), “Global Awareness through Education” (GATE), was submitted in 2010 for reaffirmation by the Southern Association of Colleges and Schools (SACS). The goals of the QEP are to infuse the general education curriculum with global issues and topics, create new student learning communities centered on a study abroad experience, and provide greatly expanded co-curricular activities on campus led by the GATE learning community students and faculty.
The Importance of Advising
West Chester University of Pennsylvania (WCU) participated in NSSE in 2008 and 2010 as a Pennsylvania State System of Higher Education (PASSHE) consortium member. Through consortium participation, PASSHE institutions appended questions on advising and course availability to the NSSE survey. Although WCU student responses in 2008 were mostly positive, the dean of undergraduate studies identified one area of concern—students did not feel they received high-quality advising. In response, advising became a major priority for the institution and the University Academic Advising Committee (UAAC) was charged with creating an improvement plan. The plan included a new classification of “internal transfer” to designate students who wish to change majors and those with undeclared majors, and the dedication of two advisors with comprehensive knowledge of all departmental requirements to this group. Orientation sessions for new first-year students, and a handout that describes the responsibilities of students and advisors, helped to clarify students’ understanding of the advising process. To further emphasize the importance of advising as teaching, the institution negotiated with the faculty union to include advising as part of the statement of expectations for faculty performance.
In spring 2011, the UAAC at WCU administered two additional internal assessments: a student satisfaction survey and individual departmental surveys. The UAAC is studying the results, along with data gathered from all other sources, on specific advising needs, topics discussed in advising sessions, accessibility and availability of advisors, and satisfaction with the advising experience. The UAAC also examined the relationship between the frequency and extent of advising and student satisfaction with the advising process across departments to develop a series of best practices. Rather than holding training workshops, faculty-advising liaisons from each department, about half of whom are department chairs, participate in “shared best practices” sessions. The meetings occur once a semester and provide an opportunity to exchange strategies and experiences. Since implementing these initiatives, WCU’s scores on advising-related items from its NSSE 2010 administration have shown improvement.
Focusing on Engagement at the Department Level
Wofford College uses NSSE results to identify strengths and weaknesses in the undergraduate experience and promotes the use of disaggregated data at the department level. Specifically, a campuswide initiative encourages departments to use NSSE data to enhance curricular offerings and improve teaching practices. Departments were asked to review their NSSE results and then organize retreats to discuss how their departmental missions and student learning outcomes might be informed by the data. For example, if improving critical thinking is a learning outcome goal for a department, faculty would examine their students’ scores on several NSSE items related to this area. When the data revealed that computer science students were underperforming on presentation skills, the department organized workshops and guest lectures on public speaking. The department of foreign languages correlated results from NSSE with those from formal foreign language assessment instruments and found that study abroad is strongly related to student engagement and the achievement of desired departmental learning outcomes, an analysis of the kind sketched below.
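The foreign language department’s correlation analysis is described only in outline. As a rough illustration under simulated data, relating an engagement measure to a formal assessment score and splitting by study-abroad participation might look like this (all variables and values here are invented for the example):

```python
# Illustrative sketch only: correlating an NSSE-style engagement measure with
# a formal language assessment and comparing study-abroad participants, in
# the spirit of Wofford's departmental analysis. All data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
abroad = rng.random(120) < 0.4                        # simulated study-abroad flag
engagement = 50 + 8 * abroad + rng.normal(0, 5, 120)  # engagement rises with abroad
language = 0.5 * engagement + rng.normal(0, 4, 120)   # assessment tracks engagement

r, p = stats.pearsonr(engagement, language)
print(f"engagement-assessment correlation: r={r:.2f}, p={p:.4f}")
print(f"mean engagement, abroad={engagement[abroad].mean():.1f} "
      f"vs. not={engagement[~abroad].mean():.1f}")
```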
Wofford has used NSSE results in its marketing campaigns and posts results publicly on the home page of its institutional Web site. NSSE results are included in a four-page brochure, Measuring Student Engagement—Learn What Your Student Will Actually Get, distributed to alumni groups, including the Alumni Executive Council, and used by admissions staff with visiting prospective students and high school counselors. An accessible interpretation of NSSE benchmark results, along with suggested questions such as “How does the survey of student engagement work at Wofford and other participating colleges?” and “How do colleges measure their performance in engaged learning?,” helps explain Wofford’s NSSE results. Finally, the institutional Web site, www.wofford.edu, features a prominent link under the “Admissions” menu to information on NSSE, Wofford’s 2010 NSSE results, and a statement on the institution’s commitment to institutional transparency.
NSSE results have helped spark changes in admissions criteria at Wofford College. Specifically, community service and civic engagement are important aspects of student life at Wofford with students engaging in service not only in their local communities but also abroad. For example, many Wofford students have taught in elementary schools in Guatemala or worked in an HIV/AIDS clinic in Paris. As a result of the emphasis placed on community service and civic engagement among undergraduate students, Wofford College has begun to emphasize volunteer experience when reviewing the applications of prospective students.
Lessons from the Field—Volume 1: Using NSSE to Assess and Improve Undergraduate Education
Northern Michigan University
Making NSSE Data Part of a Systematic Assessment Approach
California State University Northridge (CSUN) has participated in NSSE four times over the past five years. Data from its NSSE 2007 administration were widely circulated for the first time on campus by the Office of Institutional Research (IR). CSUN had participated in a paper administration in 2006, as part of the BEAMS project. This yielded relatively small numbers of responses, making results less reliable, so they were not widely circulated. The current Director of Institutional Research, Bettina Huber, opted for the web-only approach in 2007, along with an oversample of first-year (FY) students, with the result that just over 1,900 students completed the NSSE survey. Thanks to these “good numbers,” meaningful subgroup analysis was feasible for the first time.

All departments and colleges at CSUN are expected to provide annual planning reports. To assist with this process, the IR office provided tables broken down by college (see Table 1) as part of a general overview of the 2007 NSSE findings presented at a spring 2008 session of the Provost’s Professional Development Series. The NSSE senior data, broken down by college, served as a focus for college-specific discussion groups held at the end of the session.

Using the rich data from the oversample of FY students, Huber is currently comparing the progress of FY students who participated in University 100 (an introduction to the University) with those who did not take the class. She is also examining differences in engagement among seniors who entered the university as first-year students or later in their college careers as community college transfer students. Rather than using the NSSE Benchmarks of Effective Educational Practice, CSUN’s IR staff members have developed their own groupings of survey items to inform different educational processes students experience and to evaluate the effectiveness of certain campus services. For example, student responses regarding academic advising have been helpful in locating individuals’ difficulties with advising services within a broader context of fairly widespread satisfaction. Huber has also found the option to select peer comparison groups very useful and looks at the performance of CSUN in relation to other CSU campuses and other large public, primarily nonresidential institutions.
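The disaggregation behind CSUN’s college-level tables and University 100 comparison is straightforward and can be sketched as follows. The column names, item, flag, and values are invented for the example; CSUN’s actual reports were produced by its IR office from the full response file.

```python
# Illustrative sketch only: breaking institution-level NSSE results down by
# college and by University 100 participation, as in CSUN's analyses.
# All data are invented.
import pandas as pd

responses = pd.DataFrame({
    "college":  ["Business", "Business", "Science", "Science", "Arts"],
    "univ100":  [True, False, True, False, True],   # hypothetical participation flag
    "advising": [3.2, 2.8, 3.5, 3.0, 3.1],          # e.g., a 1-4 advising item
})

# College-level tables for annual planning reports
print(responses.groupby("college")["advising"].agg(["mean", "count"]))

# First-year comparison: University 100 participants vs. non-participants
print(responses.groupby("univ100")["advising"].mean())
```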
CSUN was among a few dozen colleges in the nation participating in the fall 2007 beta test of the VSA and provides information to the College Portrait, a common web template that institutions can use to meet the following objectives:
Demonstrate accountability and stewardship to the public
Measure educational outcomes to identify effective educational practices
Assemble information that is accessible, understandable, and comparable
CSUN will continue to use NSSE data in future activities such as planned improvements to various university programs and exploring possible differences in the experiences of first-time FY and transfer students. Huber would also like to examine in more detail specific NSSE item clusters, such as those that comprise what she calls “abstract thinking skills” (judgment, analysis, memorizing) and are included in the Level of Academic Challenge benchmark.
Making NSSE Data Part of a Systematic Assessment Approach
Clemson University has administered NSSE for seven consecutive years, beginning in 2003. A campus NSSE team was formed to provide faculty and administrative staff with resources and information about how to use NSSE in practice and how to enhance the campus’s survey administration. Recently, the campus has renewed its efforts to share NSSE results across campus and to hold meaningful conversations about putting the results into practice.
Clemson’s president, James F. Barker, has set an institutional goal to become one of the top 20 public institutions in the nation by 2011. To reach this goal, increased focus has been placed on intentional data collection to ensure that all assessment instruments utilized are providing useful and actionable data. In addition to NSSE, Clemson participates in COACHE, a job satisfaction survey created by the Collaborative on Academic Careers in Higher Education at Harvard University that gathers information on tenure-track faculty, and in the VSA.
For the first time, in 2007, Clemson chose to customize its NSSE comparison groups. Eleanor Nault, Clemson’s Director of Assessment, reports that this option has made the data reports much more useful. She has also found the NSSE benchmarks very helpful for institution-level analysis. In addition to individual campus goals, the South Carolina State Budget and Control Board requires that all higher education institutions apply the Baldrige Criteria® reporting guidelines used to measure organizational performance; the Board adapted national criteria for educational quality to align with the Baldrige framework. In its accountability report to the State Board, each institution must benchmark its performance against these criteria. Clemson accomplishes this task by integrating NSSE, VSA, and other institutional data.
NSSE results have been presented to the entire Division of Student Affairs, sparking productive discussions concerning areas where the campus is succeeding in connecting with students and areas that may require some attention. Stemming from the increase in emphasis on effective assessment measures at Clemson, the new Vice President of Student Affairs, Gail DiSabatino, invited Dr. George Kuh, Director of the Indiana University Center for Postsecondary Research and then Director of NSSE, to campus in fall 2007. Dr. Kuh suggested that assessment efforts work to identify underengaged students. Clemson’s NSSE data has since been aggregated to identify large enough groups of students to successfully pinpoint characteristics of those who may be underengaged.
Given the number of years the campus has administered NSSE, their pool of respondents is large enough for this method to be effective. Presented with NSSE data, Clemson faculty members expressed concern over student reports of too few in-class discussions that address issues of diversity. The campus has since determined that faculty and students may have been interpreting the question differently. However, preserving the classroom as a safe space for conversations on diversity is very important to the University and faculty have been offered opportunities to learn more about teaching methods to engage students in these types of discussions. In addition, workshops on other types of pedagogical strategies have been developed and offered to faculty members.
NSSE data has been tied to other campus decisions at Clemson. Over the past three years, Clemson has initiated Creative Inquiry Projects—undergraduate research activities where faculty members guide small groups of students through a multi-semester project in various disciplines. Projects are designed to help students develop problem-solving and critical thinking skills, as well as the abilities to work on teams and express themselves effectively in written and verbal communication. A campus press release from last year highlighted how the Creative Inquiry program and other initiatives such as internships and cooperative experiences had, according to Clemson’s 2007 NSSE results, increased the numbers of students participating in undergraduate research to a level significantly higher than institutions in Clemson’s selected peer group.
Looking forward, Clemson plans to use NSSE data to evaluate first-year programs such as living and learning communities. Clemson’s response rate is approaching a level at which students involved in these communities make up a large enough share of the random sample of its student population to permit more targeted analysis of their responses.
Student-Faculty Interaction
Located in a small town on the historic Erie Canal, 16 miles west of Rochester, NY, and about 45 miles east of Buffalo, The College at Brockport, State University of New York (SUNY Brockport) was one of the founding institutions in the pilot (2003–2004) of the Foundations of Excellence® in the First College Year process and participated in NSSE for the first time in 2004. Since then, the institution has participated in the survey every year. After receiving NSSE results for several years, department chairs at Brockport began to express interest in the survey and ask about the responses of their specific students. To better help faculty serve students, Lillian Zhu, Director of Institutional Research and Planning, utilized the group variable columns in the population file to identify the academic majors of students. She then created binders for each department that included NSSE mean comparison and frequency distribution reports for students in that department over a span of four years, compared to the entire Brockport sample. In addition, she and her institutional research (IR) team wrote a one-page summary detailing specific results that department chairs should pay special attention to, both to highlight and to improve their efforts.
Zhu and her IR team also provided reports to the Educational Opportunity Program (EOP), the Honors program, and the Delta College program, an alternative to the traditional General Education program. Delta College offers students an interdisciplinary approach to required courses with a special focus on career preparation. Students work closely with faculty and take up to 10 classes together as a cohort. Zhu continued working with department chairs and faculty following the distribution of the binders. Brockport also participated in FSSE three times, from 2006 to 2008. Through various presentations to and discussions with school deans, Zhu addressed differences, or mismatches, between faculty and student perceptions revealed by comparing FSSE and NSSE results. For example, the amount of time faculty indicated students should be investing in class was very different from the amount of time students actually reported. These discussions have led to the development of several action plans to improve the undergraduate experience at Brockport. With all SUNY schools participating system-wide in 2008, Zhu looks forward to using comparison data to review Brockport’s performance relative to other system institutions.
Indiana University-Purdue University Indianapolis (IUPUI), a large public university in downtown Indianapolis, Indiana, has participated in NSSE four times since the survey’s launch in 2000. To make use of NSSE results, IUPUI has mapped NSSE data to campus-wide principles of undergraduate learning for curriculum and co-curricular development (service-learning, research with faculty, and study abroad) and uses the data as performance indicators in those critical areas. NSSE data are also mapped to performance indicators for strategic planning and institutional improvement (specifically, gains on diversity goals, technology use, and participation in service-learning), used to corroborate data from other in-house instruments (advising), and used to evaluate first-year programs. To inform staffing decisions, the Office of Information Management and Institutional Research at IUPUI has presented NSSE data to the Board of Trustees and various units and departments highlighting the positive engagement and educational impact of on-campus employment, and has encouraged departments and units to hire more students to fill staff positions.
Several concrete changes at IUPUI have been motivated by NSSE data. Based on participation results related to service-learning in thematic learning communities, stronger linkages between service experiences and learning outcomes have been created and opportunities for participation in service-learning have been increased. A new program, RISE to the Challenge, was implemented to ensure that all students take part in at least one of four high-impact experiences prior to graduation: undergraduate research, study abroad, service-learning, or an internship.
IUPUI also participated in FSSE in 2006. Based on its results, IUPUI implemented curricular changes focused on diversity; specifically, faculty use of diverse perspectives in the classroom. IUPUI also explored other areas of disconnect between faculty and student responses.
Northern Michigan University (NMU) is a public university with 9,400 undergraduate and graduate students. Northern, located in the Upper Peninsula, is also one of three universities in the state of Michigan to serve a community college role for its region. NMU is noted for its focus on using technology in higher education and is one of the largest notebook computer campuses in the US; full-time students receive either a ThinkPad or an iBook as part of tuition. Paul Duby, Associate Vice President of Institutional Research at NMU, chose to participate in NSSE because he felt it was a survey instrument that measures holistic and affective learning processes. NMU places great emphasis on encouraging students to get involved in service-learning. The Superior Edge program, which currently has over 1,500 students enrolled, combines community engagement, diversity awareness, leadership development, and real-world experience. Duby considers NSSE the best instrument for assessing the impact of service-learning through meaningful constructs of processes and outcomes.
Using NSSE Results to Study “Sophomore Slump”
Pace University has participated annually in NSSE since 2002. Results have been shared extensively with the Board of Trustees, President’s Council, and senior administrative councils. The Provost’s office has placed special emphasis on sharing results with faculty and the entire university community. NSSE results have not only been shared, but have been acted on and incorporated into various institutional assessments. Early on, the Office of Planning, Assessment, and Institutional Research, along with the University Assessment Committee, teamed up with the Pforzheimer Center for Faculty Development and the Center for Teaching, Learning, and Technology to present Faculty Development Days reviewing NSSE results. The programs prompted discussion among faculty concerning NSSE Benchmarks of Effective Educational Practice such as Academic Challenge, Active and Collaborative Learning, and Student-Faculty Interaction. Best practices were also shared.
In addition to sharing NSSE data with the various administrative councils, individual deans and department heads requested presentations on the results for their faculty and staff. Each year, interest in “how we are doing” grows within the institution. Several NSSE items helped assess Pace’s progress toward specific goals of its strategic plan, which placed special emphasis on the goal of “student-centeredness.” NSSE items were also easily adapted to the goals and objectives of specific programs and initiatives, such as measuring progress in service-learning, developing capstone experiences, increasing participation in study abroad, reaffirming Pace’s commitment to a diverse learning environment, and increasing positive student self-reports of mastering the learning objectives of the 2003 Core Curriculum. Some specific examples follow that further illustrate how Pace has used NSSE results for institutional improvement.
Improving the Sophomore Experience
Pace University had long provided coordinated programs to promote first-year students’ success. These efforts seemed effective, as evidenced by a first-year retention rate that stabilized at 76–77% beginning with the fall 2000 cohort. However, no special initiatives or programs addressed the needs of students in their sophomore year, and there was growing concern over a retention rate that dropped by more than 9% after two years. Motivated by this persistence data and the success of the first-year experience, the Sophomore Working Group, composed of faculty, academic administrators, and student affairs professionals, began to focus on developing a special program or “experience” for sophomores.
In reviewing NSSE 2004 first-year results, the Working Group sought to better understand areas where Pace was doing well and those that needed improvement in students’ relationships with faculty, other students, administrators, and staff. Schreiner and Pattengale’s (2000) Visible Solutions for Invisible Students: Helping Sophomores Succeed provided the group with additional insights into the sophomore year. They found that the phenomenon of “sophomore slump” corresponded with a number of NSSE questions, so the Working Group incorporated these items into a short survey that was administered to sophomores to assess the extent to which students might be experiencing this phenomenon. Findings from student responses to this survey revealed, for example, that relationships with faculty played a critical role in students’ assessment of their educational experiences and achievements, and that specific bureaucratic procedures for registration, financial aid, and payment of fees were a source of frustration for students.
Sophomore focus groups were also conducted to further contextualize NSSE responses to the Pace environment. Focus group findings were consistent with previous focus groups conducted among a larger sample of the general Pace student population, and indicated sources of students’ satisfaction and their key reasons for attending and remaining at Pace. Specific actions and programs resulted from the findings of the Sophomore Working Group, including the development of comprehensive transition and support programs for sophomores such as the “Pace Plan,” a comprehensive model for both academic and career advisement; the expansion of faculty mentoring opportunities to increase quality interactions with faculty; and the restructuring of the registrar, bursar, and financial aid offices.
Digging Deeper: Examining Variation Within and Over Time
In an effort to provide usable results to each school or college, Pace conducted a local NSSE administration in 2005. This provided larger samples for each of its schools and resulted in a more insightful profile of their students’ engagement experience. The Office of Planning, Assessment, and Institutional Research used NSSE results to determine whether there were significant differences in the engagement experience of students across schools and colleges.
Pace used NSSE data to carry out additional studies on the experiences of transfer students compared with native students, commuters compared with resident students, and first-generation students. In addition, professional schools at Pace have used NSSE results in their accreditation efforts with AACSB, ABET, CCNE, and NCATE, and Pace incorporated NSSE results in its Middle States Self-Study in preparation for a spring 2009 reaccreditation visit. Pace also looked over time at two satisfaction questions on NSSE to identify relationships between engagement practices and membership in one of two extreme satisfaction groups: “Low Satisfaction,” defined by a “poor or fair” rating combined with a “definitely or probably would not repeat the experience” response, versus “High Satisfaction,” defined by a “good or excellent” rating combined with a “probably or definitely would repeat the experience” response. Although results indicated that the trend in most areas was one of improvement, the percentage of unambiguously satisfied students (i.e., those who found the experience satisfactory and would attend the same institution again) hovered between 65% and 70% over a five-year period, while Pace’s Carnegie peers consistently scored higher.
In all, 37 engagement activities correlated positively with student satisfaction and perceptions of the Pace experience. The analysis demonstrated that the engagement activity most strongly correlated with student satisfaction was the quality of academic advising. This was followed by “provided the support to help you succeed academically,” “quality of your relationships with faculty members,” “course work contributed to acquiring a broad general education,” and “quality of your relationships with administrative personnel and offices.”
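A rough sketch of this two-step analysis follows: classify respondents into the extreme satisfaction groups, then correlate engagement items with group membership. The variable names (“entirexp,” “samecoll,” and the engagement items) and their 1–4 codings are assumptions for illustration, not Pace’s actual file layout.

```python
import pandas as pd

df = pd.read_csv("nsse_pace.csv")  # hypothetical pooled five-year file

# "Low Satisfaction": poor/fair rating AND would not repeat the experience.
# "High Satisfaction": good/excellent rating AND would repeat the experience.
# Assumed codings: 1=poor..4=excellent; 1=definitely no..4=definitely yes.
low = (df["entirexp"] <= 2) & (df["samecoll"] <= 2)
high = (df["entirexp"] >= 3) & (df["samecoll"] >= 3)

extremes = df[low | high].copy()
extremes["satisfied"] = ((extremes["entirexp"] >= 3) &
                         (extremes["samecoll"] >= 3)).astype(int)

engagement_items = ["advise", "envsuprt", "facrel", "gnbroad"]  # hypothetical
# Correlate each engagement item with membership in the high-satisfaction
# group, then rank to surface the strongest associations.
corrs = extremes[engagement_items].corrwith(extremes["satisfied"])
print(corrs.sort_values(ascending=False))
```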
The University Assessment Committee disseminated the findings of the five-year student satisfaction analysis as widely as possible, beginning with the University’s leadership: all members of the President’s Council, which included the Vice President of Student Affairs, and all members of the University’s management team. Results from the five-year study and a report highlighting Pace’s NSSE results were also shared with the Board of Trustees. Faculty members were a prime audience for the satisfaction results, since many of the activities identified were within their control. Because faculty members are often faced with reports of what is wrong, the Assessment Committee thought it was especially important for them to see what was “right.”
The Assessment Committee published a newsletter reporting on the five-year study of NSSE results that was sent through the provost’s Listserv to all Pace faculty in late March 2007. To follow up, a workshop on the study was presented at the annual Faculty Institute in May.
Results Influence Revision of Freshman Seminar
The five-year satisfaction results fed directly into an issue receiving a great deal of attention and concern at Pace: a proposed revision of the Freshman Seminar, UNV 101. One of the most important proposed changes was to have full-time faculty from each of the schools and the college teach the UNV 101 course. In the past, professional staff and long-time adjunct faculty had taught the seminar, along with a handful of full-time faculty. The NSSE student satisfaction results gave the associate provost additional evidence to convince deans and full-time faculty that assigning full-time faculty to UNV 101 would have a significant impact on the first-year experience. Because the instructor of the seminar also served as the student’s advisor, a second change extended the faculty member’s advisory role from a one-semester to a year-long relationship with the student. First-year students would be assigned to seminar sections based upon their professional school or college selection, bringing them into early contact with a full-time faculty member from their school or college in a meaningful advisory relationship. With the help of NSSE evidence to strengthen the proposal, fall 2007 UNV 101 sections benefited from the expertise of 57 full-time faculty members.
Results Inform Reorganization of Student Services
The satisfaction study, which identified “quality of your relationships with administrative personnel and offices” as a contributor to student satisfaction, and the sophomore survey results, which revealed the need for improvement in student services, particularly the Registrar, Bursar, and Financial Aid, together made a strong case for the creation of “one-stop services.” In 2007, these offices were restructured and renamed the Office of Student Assistance. A new administrator was hired to oversee the operation, and a new series of assessments was performed to identify the most pressing problem areas. Pace’s president was keen on using engagement results for improvement and made student satisfaction a high priority. He extended Pace’s commitment to improving service delivery and has supported formal programs that empower Pace staff to take greater responsibility for resolving student problems. Student engagement data provided Pace University leaders with empirical evidence of areas where action and change were needed.
Making NSSE Data Part of a Systematic Assessment Approach
Peace College is a private liberal arts college for women located in Raleigh, North Carolina. Faculty-student interaction is central to Peace’s mission, and NSSE results serve as a gauge of how well the college is fulfilling that mission. Students’ reports of the extent to which they are challenged in their coursework are of major concern to faculty and administrators at Peace, and significant focus is placed on the NSSE benchmark Level of Academic Challenge. Two outcomes of this focus were the addition of a required statistics course to the general education curriculum and continuing conversations regarding pedagogy. Starting in 2008, Peace began pairing student performance data from the Collegiate Learning Assessment (CLA) with NSSE results to gather more in-depth data on this benchmark.
Having recently completed its eighth administration of NSSE, Peace has made the reporting of NSSE results part of the campus culture and an expected component of the campus’s regular assessment plan. Beyond its use as an assessment tool, Peace’s marketing office has used NSSE results as a public relations resource for the campus. Dr. David B. McLennan, Associate Dean for Institutional Effectiveness and Professor of Communication and Political Science, issues annual reports to the institution’s curriculum committee and presents NSSE data to the faculty several times each year, charging them to review specific aspects of the data. While these groups receive detailed information, senior administrators at Peace receive a broad overview of all NSSE results. Even though NSSE results are widely disseminated across campus, Peace would like to dig deeper into its NSSE data and plans to begin more comparative analyses.
Empowering Institutional Stakeholders to Convert NSSE Results into Action: Diversity, Student Affairs, Faculty, Communicating Results
Texas A&M University (TAMU) opened in 1876 as the first public institution of higher learning in the state. TAMU currently enrolls more than 46,000 students, 8,500 of them graduate students, and about 25% of first-year students are the first in their family to attend college. NSSE data have been used in a variety of ways at TAMU. Texas law requires public universities to report student satisfaction data to the state, and since 2001 the Texas A&M University system has used NSSE as a tool to report these data. Mark Troy, Associate Director of Measurement and Research Services, sends NSSE results to all deans along with a custom-tailored college-level analysis of the data. Several colleges, such as Agriculture and Liberal Arts, have used NSSE data for their institutional effectiveness reports.
Several task forces at TAMU were established to better measure the institution’s progress in serving students. One of these task forces focuses exclusively on writing. In 2005, another task force identified 20 characteristics a TAMU graduate should possess, one of them being the ability to write effectively. When compared with other institutions, however, TAMU students were not performing as well as their selected peers on this characteristic. After targeted analyses of specific NSSE item responses related to communication, TAMU established the University Writing Center, a student-calibrated peer-review program, and “W courses” (courses with intensive writing components) in many majors. Similar targeted analyses of NSSE items related to student research have been conducted in support of TAMU’s initiative to enhance the undergraduate experience through inquiry/research-based education.
To share NSSE results and encourage campus-wide interest in the assessment process, Troy has made presentations on NSSE to the University Assessment Committee, the group that handles all assessment-related topics for the University, and to some of the college assessment committees. Troy has also found the NSSE pocket guide to be a very useful tool for sharing NSSE results. He and his staff pulled out TAMU’s results related to the guide questions and compiled a report that was sent across campus to academic advisors and admissions officers.
Making NSSE Data Part of a Systematic Assessment Approach
The University of Denver (DU) has a tradition of administering the NSSE survey, and Janette Benson, Director of the Office of Academic Assessment, hopes to enhance that tradition by looking at new ways to utilize NSSE results. The oldest private university in the Rocky Mountain region, DU serves almost 5,000 undergraduate students. Based on previous NSSE benchmarking results, the institution shows high levels of engagement in comparison with its peers. Benson believes DU’s NSSE results have the potential to provide the University with much more than comparative information.
Benson is enthusiastic about including analysis of NSSE results as one component of the institution’s larger academic assessment plan. Drawing on her background as a cognitive developmental psychologist, Benson plans to disaggregate NSSE results. “We should be using NSSE as part of a direct assessment of what the University of Denver is doing on campus for different groups of students,” said Benson. As a faculty member and administrator, she is optimistic that she can help DU’s faculty continue to find value in using NSSE results to learn more about their students.
Institutional Research and senior administrators were particularly excited about DU’s 2008 BCSSE administration. Benson hopes that participating in both NSSE and BCSSE will allow the University to collect longitudinal data on incoming students. She believes the institution could benefit greatly by looking at students with both low and high engagement scores on NSSE and examining their prior high school experiences and expectations for college.
In the future, Benson plans to examine the types of learning that occur in different educational programs at DU. She will begin by assessing some of the key features of general education unique to DU. Using evidence from the field of cognitive developmental psychology that suggests a higher level of learning goes on in areas where students have the most motivation and expertise, such as their majors, Benson hopes to dig deeper and break down NSSE results by major area of study. She believes the outcomes within the majors might be a better indicator of what students are actually learning.
NSSE comparison reports are a beneficial part of the overall NSSE survey results, according to Benson, and she uses them to benchmark DU’s performance against other schools. By making a commitment to incorporate NSSE into the overall institutional assessment plan, Benson and others will be able to use the data for more targeted analysis. She believes this approach will eventually help DU fully understand how to best educate students in accordance with its institutional mission.
Making NSSE Data Part of a Systematic Assessment Approach
The University of California (UC) Merced is the first new American research university of the 21st century. NSSE was part of the UC Merced assessment strategy beginning with its very first student body in the 2005–2006 academic year. In addition to fulfilling accreditation requirements, administrators at UC Merced saw a need for data to track progress over time as both the students and the campus develop and grow, or formative feedback, as Nancy Ochsner, Director of Institutional Planning & Analysis, describes it.
Prior to beginning their positions at UC Merced, key administrators such as Ochsner had worked with NSSE data at other institutions. When they were hired by UC Merced and tasked with opening a new university, these administrators immediately turned to NSSE as a source of credible data to help them monitor student support services and encourage faculty to embrace a holistic view of students’ UC Merced experience. In particular, administrators involved in assessment and planning hope to use NSSE data to help faculty and staff understand the student experience and maintain effective academic and co-curricular connections with students. UC Merced has not yet received a Carnegie classification; in the future, it will be classified as a research university. For now, however, the ability to compare the institution with other institutions participating in NSSE is critical to UC Merced administrators. The customization of peer comparison groups is particularly important as the institution grows and expands: administrators want to be able to benchmark UC Merced experiences both with research universities (mostly much larger) and with selective liberal arts colleges (more similar in size).
NSSE allowed the campus to define both groups for comparisons and, as UC Merced develops, they will redefine their comparison groups appropriately. UC Merced has received data from two past NSSE administrations and is participating again in 2009. Results already reflect the unique UC Merced student experience. For example, during the first academic year of operation, there were no classrooms on the campus and classes were held in the library as facilities were being built and opened for use. Responses to NSSE reflected that experience. NSSE responses also mirrored student responses to other surveys conducted by UC Merced, including the system-mandated University of California Undergraduate Experience Survey (UCUES).
UC Merced administrators carried out additional analyses to further confirm that NSSE results reflected the student experience on campus. They disaggregated NSSE data using different demographics to understand the experiences of a number of selected groups such as first-generation and transfer students, students of different races or ethnicities, and students in different majors. UC Merced considers these analyses, made possible by using NSSE raw data, to be essential for an institution with such a diverse student body.
UC Merced has also made use of other NSSE resources, particularly the PowerPoint presentation included with the Institutional Report and research papers and presentations available on the NSSE website. These materials have helped administrators and staff make sense of the large amount of data returned to NSSE participants, and to share results with other campus audiences. In sorting through the data, Ochsner found it helpful to focus attention on effect sizes since there was so much information to process.
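The two analysis habits described in the preceding paragraphs, disaggregating raw data by student group and reading differences through effect sizes, can be combined in a few lines. The following is a generic sketch under assumed column names, not UC Merced’s actual files or variables.

```python
import numpy as np
import pandas as pd

df = pd.read_csv("nsse_raw.csv")  # hypothetical raw-data download
item = "facideas"                 # hypothetical engagement item

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                     / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled

# Disaggregate by a demographic column and compare each group with the rest;
# "first_gen" could equally be race/ethnicity, transfer status, or major.
for group, grp in df.groupby("first_gen"):
    rest = df.loc[df["first_gen"] != group, item].dropna()
    d = cohens_d(grp[item].dropna(), rest)
    print(f"first_gen={group}: n={len(grp)}, mean={grp[item].mean():.2f}, "
          f"d vs. rest={d:+.2f}")
```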
Thus far, UC Merced administrators have shared their NSSE data with senior administration, including the Chancellor’s Cabinet and various deans, and students, including the Vice Chancellor for Student Affairs Advisory Group. “The students at UC Merced get excited about the findings too,” Ochsner told NSSE staff. “The goal is to familiarize them with survey data,” she continued. Some next steps for Ochsner and other administrators involved in assessment include beginning discussions with faculty on using NSSE data and working on their partnership with the newly formed Center for Research on Teaching Excellence.
Student-Faculty Interaction
The University of Cincinnati (UC), a public research university in Cincinnati, Ohio, uses NSSE results to assess ongoing initiatives and establish new ones. In response to student satisfaction and technology use scores, the university established a “One-Stop Service Center” and provided students with 24-hour access to selected computer labs. UC has also used NSSE results to inform curriculum planning as it has expanded learning community offerings. Additionally, NSSE participation and data have been used to fulfill assessment requirements for the Ohio Needs Grant, a student success grant with funding tied to institutional improvement and new program evaluation.
When asked about the primary use of NSSE data, Caroline Miller, Senior Associate Vice President and Associate Provost for Enrollment Management, replied, “We use it to inform staffing decisions and to determine student satisfaction levels and the quality of services and experiences (academic and social) students have—particularly in regards to diversity matters.” NSSE results are shared on campus with individual colleges and student affairs units, enrollment management, and committees and task forces. These groups use NSSE in conjunction with other data to assess specific areas such as recruitment, retention, student satisfaction, and involvement and participation levels by race and gender. NSSE results are also regularly included in the President’s Report Cards, www.uc.edu/reportcard, a publicly available document published for the Board of Trustees and intended to show university performance on key indicators.
The University of Dayton (UD) is one of the nation’s 10 largest Catholic universities and Ohio’s largest private university, with an enrollment of 6,800 full-time undergraduates and more than 70 academic programs in arts and sciences, business administration, education and allied professions, engineering, and law. UD participated in NSSE in 2004, 2005, and 2007, which will allow the institution to identify student engagement trends over time and to evaluate the responses of subgroups of students who completed the survey in both their first and senior years. NSSE results, along with other assessment data, will help the University draw a more complete picture of its students and programs.
Academic divisions and departments have used NSSE analyses to identify areas of strength and possible areas of concern. Divisional deans received reports of student engagement results for specific colleges compared with all other students at the institution, and for individual departments compared with other students in the division. By drilling down into the data, institutional leaders gained a profile of students in various majors as well as comparisons with students in other departments and divisions. For example, the institution compared the engagement levels of first-year students who persisted at the university with those of students who withdrew. The findings were not surprising: students who persisted spent more time with instructors, felt they got more feedback on assignments, and participated more frequently in classes. These data helped define a basic core of experiences that contributed to students’ success. The School of Engineering at UD used NSSE data to assess its approach to first-year advising. After comparing student ratings of advising on NSSE before and after program changes, the school decided to keep the new advising system for now and will continue to monitor students’ ratings of advising.
Empowering Institutional Stakeholders to Convert NSSE Results into Action: Diversity, Student Affairs, Faculty, Communicating Results
The University of North Carolina Wilmington (UNCW) enrolls over 10,000 undergraduates who pursue 73 baccalaureate programs and, for over 10 years, has been recognized as one of the top public universities in the South. A recent conversation with university leaders from institutional research, academic affairs assessment, and student affairs assessment revealed how this institution has empowered various stakeholders to use NSSE results to take meaningful action in their respective areas. For UNCW executive leadership, NSSE provided a snapshot of student engagement at the university as well as a comparison of UNCW students with self-selected peers. Assistant Vice Chancellor Lisa Castellino, who coordinated NSSE administration and oversaw the dissemination of findings, discovered that visually representing NSSE results was highly effective for large and varied audiences. Castellino used graphics to cluster UNCW results on the five NSSE benchmarks and to represent its survey scores relative to peer institutions. She used arrows to indicate whether the institution’s mean was above or below that of its comparison groups and different colors to denote the strength of the differences. Her visual presentation of findings, along with a summary of areas of strength, progress, and mixed performance, helped make the data easily understood by all campus audiences.
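One way to mimic this reporting convention, direction shown as an arrow and strength as a color or label, is sketched below. The benchmark names are real NSSE benchmarks, but the numbers are invented for illustration and the effect-size thresholds are arbitrary choices for the sketch, not Castellino’s.

```python
def summarize(benchmark, inst_mean, peer_mean, effect_size):
    """One benchmark row: arrow for direction, label standing in for color."""
    arrow = "up" if inst_mean > peer_mean else "down"
    size = abs(effect_size)
    strength = "small" if size < 0.3 else "moderate" if size < 0.5 else "large"
    return f"{benchmark:<35} {inst_mean:5.1f} vs {peer_mean:5.1f}  {arrow:>4} ({strength})"

# Invented numbers for illustration only.
rows = [
    ("Level of Academic Challenge", 55.2, 53.8, 0.12),
    ("Active and Collaborative Learning", 48.9, 51.6, -0.31),
    ("Student-Faculty Interaction", 40.4, 39.9, 0.05),
]
for row in rows:
    print(summarize(*row))
```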
The Office of Institutional Research and Assessment at UNCW recently made NSSE data available through a secure server. The Office had been challenged to respond to various requests from university assessment experts for specialized analyses of NSSE data. Providing server access to the NSSE data enabled campus assessment professionals to conduct their own analyses related to their specific functional areas, and NSSE data have become a living resource for decision making at the university. Future plans include providing access to multiple years of NSSE data and adding more UNCW-specific information, such as student residence, academic major, grade point average, and standardized test results.
Additionally, the Division of Student Affairs has used NSSE data as an impetus for improvement in the areas of advising, diversity, and co-curricular activities. In an initiative focused on career planning and advising, the Division found that survey responses from students of color and those majoring in physical sciences and engineering indicated that they were less likely to consult with faculty or advisors about their career plans. The Division recruited more career advisors of color, developed a mentoring program for minority students, and increased staffing to reach out to physical sciences and engineering majors. NSSE results related to diversity also showed variation in the frequency that students from different racial and ethnic backgrounds participated in conversations with diverse peers.
These findings prompted the Division to host diversity workshops and conferences, to increase support for select subpopulations, and to create new staff positions for advising multicultural organizations and conducting multicultural programming. Finally, NSSE findings and other feedback led to a UNCWeekends campus initiative to increase co-curricular engagement.
The Watson School of Education at UNCW has used NSSE data to develop summary reports that compare student engagement results in individual departments with all other students at the university. NSSE data on diversity have also been made available to the Watson School Diversity Committee, where they served as part of the basis for discussions that led to the planning and implementation of a diversity showcase. Combining and averaging three years of NSSE results gave academic departments with smaller majors additional respondents, which provided more reliable measures. These reports, organized around the five NSSE benchmarks, offered descriptive summaries and item-level frequencies for first-year and senior students. While NSSE findings are becoming more integrated into the decision-making process, the response rate remains an obstacle for academic units like the Watson School of Education.
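Pooling administrations as described in this paragraph can be as simple as stacking the yearly files before grouping. A hypothetical sketch follows; the file names, “dept” and “student_id” columns, and the item name are placeholders, not the Watson School’s actual data layout.

```python
import pandas as pd

years = [2005, 2006, 2007]  # any three administrations with a shared layout
pooled = pd.concat(
    [pd.read_csv(f"nsse_{y}.csv").assign(year=y) for y in years],
    ignore_index=True,
)

# Per-department counts and item means across all three years; departments
# with small majors now have enough respondents for more stable estimates.
summary = pooled.groupby("dept").agg(
    n=("student_id", "count"),
    mean_facgrade=("facgrade", "mean"),  # hypothetical NSSE item
)
print(summary.sort_values("n"))
```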
Promoting Student Engagement Through Shared Leadership and Collaboration
The University of Tulsa (TU) is a private doctoral degree granting university with an average student enrollment of slightly over 4,100. Of this total, approximately 3,000 are undergraduates. The campus is close to downtown Tulsa, OK, an urban center with a population of 550,000. The University’s mission reflects the core values of excellence in scholarship, dedication to free inquiry, integrity of character, and commitment to humanity.
The decision to participate in NSSE was made by the Vice Provost for Academic Affairs in 2001. Institutional administrators were motivated to learn more about the experiences and expectations of their students and to discover whether faculty and staff impressions of students were accurate. TU participated in NSSE in 2001, 2004, and 2007, and in the Faculty Survey of Student Engagement (FSSE) in 2004 and 2007. In addition, the College of Law administered the Law School Survey of Student Engagement (LSSSE) in 2004 and 2007.
Winona Tanaka, a Clinical Associate Professor from the TU College of Law, is the current Vice Provost and Associate Vice President of Academic Affairs whose responsibilities include heading up administration of NSSE and working with survey results. Over the past five years, she has actively promoted the use of NSSE results for assessment and planning across campus. In addition, after attending a NSSE users workshop, her office has provided funds for two faculty members, the dean of students, and several senior administrators from enrollment services and university assessment to attend additional NSSE workshops. Dr. Alex Wiseman, an Assistant Professor in the School of Education, attended a NSSE workshop and later delivered presentations to the Student Services staff at TU using many of the materials—such as handouts, slides, and exercises—he had gathered in the NSSE sessions.
The culture at TU is to “work together” across division lines. Tanaka has presented NSSE results to the Dean’s Council, at the annual campus-wide meeting of faculty and staff, to Student Services staff, and to internal HLC accreditation committees. Presentations focused on using NSSE results as indirect measures to support selected standards in TU’s self-study, a component of the HLC reaccreditation process. NSSE data were used to affirm a number of assertions in the self-study. Professor Tanaka uses NSSE benchmark data for broad comparisons; when analyzing specific areas, she frequently uses disaggregated raw data. For example, TU prides itself on the quality of relationships between students and faculty, and using student responses to selected NSSE items, the vice provost was able to present the Dean’s Council with a revealing look at student-faculty interaction on campus.
Although NSSE results were mainly used for reaccreditation purposes, the Admissions and Student Services offices were very interested in student responses that Tanaka had pulled out for the self-study on survey items 7d and 7g, which ask students if they plan to or have: a) worked with a faculty member on a research project, or b) pursued an independent study or a self-designed major. Admissions and Student Affairs plan to use these data along with scores on diversity items in recruiting materials. These offices have created an advisory board of high school counselors from across the country. TU funds the counselors to come to campus for an annual meeting to gather their expertise on meeting the needs of first-year students. Responsibilities for assessment have moved from the Vice Provost for Academic Affairs to the Director of University Assessment over the past year. University Assessment hopes to collaborate with a faculty member in each college who will serve as a champion for helping their colleagues understand the importance and usefulness of NSSE data for analysis at the college level.
Effectively Communicating NSSE Results to Internal and External Stakeholders
One of Sister Georgia Christensen’s first goals after being named Viterbo University’s new Director of Institutional Research and Assessment (IRA) in 2000 was to attend workshops and presentations to learn how to use institutional data in assessment and to provide feedback to faculty members about their students. At that time, Viterbo participated in a number of national surveys but used its results mostly for marketing-related activities. As she became more informed, Christensen was convinced that NSSE would reveal more useful information for assessment than the current surveys in use. She presented her findings and suggestions to senior administrators who then decided to participate in the NSSE 2006 and 2007 administrations. Since Viterbo also wanted to gather data on the experiences and expectations of its first-year students, as well as data on how faculty perceived students, the institution administered the BCSSE pilot in 2005, and FSSE in 2007.
Sharing NSSE Results
Sr. Christensen attempts to be consistent in her use of NSSE, FSSE, and BCSSE data. She uses NSSE benchmark data for presentations and has shared results with the Board of Trustees, with faculty at an in-service session, and with administrative and staff assemblies. The Board of Trustees places great importance on Viterbo’s performance compared with other institutions and has developed a list of peer schools that Christensen uses for benchmarking. Not all schools on the list participate in NSSE, but she selects those that do as a “selected peers” column to compare with Viterbo’s scores and with the entire NSSE cohort. For faculty presentations, Christensen has focused on survey item results related to active learning. For administrators and staff, she has presented results on items related to an enriching educational environment and stressed their role in creating that environment. She notes, “It’s nice to stand in front of the campus community and say, ‘This is your effect on students.’”
In addition to the groups above, Christensen has worked on analyses of survey data with the vice president of Student Development, the Offices of Communications and Marketing, and Admissions and Enrollment Management. The president of Viterbo has been an active supporter of NSSE as a measure of educational quality as opposed to a ranking system. He published an article in the local paper to explain why rankings on test scores and other external factors are not useful for judging the academic quality of an institution. NSSE results are publicly posted by the IRA office on the Viterbo website, www.viterbo.edu/Assessment.aspx.
Demonstrating Effective Diversity Initiatives
Viterbo University, a private university located in La Crosse, Wisconsin, is committed to Catholic Franciscan values and its mission to provide each student with a quality liberal arts education rooted in the values of human dignity and respect for the world. The institution participates in a NSSE consortium of Catholic schools. Christensen is interested in how being in a Catholic school affects students’ lives. Participating in a consortium helps Christensen understand what things are “special” about Viterbo—she feels religious affiliation creates special conditions.
Although grounded in a Franciscan tradition, Viterbo defines itself as an ecumenical university where diversity is an important core value. All undergraduates are required to take six hours of coursework chosen from the 81 courses in 19 departments that meet the diversity learning component. NSSE results have indicated that Viterbo students, in comparison with their selected peers, scored higher on learning about diverse perspectives through class discussions and written assignments that intentionally incorporate different racial/ethnic, religious, gender-related, and political perspectives.
Intercultural study and exchange experiences enhance the Viterbo curriculum and foster diversity. With the assistance of a Title VI grant in 2006–2007, Viterbo University started a Latin American Studies Program. The Global Education office promotes study abroad programs to students and assists faculty in finding international opportunities for professional development.
NSSE Results Influence Pedagogy
As part of a Title III Program, Viterbo faculty members have increased the use of active learning strategies and technologies to create a learner-centered classroom. Faculty participated in intensive active learning workshops during university in-service and out-service weeks from 2004 to 2008. All had access to a Title III “Coach” who was trained in active learning teaching strategies and who reviewed faculty projects, observed their teaching, and finally evaluated the faculty member’s practice. Faculty submitted progress reports to the Title III Director and Coaches. NSSE results from both 2006 and 2007 reinforce the effectiveness of active learning strategies at Viterbo—students’ responses indicated they learn more when they are intensely involved in their education, asked to think about what they are learning in different settings, and collaborate with faculty and other students on projects.
Using NSSE in Higher Learning Commission-North Central Association Accreditation
Viterbo used survey data throughout its HLC-NCA Comprehensive Self-Study. Two targeted areas where NSSE results established evidence to meet accreditation standards were diversity and active learning strategies (see above). Christensen also used NSSE Institutional Report data and supporting documents, raw data files, and NSSE’s HLC-NCA Accreditation Toolkit as additional resources to support the self-study. Her presentations at the HLC-NCA annual conference in April 2007 and at the 2007 annual meeting of the Association for Institutional Research in the Upper Midwest (AIRUM), on “The Role of the Institutional Researcher in Accreditation,” focused on preparing NSSE data for multiple audiences and using institutional data in the accreditation process. For example, one PowerPoint slide included a chart Christensen had created that mapped Viterbo’s NSSE results to HLC-NCA accreditation standards.
A private, comprehensive liberal arts institution located in Salt Lake City, Utah, Westminster College enrolls approximately 2,000 undergraduates. The college has administered NSSE seven times, FSSE four times, and BCSSE once since 2001, and uses the combined survey results to better understand student engagement on campus and to plan effectively for the future. Paul Presson, Associate Provost for Institutional Research and Assessment, finds that analyses using what he terms the “linked” data from all three surveys to track engagement trends over time are essential to Westminster’s long-term planning. He explains, “We take it seriously. We started a new process of what we call ‘effectiveness retreats’ where we bring in senior staff, deans, board members and we spend half a day looking at NSSE, BCSSE, and FSSE and the senior surveys, alumni surveys, career center surveys—I spend basically the whole fall semester looking at those findings.”
FSSE results have provided a point of reference for understanding students’ engagement at Westminster. Most NSSE and FSSE comparison results have been consistent; in some cases, however, they identified mismatches between student and faculty responses. As an example of how Westminster has integrated its survey data, NSSE results indicated that students felt satisfied with their level of contact with faculty for academic advising but did not feel they were getting enough career advising and general emotional support. The college checked FSSE results to confirm that faculty recognized these types of interactions as important to students, then implemented a new career advising program and created a learning community requirement for first-year students. Westminster will use BCSSE results to further refine and monitor its understanding of student needs for institutional support.
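Spotting the kind of NSSE/FSSE mismatch described above usually comes down to laying the parallel student and faculty items side by side. A minimal sketch follows, assuming hypothetical file layouts and item names rather than the actual survey codebooks.

```python
import pandas as pd

# Hypothetical parallel items: hours students report preparing for class
# (NSSE) vs. hours faculty expect students to prepare (FSSE).
students = pd.read_csv("nsse.csv")["prep_hours"]
faculty = pd.read_csv("fsse.csv")["expected_prep_hours"]

comparison = pd.DataFrame({
    "students (%)": students.value_counts(normalize=True).sort_index() * 100,
    "faculty (%)": faculty.value_counts(normalize=True).sort_index() * 100,
}).round(1)
print(comparison)  # side-by-side response distributions expose perception gaps
```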
Student-Faculty Interaction
Wittenberg University is a private liberal arts college located in Springfield, Ohio, with an undergraduate enrollment of 1,950 full-time students. The institution promotes student engagement through shared leadership and collaboration. President Mark Erickson created the President’s Task Force to study student engagement in the academic and co-curricular environments on campus. Along with the task force, three other committees were formed to focus on the long term institutional goals of education and communication, social context and values, and community standards and compliance.
For the Wittenberg task force, student engagement informed a rubric that targeted efforts on student learning and academic growth. To advance these efforts, the student engagement committee developed action plans based on the Inventory for Student Engagement and Success (ISES) (Kuh, Kinzie, Schuh, & Whitt, 2005), a self-guided framework for conducting a comprehensive, systematic, institution-wide analysis; carried out more in-depth analyses of their NSSE data; and followed up these activities with a climate study. It was hoped that such efforts would provide evidence of whether Wittenberg had increased levels of student engagement. The institution also intends to study engagement trends over time, to compare its NSSE results with those of selected peers, and to consider how other colleges have engaged faculty as key partners in the assessment process.
A challenge Wittenberg faced was encouraging faculty investment in the student engagement concept. Leaders of the student engagement committee carefully chose faculty representatives from across the campus who had a strong commitment to students and to service. As they began to understand that student engagement was rooted in academics, the selected faculty members became more invested in the charge of the committee. Faculty then carried out a particularly useful exercise using several prompts from ISES framework to identify functional areas of the institution that helped to strengthen and promote student success. They talked with students, faculty peers, and administrators about these areas to further promote understanding of the concept of student engagement. These discussions were felt to increase commitment to student engagement among faculty, administrators, and students at Wittenberg.
Integrating NSSE Results with Institutional and Other Survey Data
Youngstown State University (YSU), which celebrated its centennial year in 2008, is a comprehensive public university of 13,500 students recruited primarily from the metropolitan area in which it is located. YSU offers over 100 undergraduate majors, 30 master’s programs, and doctorates in educational leadership and physical therapy. In 2000, YSU introduced a goal-based general education program that includes writing, oral communication, and critical thinking requirements, as well as a senior capstone course. In 2008, YSU was reaccredited by the Higher Learning Commission of the North Central Association.
YSU has used NSSE data for assessment and reaccreditation. YSU has triangulated NSSE data from 2004, 2006, and 2007 with institutional and other national survey data and reported these results as part of YSU’s participation in the Voluntary System of Accountability (VSA) project. The VSA, an initiative of the American Association of State Colleges and Universities (AASCU) and the Association of Public and Land-grant Universities (APLU), provides information on the undergraduate experience through the College Portrait. Specific NSSE items fall into the broad categories of “group learning experiences, active learning experiences, experiences with diverse groups of people and ideas, student interaction with campus and faculty, institutional commitment to student learning and success.”
Results on these items are included on a template designed for Ohio’s College Portrait/VSA project, www.ysu.edu/institutional-research/ysuvsa0809.pdf. Faculty and staff reviewed VSA project data along with information about student learning from electronic portfolios, classroom-embedded assignments, and field tests, as well as data on faculty and first-year students derived from YSU’s participation in Pennsylvania State University’s “Parsing the First Year of College” project, a three-year study funded by the Spencer Foundation in which 35 institutions researched the influences affecting the learning and persistence of new first-year students.
Dr. Sharon Stringer, Director of Assessment and Professor of Psychology at YSU, continues to collaborate with other units on campus to drill down on specific NSSE items that are part of the VSA template. They examine these data in relation to GPA and to success and progress rates to determine whether there are patterns of performance among subpopulations of students (e.g., nontraditional students, diversity subgroups, transfer students). This process will inform future decisions about the selection of assessment tools, such as the Collegiate Learning Assessment (CLA), that provide direct measures. Stringer is using recommendations from Assessment Matters: The Why and How of Cracking Open and Using Assessment Results (Ahren, Ryan, & Massa-McKinley, 2008) as a planning guide for deeper analyses of the data and for pacing assessment tests and surveys over the next four years. YSU has also collected internal survey data on general education over the past ten years and plans to examine these data in relation to NSSE and to direct measures of student learning.
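Linking survey responses to institutional records, as this kind of drill-down requires, typically comes down to a key-based merge. The sketch below is a generic illustration with hypothetical file and column names, not YSU’s actual data systems.

```python
import pandas as pd

nsse = pd.read_csv("nsse_ysu.csv")            # survey responses keyed by ID
records = pd.read_csv("student_records.csv")  # student_id, gpa, retained, subgroup

merged = nsse.merge(records, on="student_id", how="inner")

# Examine a VSA-template item against GPA and retention within subpopulations
# (e.g., nontraditional, transfer, or diversity subgroups).
for subgroup, grp in merged.groupby("subgroup"):
    print(f"{subgroup}: item mean={grp['servlearn'].mean():.2f}, "  # hypothetical item
          f"GPA={grp['gpa'].mean():.2f}, retained={grp['retained'].mean():.0%}")
```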
Standing alone, NSSE supplies only indirect measures of student learning, and the campus community recognizes that NSSE data are insufficient in themselves to justify substantial changes in programs or policies. In preparing its self-study for the Higher Learning Commission (HLC), YSU used NSSE results, in-house questionnaires, and data on retention and diversity. Stringer considered all of these data resources to be very valuable in the design of YSU’s new 2007–2013 Academic Strategic Plan, which emphasizes teaching, learning, and student engagement. The campus is dedicated to helping students integrate their curricular and co-curricular experiences. Future review of NSSE data will be used to enhance YSU’s participation in Campus Compact, a national initiative that promotes community service, civic engagement, and service-learning in higher education. YSU joined Campus Compact in 2008, and Stringer hopes to use NSSE results and other resources to assess the impact of service-learning experiences on students.
YSU has formed an Assessment Council with 14-16 members. The Council was established through the Provost’s Office and includes faculty, staff (including Institutional Research & Policy Analysis, Student Affairs, and representatives from each college), and students. All members of the Council received a copy of the actual NSSE report (including raw data). The report was read by all members and discussed in Council meetings. The General Education Committee also participates in the Assessment Council and considers NSSE results to refine the general education goals. After careful review of the data by the Council, Stringer makes presentations to numerous campus constituents such as the President’s Cabinet, Student Life, Student Government Association, academic advisors, and others. Currently at YSU, the Assessment Council, General Education Committee, and Institutional Research play vital roles in reviewing and interpreting NSSE data. For the future, YSU plans to implement a Council on Teaching and Learning that will include campus-wide representation—including academic affairs, student affairs, and advising staff—to discuss data on student learning.