National Survey of Student Engagement
2005 Annual Report


Construction of the 2005 NSSE Benchmarks

The item groups that make up the benchmarks were created with a blend of theory and empirical analysis. Initially, we conducted principal components analyses with oblique rotations; theory was then used to refine the resulting item groupings. As in the past, only randomly sampled cases were included in the calculation of institutional benchmarks.

The process for calculating benchmark scores was revised in 2004 to make the scores easier to understand and to allow institutions to calculate their own scores and run intra-institutional comparisons.

The construction of the 2005 NSSE Benchmarks has four steps. First, all items that contribute to a benchmark are converted to a 0-100 point scale. For the enriching items (question 7 on the survey), students who indicated that they had already "done" the activity receive a score of 100, while students who "plan to do," "do not plan to do," or "have not decided" to do the activity receive a 0. Other items are converted proportionally. For example, items with four response options (never, sometimes, often, very often) are recoded with values of 0, 33.33, 66.67, or 100.
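The first step can be sketched as follows. This is an illustrative Python sketch only; the raw response codes (1-4, with 4 meaning "done" or "very often") are assumptions for illustration, not NSSE's actual data coding.

```python
# Step 1 sketch: rescale raw survey responses to a 0-100 metric.
# Raw codes here are hypothetical: 1 = never .. 4 = very often for
# frequency items, and 4 = "done" for the enriching items.

def rescale_frequency(response):
    """Map a four-option frequency item (1..4) to 0, 33.33, 66.67, or 100."""
    return round((response - 1) * 100 / 3, 2)

def rescale_enriching(response):
    """Enriching items: only 'done' (coded 4 here) scores 100; all else 0."""
    return 100 if response == 4 else 0
```

Items with other numbers of response options would be spread evenly across the 0-100 range in the same way.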

Second, part-time students' scores are adjusted on four Level of Academic Challenge items (READASGN, WRITEMID, WRITESML, ACADPR01). For each item, a ratio is calculated by dividing the national average for full-time students by the national average for part-time students. Each part-time student's score on an item is multiplied by the corresponding ratio to get the adjusted score. Adjusted scores are capped so as not to exceed 100.
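The part-time adjustment amounts to a simple rescaling, sketched below in Python. The national means passed in are placeholders, not actual NSSE statistics.

```python
# Step 2 sketch: adjust a part-time student's item score by the ratio of
# national full-time to part-time means, capping the result at 100.

def adjust_part_time(score, ft_national_mean, pt_national_mean):
    """Return the part-time student's adjusted score on one item."""
    ratio = ft_national_mean / pt_national_mean
    return min(score * ratio, 100)
```

For example, if full-time students nationally average 1.2 times the part-time average on an item, a part-time score of 50 becomes 60, while a score of 90 is capped at 100.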

Third, student-level scale scores are created for each group of items by taking the mean of the student's item scores. A mean is calculated only for students who answered at least three-fifths of the items in a given group.
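The three-fifths rule can be sketched as below, representing an unanswered item as None; this is an illustration, not NSSE's production code.

```python
# Step 3 sketch: student-level scale score for one benchmark group.
# item_scores holds the student's rescaled 0-100 scores; None = not answered.

def scale_score(item_scores):
    """Mean of answered items, or None if fewer than three-fifths answered."""
    answered = [s for s in item_scores if s is not None]
    if len(answered) < 3 * len(item_scores) / 5:
        return None
    return sum(answered) / len(answered)
```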

Finally, institutional benchmarks are created by calculating weighted averages of the student-level scale scores for each class (first-year students and seniors).
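The final step is a standard weighted mean, sketched below; the scores and weights shown in the usage note are invented for illustration.

```python
# Step 4 sketch: institutional benchmark as a weighted average of
# student-level scale scores for one class (first-year or senior).

def institutional_benchmark(scale_scores, weights):
    """Weighted mean of students' scale scores using per-student weights."""
    total = sum(s * w for s, w in zip(scale_scores, weights))
    return total / sum(weights)
```

For instance, two students with scale scores of 50 and 100 and weights of 1 and 3 would yield a benchmark of 87.5.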

Using the base random sample from the 2005 NSSE survey administration, we examined the internal consistency of each NSSE benchmark using Cronbach's alpha. The results are shown in the table below:

Internal Consistency of NSSE Benchmarks (Cronbach's Alpha)
NSSE Benchmark                       First Year   Senior   First Year/Senior
Academic Challenge                   0.74         0.76     0.75
Active and Collaborative Learning    0.64         0.65     0.67
Student-Faculty Interaction          0.72         0.75     0.75
Enriching Educational Experiences    0.54         0.64     0.66
Supportive Campus Environment        0.78         0.78     0.77
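For reference, Cronbach's alpha can be computed from the item-score variances and the variance of the total score; the Python sketch below uses the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals). The example data in the test are invented, not NSSE responses.

```python
# Cronbach's alpha sketch: items is a list of k columns, one per item,
# each holding the same number of student scores (no missing data assumed).

def cronbach_alpha(items):
    """Internal-consistency estimate for a set of item-score columns."""
    k = len(items)
    n = len(items[0])

    def pvar(xs):
        # Population variance of a list of numbers.
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(pvar(col) for col in items) / pvar(totals))
```

Two perfectly correlated items yield an alpha of 1.0; as items covary less, alpha falls toward 0.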

Click here for more information about the weights.

Click here to download the SPSS syntax used to construct the benchmarks.

Click here to view student responses to each benchmark.


© 2005 Indiana University Center for Postsecondary Research
