A key benefit of your NSSE Institutional Report is the ability to customize your comparison groups. Using NSSE’s online Report Form, you can identify the most relevant, appropriate institutions from the available pool of current- and prior-year NSSE participants. We encourage you to tailor your comparison groups. This document reviews options, considerations, and tips to help you make your selections.
Customization Options
To customize your comparison groups, you have four options:
- Identify individual institutions from the list of NSSE participants;
- Select by institutional characteristics;
- Begin by selecting institutional characteristics, then add or remove individual institutions to refine the group; or
- Accept NSSE’s default groups (described below).
- Default Group 1 – For institutions not in an NSSE consortium, this group includes all other prior- and current-year NSSE institutions within your geographic region (New England, Mid East, Great Lakes, Plains, Southeast, Southwest, Rocky Mountains, Far West, or Canada) and sector (public or private). For consortium-participating institutions, this group includes the other consortium members.
- Default Group 2 – All other prior- and current-year NSSE institutions sharing your institution’s Basic Carnegie Classification (RU/VH, RU/H, DRU, Master's-L, Master's-M, Master's-S, Bac/A&S, or Bac/Diverse).
- Default Group 3 – For U.S. institutions, this group includes all other prior- and current-year U.S. NSSE institutions. Canadian universities receive all other prior- and current-year Canadian and U.S. institutions.
- Module Defaults – If you participated in one of NSSE’s topical modules, your default group includes all other prior- and current-year institutions that administered the same module.
Again, rather than simply accepting the default groups, we urge you to create tailored comparison groups.
Select from Current- and Prior-Year Participants
To improve alignment with institutionally designated peer groups, the pool of available comparison institutions for the core survey reports includes all NSSE-participating institutions in the current and prior years. Consortium and module comparison groups also include prior-year participants as long as the questions did not change.
Approaches to Building Comparison Groups
A variety of goals drive comparison group selection. Four common approaches are:
- Peer groups – The most common approach is to identify a group of institutions similar to your own, based on characteristics such as Carnegie classification, enrollment size, type of educational offerings, and other defining criteria.
- Aspirational groups – Institutions may assess themselves relative to colleges and universities they view as exemplars on important dimensions.
- Overlap groups – This comparison is with institutions that draw from the same pools of students, faculty, or resources. For example, a college may be interested in how it compares with institutions that recruit from the same pool of prospective students.
- Pre-existing groups – Institutions may want to be compared with members of a pre-existing group, especially those sharing a common mission or goals. Examples include special missions (e.g., religious affiliation, HBCU), university systems, consortia, athletic conferences, and so on.
Comparison Group Examples
Beyond traditional member groups, we encourage you to think creatively and further customize comparison groups to your institution. Below are a few examples of comparison groups created by NSSE-participating institutions:
- Military Friendly – This comparison group includes schools that are considered Military Friendly and are "Student Veteran Rated" (https://www.militaryfriendly.com/schools/).
- 40-60% Pell – Institutions in this group were identified as peer institutions with respect to the percentage of students who are Pell grant recipients (see the sketch after this list).
- Sweet 16 & Carnegie – Colleges and universities with the same Basic Carnegie Classification (Bac/Diverse: Baccalaureate Colleges—Diverse Fields) in 16 states from which the institution recruits.
- Cross-Application – This group included institutions where students cross-apply for admission.
- Top US News Liberal Arts – This group was composed of institutions named among the Top 50 US News National Liberal Arts Colleges.
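For campuses that keep institution-level data on hand (e.g., an IPEDS extract), a shortlist like the "40-60% Pell" group above can be drafted programmatically before entering final selections in NSSE’s online Report Form. The following is a minimal sketch in Python; the file name peers.csv and the column names inst_name and pell_pct are illustrative assumptions, not NSSE or IPEDS field names.

```python
# Draft a candidate comparison group from an institution-level extract.
# All file and column names here are hypothetical placeholders.
import pandas as pd

peers = pd.read_csv("peers.csv")  # one row per institution

# Mirror the "40-60% Pell" example: institutions whose share of
# Pell grant recipients falls between 40% and 60% (inclusive).
shortlist = peers[peers["pell_pct"].between(40, 60)]

# NSSE requires at least five institutions per comparison group, so
# flag shortlists that fall below that floor before submitting.
if len(shortlist) < 5:
    print("Fewer than five institutions; consider widening the criteria.")
else:
    print(shortlist["inst_name"].sort_values().to_string(index=False))
```

Note that a script like this only produces a working list; the comparison group itself is still built by selecting institutions in the Report Form.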
Things You Can Do
Below are a few steps to take to spark campus conversation and get others involved in comparison group selection:
- Locate a copy of your institution’s current peer groups and aspirational peer groups. (This list may be maintained by the office of the President or Provost, or by the assessment, planning, or institutional research office.)
- Examine your campus strategic plan for institutional initiatives that would be informed by NSSE data, and discuss comparison group selection with the strategic planning committee or those who oversee institutional planning.
- Invite academic deans or department chairs to suggest meaningful comparison groups. (Also mention the Major Field Report.)
If you have any questions or want to share creative ideas with us, please call or email your NSSE Project Services team.
Other Factors to Consider
- Keep it simple – We offer a wide variety of criteria for use in selecting comparison groups. Selecting one or two dimensions such as sector (public or private), size, region, or institutional type is often better than basing group selection on too many criteria. Keeping selection criteria simple may ease comprehension of the group and interpretation of results.
- Comparison group size – We recommend that you consider each comparison group’s size. Groups with fewer institutions may offer more specific criteria for comparability, while larger groups may be more stable, especially across multiple NSSE administrations. Thus, a mix of both small and large groups may be most beneficial.
- Involve stakeholders – You may want to solicit input from various campus stakeholders regarding the selection of comparison groups. Involving administrators, faculty, and others in evaluating peer comparison results will improve the utility and impact of your NSSE reports.
- Comparison group stability – While we encourage you to periodically evaluate your comparison groups, using similar comparison groups over time will be valuable in assessing change. In most cases, using as many of the same comparison group institutions as possible across consecutive NSSE administrations can be most useful. The ability to include prior-year participants offers another way to enhance comparison group stability.
How Institutions Typically Select Comparison Groups
In recent years, the vast majority of NSSE-participating institutions have customized at least one of their comparison groups rather than simply accepting the default groups. Among these customizers, we observed the following patterns:
- Institutions that customized a comparison group most often did so by selecting individual institutions.
- The average size of comparison groups selected individually was 15 institutions. To ensure the confidentiality of each institution’s results, we require that each comparison group contain at least five institutions. There is no upper limit on comparison group size.
- The most common institutional characteristics used to build comparison groups were Carnegie Classification and sector (public/private).
- Where institutional characteristics were used, institutions generally kept the selection criteria simple.
- Several topical module participants customized their module comparison group.
- The most popular default comparison groups were the Basic Carnegie Classification group and the group of all prior- and current-year NSSE institutions. By contrast, very few non-consortium participants accepted the default for the first group (region and sector); most customized it.
- A small number of Campus Project Managers never accessed the Report Form and, as a result, received the default comparison groups.
Summary
Identifying comparison groups can be complicated, but relevant, appropriate comparison groups are a vital component of your NSSE reports. The ability to customize these groups to suit your analytic needs is an important way to ensure the usefulness of your results.