Are students interpreting survey items consistently and in ways that accurately represent the behaviors or perceptions NSSE intends to measure?
Purpose
Cognitive interviewing and focus groups provide information about the processes respondents use to answer survey questions, identify potential problems that might lead to survey response error, and help researchers understand how respondents perceive items. These techniques provide insight into how respondents handle the cognitive tasks posed by a question (i.e., comprehension of the question, information retrieval) (Drennan, 2003). Ideally, cognitive interviews and focus groups would reveal that NSSE respondents with different interests, academic abilities, programs of study, and so on make similar meaning of survey items, allowing researchers to attribute differences in student responses to actual differences in behaviors and perceptions rather than to differences in how students understand the questions.
In-depth cognitive testing has been conducted several times since NSSE's inception in 1999 (Kuh, Kinzie, Cruce, Shoup, & Gonyea, 2006; Ouimet, Bunnage, Carini, Kuh, & Kennedy, 2004). NSSE conducted its most recent series of cognitive tests, which included cognitive interviews and focus groups, as part of the four-year development process that resulted in the updated NSSE instrument that participating institutions first used in 2013. This report summarizes the data, methods, and results from this most recent effort.
Data
Between fall 2010 and spring 2012, NSSE staff conducted cognitive interviews with 136 students at 12 institutions across the Midwest. An additional 79 students at these institutions participated in one of ten focus groups. Data collection occurred prior to the administration of pilot surveys in 2011 and 2012 as well as the standard 2013 administration. Participating institutions varied by enrollment size, public-private status, religious affiliation, and Carnegie classification.
Methods
We used a variety of interview techniques with students (see Appendix A for a sample script). During cognitive interviews, we asked students to answer specific questions from the survey and then to describe how they interpreted each question. We also used concurrent probing: the interviewer asked a question, the student answered, and the interviewer then asked more specific scripted or spontaneous questions designed to elicit further information about the response. Retrospective probing at the end of the interview asked subjects to verbalize their thoughts about questions they had answered earlier. In addition to concurrent and retrospective probing, NSSE staff asked students to "think aloud" while answering some questions; that is, subjects were encouraged to verbalize their thought processes as they answered survey questions. Starting in 2012, cognitive interview subjects completed the survey online while seated next to the interviewer. As with earlier interviews, NSSE staff used probing and think-aloud techniques to ask subjects about their opinions of new or modified survey questions and encouraged them to point out any confusing items. The last round of interviews included distance learners (students completing coursework entirely online), who were interviewed by phone using a protocol similar to the standard face-to-face interviews. We encouraged these online students to express any concerns they had with survey questions given their online college experience.
Focus group participants received paper copies of the survey to read and answered selected questions. Interviewers used concurrent probing to ask follow-up questions about particular survey items, especially items identified as challenging in previous campus cognitive interviews.
Each cognitive interview and focus group had an interviewer and a note taker present. Research teams created analytic memos based on both the notes and digital recordings. These memos contained a summary of each student's response to specific items and the interviewer's impressions of survey item quality. Two independent reviewers then coded survey items as having (a) no problems, (b) minor problems, or (c) significant problems, and a third reviewer resolved any initial coding discrepancies. A summary of the different ways students interpreted questions was also created to inform the larger survey revision process. Finally, the team of researchers who conducted the cognitive testing jointly analyzed findings and thoroughly vetted interpretations and conclusions. These findings then informed final survey item wording decisions throughout the survey development process.
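The two-coder adjudication logic just described is simple enough to express programmatically. The following Python sketch is purely illustrative; the function name, code labels, and example values are hypothetical and do not represent NSSE's actual analysis tooling. It assumes each reviewer assigns one of the three problem codes per survey item.

# Hypothetical sketch of the two-coder adjudication described above;
# not NSSE's actual analysis code.
from typing import Optional

CODES = ("no problems", "minor problems", "significant problems")

def adjudicate(reviewer1: str, reviewer2: str,
               reviewer3: Optional[str] = None) -> str:
    """Return the final problem code for a survey item.

    If the two independent reviewers agree, their shared code stands;
    otherwise the third reviewer's code resolves the discrepancy.
    """
    for code in (reviewer1, reviewer2, reviewer3):
        assert code is None or code in CODES, f"unknown code: {code}"
    if reviewer1 == reviewer2:
        return reviewer1
    if reviewer3 is None:
        raise ValueError("Discrepant codes require a third reviewer.")
    return reviewer3

# Example: the two reviewers disagree on a (hypothetical) item,
# so the third reviewer's judgment decides.
print(adjudicate("minor problems", "significant problems",
                 reviewer3="minor problems"))  # -> minor problems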
The cognitive testing process focused primarily on items that NSSE had never administered before (e.g., quantitative reasoning, effective teaching practices) or older items that had been significantly revised (e.g., higher-order learning, quality of interactions).
Results
Results from the cognitive interviews and focus groups demonstrated that students generally interpreted survey items as NSSE staff intended, largely confirming past evaluations of the survey (Kuh et al., 2006; Ouimet et al., 2004). This was true even for more complex items that required calculating the time, frequency, or number of occurrences of various activities. Students employed cognitive strategies such as thinking about their course syllabi to count the number of papers they had written, or calculating the average number of papers per class and generalizing from that figure to all of their classes. Students were also able to easily recall how often they did common activities like asking questions in class or working with other students on a course project. Cognitive testing revealed, however, that survey error may be introduced for some items in the following ways: (1) language problems – not knowing the meaning of words or phrases; (2) inclusion/exclusion problems – determining whether certain concepts fall within the scope of an item (e.g., whether personal reading done online counts toward reading completed for classes); (3) temporal problems – uncertainty about the time period to which a question applies; (4) logic problems – how students interpret words like "and" and "or" in survey questions; and (5) computational problems – difficult mental arithmetic (Conrad & Blair, 1996; Willis, 2005). Specific exemplar comments, organized by Engagement Indicators (current NSSE scales) and/or item types, are presented below. Survey items are referred to by their variable names for the remainder of this report (see Appendix B for specific item wording).
One student admitted to not completely reading the stems of the questions. Some students had a hard time figuring out how to classify infrequent activities. Many online students mentioned that the survey seemed to be written for traditional college students rather than online students. Despite this perception, online students were still able to answer most of the questions easily.
Students had some difficulty distinguishing among the items in this indicator. HOapply seemed more difficult for students to answer and required some time to think about. A few students were stuck on the word "theories" and did not feel the item pertained to them because their major did not focus on theory. Other students counted only applications of theories to real-world and contemporary problems (e.g., they did not count analyzing historical events through a particular theory).
No significant problems with RIsocietal were identified for on-campus or online students. A few students thought a parenthetical with examples of societal problems or issues would be helpful.
RIdiverse was challenging for some students, who asked for clarification about what the item was asking.
A small number of students expressed uncertainty about LSsummary, unsure whether the question asked if they had actually written out a summary or merely thought about summarizing course material.
Overall, these items were more challenging for students in non-STEM majors. Students seemed to spend more time thinking about these questions but were generally able to answer them. Interviews with online students yielded similar results. For item QRproblem, frequency may be underestimated: some students stated that they would answer "never" because they initially thought only of math classes, yet when prompted to think about other classes, they were able to come up with examples of when they had done this. In general, the parentheticals seemed helpful. Students found QRevaluate slightly confusing, and their interpretations varied more widely. Students were not sure who "others" were (e.g., other students or just researchers). They also were not sure what constituted "evaluating." Some counted any time they read something with numerical information, thought about it, or were tested on it. Others counted only times when they actually challenged assumptions in the research or how the numerical information was calculated.
Students generally understood the questions about writing (wrshort, wrmed, and wrlong). A number of students mentioned that it was easier to accurately report the number of longer papers and harder to remember all of the short papers. Students also differed in what they counted as a "paper" in the "less than 5 pages" category. Some students counted their daily reflections as assignments of less than one page, while other students did not count these as papers at all. Students also were not sure whether to count assignments like lab reports or only formal papers. A couple of students noted that they were not sure whether to count all the papers they would write or only those they had actually written up to that point in time (e.g., whether to include their senior thesis in the paper count). Interviews with online students showed similar results.
A version of tmread was tested that asked students how many pages they read instead of how much time they spent reading. When asked about the number of pages read, students spent a significant amount of time thinking about the question and adding up reading assignments from each class. It was difficult for students to report the number of pages actually read rather than the number of pages assigned. Students also reported the number of pages read in books or printed articles but did not count reading on an e-reader or online. One student admitted that she would skip this question, and a few other students simply picked a response from the scale provided instead of really thinking about the number of pages they read. Online students had similar problems with this question, but under-reporting of online and e-book readings was more pronounced for this population because a larger portion of their reading was not from printed material. Based on these responses, this question was changed for the updated survey to focus on how much time students spend reading, both to make the question easier to answer and to include readings without page numbers (e.g., online readings).
These items were not specifically tested during every round of interviews, but no problems with answering them emerged while students were completing the survey.
Students had significant problems with the original wording of these questions (DDrace, DDeconomic, DDreligion, DDpolitical). With the original wording, several students answered thinking only about having serious conversations about the topic rather than about having serious conversations with people different from themselves. Students seemed most likely to think of talking to someone different from themselves on the race item, so that item was moved up to appear first in the list. A number of students went back and forth between these two perspectives while answering. To further clarify that the question is about interacting with people different from oneself, the items were changed to specifically reference "people" instead of discussion topics (e.g., "People of a different race…").
Other issues appeared with both versions of these items. The questions took longer to think about and answer, and students were confused about, and varied in, what counted as a "serious conversation." Some students counted only heated conversations and arguments as "serious conversations," while others thought anything beyond superficial chit-chat counted. The question wording was changed from "serious conversations" to "discussions" to address this issue.
Students also differed widely in how different was "different," and this seemed to be related to the type of institution they attended. In general, students at larger schools had a higher standard for identifying differences, whereas some students from small, more homogeneous institutions had a lower standard. For example, a number of students from a small religious school with very little diversity still answered that they often interacted with people who had different religious beliefs because, even though everyone at the school was an Evangelical Christian, they all still had slightly different religious beliefs. At other schools, students thought of someone from another religion when answering this question. Students at the same small religious school also counted very small differences in economic and social background as being different (e.g., having a stay-at-home parent or not), whereas students at other schools counted only more obvious differences (e.g., white collar vs. blue collar).
Similar issues appeared with online students, but these students struggled more with knowing whether their classmates were different from them. Because they knew their classmates only from the bios posted at the beginning of a course, their knowledge of one another was limited. They also felt it was more challenging to have serious conversations when they were just talking through posts on a discussion board.
No significant problems were found with SFotherwork. One student answered the question thinking about getting help on coursework, but this appeared to be an error from reading too quickly. Another student was not sure how to account for interactions with his faculty advisor on this question. Online students did not feel that these questions applied to them; they all mentioned that they were not supposed to contact their instructors outside of class or about non-course-related issues.
One online student was confused by SFdiscuss and was not sure what counts as "in class" or "out of class" for online classes. Is talking with classmates in the course chat room, beyond what is required for class, considered discussing course topics outside of class? Or talking with a teacher over email or chat? The student felt these were the same as going to office hours but was not sure whether that was what the question was asking.
Students did not have problems responding to SFperform. When answering this item, most students thought about grades. One student counted checking her grades on Blackboard as discussing academic performance. Some students saw the item as describing something negative, something they would do only if they were concerned about their grades.
A number of students mentioned that it was hard to think of all their classes at once, but they generally understood and were able to answer these questions. Online students had similar responses to other students.
A few students wrote in comments on the pilot that item QIadvisor was difficult to answer because they had multiple advisors. These students seemed to pick a response option that represented experiences with both advisors or to average between the two.
Students generally understood item QIstaff. Most students thought only of the people listed in the parenthetical, though some also thought of dining commons staff or custodial staff, and a couple considered student workers. A couple of students reported on the frequency of their interactions rather than the quality; when prompted to think of quality rather than quantity, they changed their responses. For online students, this question was more difficult. Most online students thought of the people in the parenthetical and selected "not applicable." One student had interacted with career services and answered the question based on those interactions. Two online students were not sure whom we were asking about and answered based on their interactions with their academic advisor or financial aid counselor; these two students thought QIstaff and QIadmin were redundant and did not understand the difference.
Students generally understood item QIadmin, and most answered thinking of the offices listed in the parenthetical. A few students still answered based on the quantity of interactions rather than the quality. Online students thought of who was in the parenthetical and were able to answer the question based on interactions with the financial aid office and the registrar's office.
Students generally understood these questions. A few online students felt that some of these items were less applicable to them. For example, an online student thought that SEsocial was referring to getting together with people but was not sure if they should count connecting with other students on Facebook or through other social media. A few online students mentioned that SEactivites and SEevents were also less applicable to their experiences.
Students at schools where service-learning or community-based projects are often required in classes, such as small, private, and religious institutions, had no problems with servcourse. Students at other schools had a harder time understanding the item and answered with more variation. Some students were not sure what "community-based project" or "service-learning" meant. Some students also counted any community service they did, regardless of whether it was tied to a class. None of the online students had done service-learning, and they felt it would be less common at an online school. One online student did question whether the item meant any sort of community service or only service done as part of a class. Students generally understood the other high-impact practice questions.
Generally speaking, students reported no problems understanding what was intended by the various perceived-gains items, with one exception. On the first and second pilot instruments, one item referred to "acquiring a broad general education." Interview results suggested that students had trouble understanding what exactly a broad general education includes, so the item was eventually dropped. A few students noted that they answered "very little" on gains items because they were already strong in those areas and so did not change much during college. Other students marked "very little" because they felt they grew in those areas, but not because of their institution.
One online student was confused by item attendart because they were not sure whether the question asked only about exhibits or performances at their school or about any exhibit or performance. This student answered "never" because their school did not have its own arts performances.
A few students mentioned that item tmcare was difficult to answer because you cannot put a number of hours on being a parent, “you're always a parent.”
For the item present, some online students were not sure how to count the presentations they had given. These students mentioned that they had turned in PowerPoint slides as a "presentation" but had never actually delivered a presentation.
One student was confused by item tmprep and did not know what to count as preparing for class, wondering whether it meant only preparing right before class or also included doing homework.
Students were generally able to report how many courses they took in item coursenum. A few students were not sure whether to count non-academic courses (e.g., yoga) or independent study classes. A few also were not sure whether "academic term" meant the semester or the academic year, but all decided on the semester. All students, including online students, were easily able to report the number of courses they took online in item onlinenum.
Students did not have difficulty reporting their educational aspirations in item edaspire. One student was not sure whether it meant at the current institution or ever.
A student who used to be an athlete at his school had difficulty with the item athlete. He was not sure whether to answer yes or no because he had been on an athletic team at his institution but was not currently, and the question does not specify whether it means "currently" or "ever been."
References
Conrad, F., & Blair, J. (1996). From impressions to data: Increasing the objectivity of cognitive interviews. In JSM Proceedings, Survey Research Methods Section (pp. 1–9). Alexandria, VA: American Statistical Association. Retrieved from http://www.amstat.org/sections/srms/Proceedings/
Drennan, J. (2003). Cognitive interviewing: Verbal data in the design and pre-testing of questionnaires. Journal of Advanced Nursing, 42(1), 57–63.
Kuh, G. D., Kinzie, J., Cruce, T., Shoup, R., & Gonyea, R. M. (2006). Connecting the dots: Multi-faceted analyses of the relationships between student engagement results from the NSSE, and the institutional practices and conditions that foster student success. Final report prepared for Lumina Foundation for Education. Bloomington, IN: Indiana University, Center for Postsecondary Research.
Ouimet, J. A., Bunnage, J. B., Carini, R. M., Kuh, G. D., & Kennedy, J. (2004). Using focus groups to establish the validity and reliability of a college student survey. Research in Higher Education, 45, 233–250.
Willis, G. B. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage.
Appendix A
Sample Script for Cognitive Interviews, NSSE 2.0 Pilot 2012
Introduction
[Introduce yourself and the note taker, etc. Then read the following:]
Thank you for taking time to help us develop our survey questionnaire. Just to provide you with a little background, this survey is given to first-year and senior students to better understand their undergraduate experience. We are currently in the midst of revising portions of the survey instrument and want to be sure that these particular items make sense to students.
[Give the consent form to the student and explain the study.] Please take a moment to look over the informed consent information I have given you. You can take it with you. Do you have any questions about the study? [Pause for questions]
Continuing with the study means that you have received this informed consent information and had the opportunity to review it, and understand that your participation is voluntary and you can stop at any time.
The interview will take approximately 45 minutes to complete.
Please begin taking the survey and I will interrupt you to request that you do the “think aloud” or to ask you about your response. To “think aloud” is to verbalize your thought processes as to how you went about figuring out what the question is asking and deciding upon your response. To be sure you understand what I mean, we’d like to demonstrate.
Interviewer: I am going to ask you to answer the following question: "How many soft drinks, if any, did you drink last week?" To "think aloud" about this question, please verbalize your thought process as you try to understand what the question is asking and how you decide on your answer. [If necessary: I know it might feel awkward at first, but just try to talk about what's going on inside your mind as you go about answering the question.]
Respondent: Well …
For the majority of survey items we will ask about the meaning of the survey item and ask follow-up questions related to each item.
Do you have any questions at this time? [Have the respondent sit at the computer with the questionnaire]
SCRIPT A
Question 2. In your experience at your institution during the current school year, about how often have you done each of the following? (Never, Sometimes, Often, Very Often)

2a. Connected your learning to societal problems or issues
PROMPT: How did you arrive at this answer? What is an example of connecting your learning to societal problems or issues?

Question 3. In your experience at your institution during the current school year, about how often have you done each of the following? (Never, Sometimes, Often, Very Often)

3c. Discussed ideas for a course project or paper with a faculty member
PROMPT: How did you arrive at this answer? What is an example of when you discussed ideas with a faculty member?

3e. Asked a faculty member for guidance on your academic program or plans
PROMPT: How did you arrive at this answer? What is an example of when you asked a faculty member for guidance?

3f. Discussed your academic performance with a faculty member
PROMPT: How did you arrive at this answer? What is an example of when you discussed your academic performance with a faculty member?

Think Aloud
Question 4. During the current school year, in about how many of your courses have your instructors done the following? (None, Some, Most, All)

4d. Taught in ways that encouraged your active participation
4e. Created an atmosphere to promote your learning
4f. Got to know you as an individual
4g. Provided feedback on a draft or work in progress
PROMPT: Was it difficult to answer any part of this question? Was it difficult to think about how many of your courses the instructor had done these things?
General Concluding Questions
Are there any questions that you found difficult to answer?
Did you find any response sets hard to use or confusing?
Was there anything that you expected us to ask you about that’s not on the survey?
What should we add to or change on the survey?
Is there anything that we failed to ask you about on this survey – anything that you see as very important to your learning and your ability to stay in school?
Thank you for participating in this discussion. Your responses are very helpful to us. [Give students gift cards. Sign acknowledgement forms.]
Sample Script for Focus Groups, NSSE 2.0 Pilot 2012
Questions for Focus Groups
[As a guide to what questions to ask, review items that seem to be problematic or generate different responses from the cognitive interviews.]
Let’s look at 2c. What does it mean to you to have “explained course material to one or more students”?
Let’s look at 2d. What does it mean to you to have “prepared for exams by discussing or working through course material with other students”?
Let’s look at 2e. What does it mean to you to have “received feedback from other students about course assignments before turning them in”?
Do you think of study group experiences when answering these items?
Appendix B
Complete Wording for Variables Referenced in Report Text
HOapply – Applying facts, theories, or methods to practical problems or new situations
RIsocietal – Connected your learning to societal problems or issues
RIdiverse – Included diverse perspectives (political, religious, racial/ethnic, gender, etc.) in course discussions or assignments
LSsummary – Summarized what you learned in class or from course materials
QRproblem – Used numerical information to examine a real-world problem or issue (unemployment, climate change, disease prevention, etc.)
QRevaluate – Evaluated what others have concluded from numerical information
wrshort – How many times have you written a paper, report, or other assignment that was of the following length: Up to 5 pages
wrmed – How many times have you written a paper, report, or other assignment that was of the following length: Between 6 and 10 pages
wrlong – How many times have you written a paper, report, or other assignment that was of the following length: 11 pages or more
tmread v1 – In a typical week this year, about how many total pages have you read for all of your courses?
tmread v2 – Of the time you spend preparing for class in a typical 7-day week, about how many hours are on assigned reading?
DDrace v1 – How often have you had serious conversations with people who differ from you in the following ways: Race, ethnic background, or country of origin
DDrace v2 – How often have you had serious conversations with people from the following groups: People of a race or ethnicity other than your own
DDeconomic v1 – How often have you had serious conversations with people who differ from you in the following ways: Economic and social background
DDeconomic v2 – How often have you had serious conversations with people from the following groups: People from an economic background other than your own
DDreligion v1 – How often have you had serious conversations with people who differ from you in the following ways: Religious beliefs or philosophy of life
DDreligion v2 – How often have you had serious conversations with people from the following groups: People with religious beliefs other than your own
DDpolitical v1 – How often have you had serious conversations with people who differ from you in the following ways: Political views
DDpolitical v2 – How often have you had serious conversations with people from the following groups: People with political views other than your own
SFotherwork – Worked with a faculty member on activities other than coursework (committees, student groups, etc.)
SFdiscuss – Discussed course topics, ideas, or concepts with a faculty member outside of class
SFperform – Discussed your academic performance with a faculty member
QIadvisor – Indicate the quality of your interactions with the following people at your institution: Academic advisors
QIstaff – Indicate the quality of your interactions with the following people at your institution: Student services staff (campus activities, housing, career services, etc.)
QIadmin – Indicate the quality of your interactions with the following people at your institution: Other administrative staff and offices (registrar, financial aid, etc.)
SEsocial – Providing opportunities to be involved socially
SEactivites – Attending campus events and activities (special speakers, cultural performances, athletic events, etc.)
SEevents – Attending events that address important social, economic, or political issues
servcourse – Participated in a community-based project as part of a regular course (i.e., service-learning)
attendart – Attended an art exhibit, play or other arts performance (dance, music, etc.)
tmcare – Providing care for dependents (children, parents, etc.)
present – Gave a course presentation
tmprep – Preparing for class (studying, reading, writing, doing homework or lab work, analyzing data, rehearsing, and other academic activities)
coursenum – How many courses are you taking this current academic term?
onlinenum – Of these, how many are entirely online?
edaspire – What is the highest level of education you expect to complete?
athlete – Are you a student-athlete on a team sponsored by your institution's athletics department?