Publications
Delivering on the Promise of High-Impact Practices: Research and Models for Achieving Equity, Fidelity, Impact, and Scale
Zilvinskis, J., Kinzie, J., Daday, J., O'Donnell, K., & Vande Zande, C.
Stylus, 2022.
Research shows that enriching learning experiences such as learning communities, service-learning, undergraduate research, internships, and senior culminating experiences, collectively known as High-Impact Practices (HIPs), are positively associated with student engagement; deep and integrated learning; and personal and educational gains for all students, particularly for historically underserved students, including first-generation students and racially minoritized populations. The purpose of Delivering on the Promise of High-Impact Practices is to provide examples from around the country of the ways educators are advancing equity, promoting fidelity, achieving scale, and strengthening assessment of their own local high-impact practices. Its chapters bring together the best current scholarship, methodologies, and evidence-based practices within the HIPs field, illustrating new approaches to faculty professional development, culture and coalition building, research and assessment, and continuous improvement that help institutions understand and extend practices with a demonstrated high impact.
Centering Black women faculty: Magnifying powerful voices
Priddie, C., Palmer, D., Silberstein, S., & BrckaLorenz, A.
To Improve the Academy, 41(2), 96-127, 2022.
While much of the quantitative research on Black women faculty has taken a comparative approach to understanding their experiences, this study provides a counternarrative, centering their experiences as faculty. This large-scale, multi-institution glance at Black women faculty helps to give us an overview of these women across the country, looking at who they are, where they are, how they spend their time, and what they value in undergraduate education. This study allows us to strengthen various arguments made in qualitative studies of Black women faculty and amplify their perspectives and experiences. Furthermore, it reaffirms and reinvigorates the need for educational developers to practice intentional assessment of Black women faculty's teaching, support the current teaching efforts of Black women faculty on their campus, and advocate for policy change centering the work of Black women faculty.
Delivering on the Promise of High-Impact Practices: A New Resource for Assessment
Zilvinskis, J., Kinzie, J., Daday, J., O'Donnell, K., & Vande Zande, C.
Assessment Update, 34(4), 1-2, 16, 2022.
Elevating Student Voice in Assessment: Approaches to Using NSSE's Student Comments
Kinzie, J., Silberstein, S., & Palmer, D.
Assessment Update, 33(2), 2021.
Assessing intersectional experiences: Where to begin?
BrckaLorenz, A., & Kirnbauer, T.
Assessment in Practice, 2021.
How to reorient assessment and accreditation in the time of COVID-19 disruption
Kinzie, J.
Assessment Update, 32(4), 4-5, 2020.
Among the many issues facing higher education during COVID-19 is uncertainty about the status of student learning outcomes assessment and accreditation. Will necessary shifts in course assignments and assessments affect completion, particularly for those scheduled to graduate this year? Will a suspension (or slowdown) of program-level assessment put the institution out of compliance with state regulations or accreditation requirements? If accreditation visits are postponed, will the institution find its federal funding in jeopardy? All of these concerns are understandable, and it is good to have them aired and discussed. However, the disruptions caused by COVID-19 may also provide an occasion for some useful rethinking of assessment. What those disruptions underscore is that decisions about assessment and accreditation must, above all, be sensitive to current realities and do what is best for students and faculty. Rather than aiming for compliance, or sticking with the plan to "just give students the exam and asterisk the results," now is the time to prioritize what people need, embrace compassion-driven assessment, and reassess the fundamental goals of assessment. To help think about the issues at hand, I offer some practical suggestions for course- and program-level assessment and accreditation demands. Then I suggest we take advantage of this moment to make some meaningful improvements to assessment and accreditation.
Information Literacy's Influence on Undergraduates' Learning and Development: Results from a Large Multi-institutional Study
Fosnacht, K.
College & Research Libraries, 81(2), 272-287, 2020.
This paper investigated the reliability and validity of the National Survey of Student Engagement's Experiences with Information Literacy module, an assessment instrument developed in collaboration with a group of instructional librarians. After identifying three information literacy-related factors in the module, it assessed the relationship between the factors and students' engagement in Higher-Order Learning and Reflective and Integrative Learning activities and students' perceived gains. The results from these analyses indicated that information literacy activities were positively and significantly correlated with student engagement and students' perceived gains.
NSSE's Quest for Quality: Seven Lessons in Designing a Better Survey
Gonyea, R. M., & Sarraf, S.
Assessment Update, 32(2), 6-14, 2020.
In 20 years NSSE has learned much about how to manage a successful survey administration, and how to continuously improve the questionnaire, data collection tools, and reporting. This article includes seven lessons that may help assessment professionals improve their own surveys.
The National Survey of Student Engagement (NSSE) at Twenty
Ewell, P., & McCormick, A. C.
Assessment Update, 32(2), 1-16, 2020.
Twenty Years of NSSE Data Use: Assessment Lessons for the Collective Good
Kinzie, J., & Franklin, K.
Assessment Update, 32(2), 4-15, 2020.
Examining the Meaning of Vague Quantifiers in Higher Education: How Often is "Often"?
Rocconi, L.M., Dumford, A.D., & Butler, B.
Research in Higher Education, 61, 229-247, 2020.
Researchers, assessment professionals, and faculty in higher education increasingly depend on survey data from students to make pivotal curricular and programmatic decisions. The surveys collecting these data often require students to judge frequency (e.g., how often), quantity (e.g., how much), or intensity (e.g., how strongly). The response options given for these questions are usually vague and include responses such as "never," "sometimes," and "often." However, the meaning that respondents give to these vague responses may vary. This study aims to determine the efficacy of using vague quantifiers in survey research. More specifically, the purpose of this study is to explore the meaning that respondents ascribe to vague response options and whether or not those meanings vary by student characteristics. Results from this study indicate a high degree of correspondence between vague and numeric response and suggest that students seem to adapt the meaning of "sometimes," "often," and "very often" based on the appropriate reference for the question. Overall, findings provide evidence of the utility and appropriateness of using vague response options. Some differences by student characteristics and the implications of these differences are discussed.
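The vague-to-numeric correspondence the authors report can be illustrated with a small sketch (all data hypothetical, not drawn from the study): code the vague options numerically, pair each with a self-reported numeric frequency, and check how strongly the two agree.

```python
# Hypothetical coding of vague response options; each tuple pairs a
# student's vague answer with a self-reported numeric frequency
# (e.g., times per week) for the same activity.
vague_scale = {"never": 0, "sometimes": 1, "often": 2, "very often": 3}
responses = [("never", 0), ("sometimes", 1), ("sometimes", 2),
             ("often", 3), ("often", 4), ("very often", 6), ("very often", 8)]

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

codes = [vague_scale[v] for v, _ in responses]
counts = [c for _, c in responses]
r = pearson(codes, counts)  # high r = close vague/numeric correspondence
```

In this toy data r comes out roughly 0.95, mirroring the high correspondence the paper reports, though the actual analysis is considerably more elaborate.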
Behavior-based student typology: A view from student transition from high school to college
Mu, L., & Cole, J.
Research in Higher Education, 2019.
Several recent studies have successfully identified college student typologies based on individuals' behaviors. One limitation of past studies has been their reliance on one-time cross-sectional assessments. As a result, we are left to ponder the stability of students' behavioral types as their academic years move forward. This study used longitudinal student data from high school to college to investigate the stability of a behavior-based student typology. Guided by findings on behavioral consistency from personality psychology, this study explored the associations between an institution's structural and supportive environmental elements and the transition of students' behavior-based types. The results showed that, in both high school and higher education settings, students' behaviors in a variety of activities classified them into four types. In the higher education setting, about half of the students remained the same behavioral type, while the remaining students shifted from their behavior-based types in high school. Students' background characteristics and institutional environment were associated with these shifts.
Better Together: How Faculty Development and Assessment Can Join Forces to Improve Student Learning
Kinzie, J., Landy, K., Sorcinelli, M. & Hutchings, P.
Change: The Magazine of Higher Learning, 51(5), 46-54, 2019.
Contextualizing effect sizes in the National Survey of Student Engagement: An empirical analysis
Rocconi, L. M., & Gonyea, R. M.
Research and Practice in Assessment, 13(Summer/Fall), 22-38, 2018.
The concept of effect size plays a crucial role in assessment, institutional research, and scholarly inquiry, where it is common with large sample sizes to find small relationships that are statistically significant. This study examines the distribution of effect sizes from institutions that participated in the National Survey of Student Engagement (NSSE) and empirically derives recommendations for their interpretation. The aim is to provide guidelines for researchers, policymakers, and assessment professionals to judge the importance of an effect from student engagement results. The authors argue for the adoption of the recommendations for interpreting effect sizes from statistical comparisons of NSSE data.
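As an illustration of the effect-size idea the article builds on, here is a minimal sketch (hypothetical scores, not NSSE results) of Cohen's d, the standardized mean difference commonly reported alongside significance tests:

```python
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d: standardized mean difference using a pooled SD."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2
                  + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

# Hypothetical engagement scores for two groups of students
group_a = [38, 41, 40, 44, 39, 42, 40, 43]
group_b = [37, 40, 39, 42, 38, 41, 40, 41]
d = cohens_d(group_a, group_b)
```

With large samples, a d near zero can still be statistically significant, which is why empirically grounded interpretation guidelines like those derived in this study matter more than significance alone.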
The dependability of the updated NSSE: A generalizability study
Fosnacht, K., & Gonyea, R. M.
Research and Practice in Assessment, 13(Summer/Fall), 62–74, 2018.
This study utilized generalizability theory to assess the conditions under which the National Survey of Student Engagement's (NSSE) summary measures, the Engagement Indicators, produce dependable group-level means. The dependability of NSSE group means is an important topic for the higher education assessment community given the survey's wide use in institutional assessment and accreditation. We found that the Engagement Indicators produced dependable group means for an institution from samples as small as 25 to 50 students. Furthermore, we discuss how the assessment community should use NSSE data.
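The generalizability-theory logic behind "dependable group means" can be sketched as follows; the variance components here are hypothetical stand-ins, not the paper's estimates:

```python
def group_mean_dependability(var_between, var_within, n_students):
    """Generalizability (dependability) coefficient for a group mean:
    the share of observed between-group variance attributable to true
    group differences when each group mean is based on n students."""
    return var_between / (var_between + var_within / n_students)

# Hypothetical variance components for an engagement measure
var_between, var_within = 4.0, 90.0
coefs = [group_mean_dependability(var_between, var_within, n)
         for n in (10, 25, 50, 100)]
# Dependability rises with sample size, which is why modest samples
# can already yield usable group-level means.
```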
Refining an Approach to Assessment for Learning Improvement
Stitt-Bergh, M., Kinzie, J., & Fulcher, K.
Research and Practice in Assessment, 2018.
Information literacy's influence on undergraduates' learning and development: Results from a large multi-institutional study
Fosnacht, K.
In D. M. Mueller (Ed.), At the helm: Leading transformation: The proceedings of the ACRL 2017 conference, March 22–25, 2017, Baltimore, Maryland. Chicago, IL: Association of College and Research Libraries, 2017.
This paper investigated the reliability and validity of the National Survey of Student Engagement's Experiences with Information Literacy Topical Module, an assessment instrument developed in collaboration with a group of instructional librarians. After identifying three information literacy-related constructs in the module, it assessed the relationship between the constructs and students' engagement in Higher-Order Learning and Reflective and Integrative Learning activities and students' perceived gains. The results from these analyses indicated that information literacy activities are positively and significantly correlated with student engagement and students' perceived gains.
How important are high response rates for college surveys?
Fosnacht, K., Sarraf, S., Howe, E., & Peck, L. K.
The Review of Higher Education, 40(2), 245–265, 2017.
Surveys play an important role in understanding the higher education landscape. However, declining survey participation rates threaten this source of vital information and its perceived utility. Although survey researchers have long assumed that the best way to obtain unbiased estimates is to achieve a high response rate, many survey researchers have begun to question the widely held assumption that low response rates provide biased results. Due to the prevalence of survey data in higher education research and assessment efforts, it is imperative to better understand the relationship between response rates and data quality. This study investigates this assumption with college student assessment data. It utilizes data from hundreds of samples of first-year and senior students with relatively high response rates using a common assessment instrument with a standardized administration protocol. It investigates how population estimates would have changed if researchers put forth less effort when collecting data and achieved lower response rates and respondent counts.
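The core question, whether chasing a higher response rate changes the population estimate, can be mimicked with a toy simulation (all numbers hypothetical): late respondents are made slightly less engaged, and we compare the estimate with and without the extra recruitment effort.

```python
import random

random.seed(1)

# Hypothetical: early respondents answer by the first reminder; late
# respondents answer only after additional recruitment effort and are
# simulated as slightly less engaged on a 60-point scale.
early = [random.gauss(42, 6) for _ in range(700)]
late = [random.gauss(40, 6) for _ in range(300)]

full_sample = early + late
high_rr_mean = sum(full_sample) / len(full_sample)  # all respondents
low_rr_mean = sum(early) / len(early)               # early respondents only
bias = low_rr_mean - high_rr_mean                   # shift from stopping early
```

In this setup the lower-effort estimate shifts only modestly, echoing the study's finding that lower response rates often move estimates less than feared, though real nonresponse mechanisms can be less benign than this simulation assumes.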
Indirect assessments in higher education
Nelson Laird, T. F. & BrckaLorenz, A.
In T. Cumming & M. D. Miller (Eds.), Enhancing Assessment in Higher Education: Putting Psychometrics to Work. Stylus Publishing, LLC, 2017.
Assessment and accountability are now inescapable features of the landscape of higher education, and ensuring that these assessments are psychometrically sound has become a high priority for accrediting agencies and therefore also for higher education institutions. Bringing together the higher education assessment literature with the psychometric literature, this book focuses on how to practice sound assessment.
This volume provides comprehensive and detailed descriptions of tools for and approaches to assessing student learning outcomes in higher education. The book is guided by the core purpose of assessment, which is to provide faculty, administrators, and student affairs professionals with the information they need to increase student learning by making changes in policies, curricula, and other programs.
The book is divided into three sections: overview, assessment in higher education, and case studies. The central section looks at direct and indirect measures of student learning, and how to assure the validity, reliability, and fairness of both types. The first six chapters (the first two sections) alternate between chapters written by experts in assessment in higher education and chapters written by experts in psychometrics. The remaining three chapters are applications of assessment practices in three higher education institutions. Finally, the book includes a glossary of key terms in the field.
The use of student engagement findings as a case of evidence-based practice
Kinzie, J.
New Directions for Higher Education, 2017(178), 47–56, 2017.
The chapter takes student engagement as a case for discussing the use of assessment evidence to advance evidence-based practice and to illustrate a scholarship of practice.
A lottery improves performance on a low-stakes test for males but not females
Cole, J. S., Bergin, D. A., & Summers, J.
Assessment in Education: Principles, Policy, and Practice, 1–16, 2016.
The purpose of this study was to address the effectiveness of autonomy support and a lottery-based reward in enhancing test performance and test-taking motivation on a low-stakes test. Two hundred and forty-six university students were randomly assigned to three groups (lottery, autonomy support, and control) and took a mathematics test. Students in the autonomy support and lottery groups reported putting forth more test-taking effort than students in the control group. Males who were offered a chance at winning the financial reward scored almost 10 points higher on the exam compared to females. Results showed no significant difference in test scores for female students among groups, suggesting that neither intervention had an impact on females.
Assessment in student affairs (2nd ed.)
Schuh, J. H., Biddix, P., Dean, L. A., & Kinzie, J.
San Francisco, CA: Jossey-Bass, 2016.
A practical, comprehensive manual for assessment design and implementation, Assessment in Student Affairs (Second Edition) offers a contemporary look at the foundational elements and practical application of assessment in student affairs. Higher education administration is increasingly called upon to demonstrate organizational effectiveness and engage in continuous improvement based on information generated through systematic inquiry. This book provides a thorough primer on all stages of the assessment process. From planning to reporting and beyond, you'll find valuable assessment strategies to help you produce meaningful information and improve your program. Combining and updating the thoroughness and practicality of Assessment in Student Affairs and Assessment Practice in Student Affairs, this new edition covers design of assessment projects, ethical practice, student learning outcomes, data collection and analysis methods, report writing, and strategies to implement change based on assessment results. Case studies demonstrate real-world application to help you clearly see how these ideas are used effectively every day, and end-of-chapter discussion questions stimulate deeper investigation and further thinking about the ideas discussed. The instructor resources will help you seamlessly integrate this new resource into existing graduate-level courses.
Student affairs administrators understand the importance of assessment, but many can benefit from additional direction when it comes to designing and implementing evaluations that produce truly useful information. This book provides field-tested approaches to assessment, giving you a comprehensive how-to manual for demonstrating (and improving) the work you do every day.
- Build your own assessment to demonstrate organizational effectiveness.
- Utilize quantitative and qualitative techniques and data.
- Identify metrics and methods for measuring student learning.
- Report and implement assessment findings effectively.
Accountability and effectiveness are the hallmarks of higher education administration today, and they are becoming the metrics by which programs and services are evaluated. Strong assessment skills have never been more important. Assessment in Student Affairs gives you the knowledge base and skill set you need to shine a spotlight on what you and your organization are able to achieve.
Reflections on the state of student engagement data use and strategies for action
Kinzie, J., Cogswell, C. A., & Wheatle, K. I. E.
Assessment Update, 27(2), 1–2, 14–15, 2015.
Although the National Survey of Student Engagement (NSSE) collects responses from hundreds of participating colleges and universities every year, its ultimate goal is not to collect data but to catalyze improvement in undergraduate education. Launched in 2000 by the Pew Charitable Trusts in response to growing national and local pressures for higher education to focus on measures of education quality and for colleges and universities to engage in meaningful improvement, the NSSE has become a leader in a campaign to focus attention on a number of relatively clear characteristics of effective environments for teaching and learning. The NSSE's process indicators related to good practices in undergraduate education provide diagnostic information about concrete activities that can guide interventions to promote improvement. By 2014, more than 1,500 institutions had participated in the NSSE, and over 4.5 million students had completed the questionnaire. In addition, the launch of two complementary instruments, the Faculty Survey of Student Engagement (FSSE) and the Beginning College Survey of Student Engagement (BCSSE), has furthered efforts to encourage the use of data for improvement by equipping institutions with information about faculty perceptions and entering students' expectations for engagement. Given these noble goals and all the student engagement data, what impact has the NSSE had on the use of data for improvement on campuses? And what lessons does this work suggest for the improvement agenda in higher education?
Are those rose-colored glasses you are wearing? Student and alumni survey responses
Dumford, A. D., & Miller, A. L.
Research and Practice in Assessment, 10, 5-14, 2015.
Lessons from the field—Volume 3: Using data to catalyze change on campus
National Survey of Student Engagement
Bloomington, IN: Center for Postsecondary Research, Indiana University School of Education, 2015.
Volume 3 of Lessons from the Field builds on insights from the earlier volumes illustrating the benefits of using NSSE results. Specifically, the highlighted institutional examples predominantly feature the use of NSSE's updated measures and redesigned reports introduced with the survey's 14th administration in 2013. After more than three years of collaborative analysis, evidence-based item refinement, pilot testing, and student interviews, NSSE was revised to incorporate content enhancements and customization options that sustain the survey's relevance and value to participating institutions. The 25 institutional accounts featured in this volume illustrate how institutions are using results from the updated NSSE in assessment and improvement activities and in a variety of efforts to address important campus needs and priorities. Indeed, enlisting campus constituencies in the use of assessment results is essential during a time of heightened demands for accountability and pressures to increase student persistence and completion, support diversity, and ensure high-quality learning for all students. Even more, improvement efforts at colleges and universities are more likely to succeed when they emerge from a shared understanding of the evidence and of the priorities for action.
Using National Survey of Student Engagement data and methods to assess teaching in first-year composition and writing across the curriculum
Paine, C., Anson, C., Gonyea, R. M., & Anderson, P.
In A. E. Dayton (Ed.), Assessing the teaching of writing: Twenty-first century trends and technologies. Boulder, CO: University Press of Colorado, 2015.
In this chapter, we describe the origins, aims, and general structure of the NSSE (student engagement) and the Consortium for the Study of Writing in College (CSWC) (writing instruction) surveys. We describe how the CSWC was developed and offered and provide a brief overview of major findings of the national study. We describe how WPAs can adopt and adapt both the CSWC questions and the general approach to local needs. Finally, we describe some best practices (what to do and what to avoid) for using this approach, and we provide a few ideas for sharing results and making improvements.
Do good assessment practices measure up to the principles of assessment?
Kinzie, J., Jankowski, N., & Provezis, S.
Assessment Update, 26(3), 1–2, 14–16, 2014.
The American Association for Higher Education's Nine Principles of Good Practice for Assessing Student Learning (AAHE 1992) have appeared to stand the test of time, as evidenced by the fact that they are often referred to within the pages of Assessment Update and appear on various assessment websites and in texts (see Banta, Jones, and Black 2009). In fact, Hutchings, Ewell, and Banta (n.d.) reviewed the principles in 2010, declaring that they had "aged nicely." Looking back to 1992, the principles were conceived as a way to codify the responsible and effective conduct of assessment, advance assessment for educational improvement, and assist campuses to develop approaches that make a difference for students and their learning. The principles serve as a foundation for assessment practice. Guidelines for assessment continue to be promulgated, such as the New Leadership Alliance's Committing to Quality: Guidelines for Assessment and Accountability in Higher Education (2012), which intended to help institutions evaluate their assessment practices and to establish shared commitments among sectors of higher education; and the Principles for Effective Assessment of Student Achievement (Western Association for Schools and Colleges 2013), endorsed in July 2013 by six higher education associations and all regional accreditors, which succinctly expressed the value of assessment. The newer statements share tenets of the AAHE principles and also reflect specific organizational commitments. Similar to the AAHE Assessment Forum, the National Institute for Learning Outcomes Assessment (NILOA) has sought to move the needle on assessment efforts by surveying the landscape of assessment in higher education and by assisting institutions and others in discovering and adopting promising practices in the assessment of undergraduate student learning outcomes.
Toward these ends, this article considers the most widely cited guidelines for effective assessment, namely the AAHE Principles, against the backdrop of NILOA's collection of accounts of good assessment practice. Simply put, how well do institutions' assessment activities align with stated principles for effectiveness?
One size does not fit all: Traditional and innovative models of student affairs practice
Manning, K. M., Kinzie, J., & Schuh, J. H.
New York, NY: Routledge, 2014.
In the day-to-day work of higher education administration, student affairs professionals know that different institutional types, whether a small liberal arts college, a doctoral intensive institution, or a large private university, require different practical approaches. Despite this, most student affairs literature emphasizes a "one size fits all" approach to practice, giving little attention to the differing models of student affairs practice and their diversity across institutions. In the second edition of this influential book, leading scholars Kathleen Manning, Jillian Kinzie, and John H. Schuh advocate an original approach by presenting 11 models of student affairs practice, including both traditional and innovative programs. Based on a qualitative, multi-institutional research project, One Size Does Not Fit All explores a variety of policies, practices, and programs that contribute to increased student engagement, success, and learning.
New to this revised edition:
Refinement of models in light of recent NSSE data and current developments in higher education, including budget cuts and the economic crisis;
updated information throughout about model assessment and techniques to renew divisions of student affairs;
a deeper analysis of how models of student affairs practice relate to institutional mission and purposes;
end-of-chapter discussion questions to guide thinking about ways to incorporate models in one's own context;
an entirely new Part IV, including chapters on "Catalysts and Tools for Change" and "Redesigning Your Student Affairs Division."
Refocusing the quality discourse: The United States National Survey of Student Engagement
McCormick, A. C., & Kinzie, J.
In H. B. Coates & A. C. McCormick (Eds.), Engaging university students: International insights from system-wide studies. Singapore: Springer, 2014.
This chapter reports on work conducted with nearly 1,500 bachelor's degree-granting colleges and universities in the USA to assess the extent to which their undergraduates are exposed to and participate in empirically proven effective educational activities. The chapter begins with a discussion of the prevailing quality discourse in the USA. It then explores the conceptual and empirical foundations of student engagement and the origins of NSSE as both a response to the quality problem and as a diagnostic tool to facilitate improvement. The chapter also discusses tensions between internal improvement and external accountability efforts, and NSSE's role in the assessment and accountability movements. It concludes with a discussion of challenges that confront the project going forward.
Assessing learning spaces: Purpose, possibilities, approaches
Kinzie, J.
In J. L. Narum (Ed.), A guide: Planning for assessing 21st century spaces for 21st century learners. Washington, DC: Learning Spaces Collaboratory, 2013.
The deep interest in knowing what would improve the quality of learning is driving assessment into every nook and cranny of colleges and universities. Colleges and universities are more accountable for educational effectiveness and for the performance of their students and graduates. Thus, concern about improving educational quality, coupled with the need for individual campuses to demonstrate learning outcomes, has made assessment an unavoidable activity on campuses since the 1980s. Renewed efforts to enhance quality and increase persistence and success for all students, particularly under-represented minorities, have made it essential to collect evidence on a regular basis of the extent to which effectiveness has been achieved, evidence intended to mobilize attention to improving educational conditions in light of the findings. Assessment has always been a critical component in teaching and learning. Educators regularly assess at the individual student level, evaluating student work and giving grades, and some aggregate this information to guide improvement efforts at the level of an individual course. Assessment also moves beyond the course when faculty consider strengths and weaknesses of students' work in relation to departmental learning goals. The department can then use these findings and other data, such as a graduating senior survey, to inform decisions about curriculum and pedagogy, and perhaps to prepare for a specialized accreditation review or an institutional review. The demand for information from assessment has broadened its definition and purpose, now embracing the collection and analysis of student learning outcomes and other institutional outcomes, including cost-effectiveness, satisfaction, and the achievement of standards, all to determine the impact of educational programs, practices, and policies.
Good information in the right hands can be a vitally important lever for change. When done well, assessment can provide a foundation for wise planning, budgeting, improvements to the curriculum, pedagogy, staffing, programming, and ensuring that resources are dedicated to what is most effective.
NSSE benchmarks and institutional outcomes: A note on the importance of considering the intended uses of a measure in validity studies
Pike, G.
Research in Higher Education, 54(2), 149–170, 2013.
Surveys play a prominent role in assessment and institutional research, and the NSSE College Student Report is one of the most popular surveys of enrolled undergraduates. Recent studies have raised questions about the validity of the NSSE survey. Although these studies have themselves been criticized, documenting the validity of an instrument requires an affirmative finding regarding the adequacy and appropriateness of score interpretation and use. Using national data from NSSE 2008, the present study found that the NSSE benchmarks provided dependable means for 50 or more students and were significantly related to important institutional outcomes such as retention and graduation rates.
Linking the assessment of student engagement to student success
Gonyea, R. M., BrckaLorenz, A., & Ribera, T.
In G. McLaughlin, R. Howard, J. McLaughlin, & W. E. Knight (Eds.), Building bridges for student success: A sourcebook for colleges and universities. Norman, OK: Consortium for Student Retention Data Exchange, 2013.
In this chapter, we explore the conceptual foundations and measurement of student engagement and share studies that link student engagement to student success. We also provide examples from several institutions that use student engagement measures in formative assessment to improve their students' learning experiences. Finally, we look at the researcher's role in effectively using engagement data to create a culture of evidence that documents student success.
Refreshing engagement: NSSE at 13
McCormick, A. C., Gonyea, R. M., & Kinzie, J.
Change: The Magazine of Higher Learning, 45(3), 6–15, 2013.
Thirteen years ago, 276 bachelor's-granting colleges and universities inaugurated a new approach to assessing college quality by participating in the first national administration of the National Survey of Student Engagement (NSSE). The timing was right. Policymakers were growing increasingly impatient with an ongoing yet unsustainable pattern of cost escalation, skepticism was building about how much students were learning in college, and regional accreditors were ratcheting up their demands on colleges and universities to adopt assessment for purposes of improvement.
Meanwhile, higher education's leaders were frustrated by the crude metrics dominating the discourse about college quality. It's been said that a dean at one of those early-adopting institutions enthusiastically proclaimed: "Finally, a test I actually want to teach to!" NSSE introduced a simple yet effective reframing of the quality question: ask undergraduates about their educationally purposeful experiences. It incorporated several important design principles: emphasize behaviors that prior research found to be positively related to desired learning outcomes; emphasize actionable information, that is, behaviors and experiences that institutions can influence; standardize survey sampling and administration to ensure comparability between institutions; and provide participating institutions with comprehensive reports detailing their own students' responses relative to those at comparison institutions, plus an identified student data file to permit further analysis by the institution. NSSE was administered to first-year students and seniors, opening a window on quality at these "bookends" of the undergraduate experience. In addition to reporting item-by-item results, the project created summary measures in the form of five "Benchmarks of Effective Educational Practice" that focused attention on key dimensions of quality in undergraduate education: level of academic challenge, active and collaborative learning, student-faculty interaction, enriching educational experiences, and supportive campus environment. The new survey caught on fast. Annual participation now numbers 600–700 institutions, for a cumulative total of more than 1,500 colleges and universities in the US and Canada. What started as a bold experiment in changing the discourse about quality and improvement in undergraduate education, and in providing metrics to inform that discourse, is now a trusted fixture in higher education's assessment landscape.
High rates of repeat participation offer compelling testimony of the project's value. Of the first group of 276, 93 percent administered the survey in NSSE's tenth year or later. The Web-based survey is now offered as a census of first-year students and seniors, permitting disaggregated analyses by academic unit or demographic subgroup. In 2013, some 1.6 million undergraduates were invited to complete the survey, providing both valuable information for more than 620 participating campuses and a comprehensive look at student engagement across a wide variety of institutions. The 2013 administration marks the first major update of the survey since its inception. In the following pages, we summarize what we've learned over NSSE's first 13 years, why we're updating the survey, and new insights and diagnostic possibilities represented by these changes. Although NSSE's companion surveys, the Faculty Survey of Student Engagement (FSSE) and the Beginning College Survey of Student Engagement (BCSSE), are incorporating parallel changes, here we focus on the changes to NSSE.
Full version
The Degree Qualifications Profile: What it is and why we need it now
Jankowski, N., Hutchings, P., Ewell, P. T., Kinzie, J., & Kuh, G. D.
Change: The Magazine of Higher Learning, 45(6), 6–14, 2013.
There is no shortage of challenges facing postsecondary institutions in the US. One that cuts to the core of the enterprise is whether they are preparing their graduates to live productive, civically responsible lives in a dynamic global marketplace mapped onto diverse, yet increasingly interdependent, social and cultural systems. Much of the evidence presented in recent Change articles suggests that what undergraduate students know and are able to do falls well short of what employers, policymakers, and educational leaders say is needed.
Whether one accepts the available evidence as sufficient to draw such a sweeping conclusion ultimately rests on resolving two non-trivial issues. First, key stakeholders (those mentioned above and others, including students) must agree on the constellation of knowledge, skills, competencies, and dispositions that need to be signaled by postsecondary degrees and credentials if they are to be attuned to the demands of the times.
Various individuals and groups representing business and education have issued sets of preferred outcomes. Perhaps best known are the Essential Learning Outcomes promulgated by the Association of American Colleges and Universities' (AAC&U) Liberal Education and America's Promise (LEAP) campaign. But while there is considerable overlap in the attributes that various groups deem desirable, there is less agreement as to the expected level of proficiency associated with a given credential or degree (associate's, baccalaureate, and post-baccalaureate). Second, what evidence do we have that students have achieved the desired levels of proficiency in the respective outcome areas? Countries throughout the world (in Europe, Australasia, and Central and South America) have made considerable progress in addressing these challenges by developing degree qualifications frameworks that articulate what outcomes graduates of their colleges and universities should have, along with behaviorally anchored indicators and other measures that mark the extent to which students have acquired them. A handful of institutions in the US, such as Alverno College, the military academies, and Western Governors University, have done something akin to this. But it is only recently that concerted efforts have been mounted to bring greater clarity and more widespread agreement about what credentials and degrees should represent by more precisely defining what college students in this country need to know and be able to do, and at what level of proficiency.
This paper is about the status and aspirations of one such effort, Lumina Foundation's Degree Qualifications Profile (DQP). Over the past 18 months, the staff of the National Institute for Learning Outcomes Assessment (NILOA) and their colleagues at Public Agenda and Lumina Foundation have collected information from faculty and staff at scores of colleges and universities across the country, as well as from participants at various conferences and convenings, about how the DQP is being used. The DQP authors will take this information into account as they prepare a revised iteration of the document, which should be available in 2014. Drawing from our work as members of the NILOA team and on the perspective of Peter Ewell as one of the DQP authors, what follows is a brief overview of the DQP's defining features, a summary of general trends in its use, brief descriptions of several projects, and an analysis of the DQP's implications for assessment. We conclude with some comments about the promise of the DQP for both individual institutions and for higher education writ large.
An engagement-based student typology and its relationship to college outcomes
Hu, S., & McCormick, A. C.
Research in Higher Education, 53, 738–754, 2012.
Using data from the 2006 cohort of the Wabash National Study of Liberal Arts Education, we developed a student typology based on student responses to survey items on the National Survey of Student Engagement. We then examined the utility of this typology in understanding direct-assessment learning outcomes, self-reported gains, grade-point average, and persistence from the first to second year of college. Results from linear and logistic regression models indicated there were relationships between student types and the various outcomes, and that an engagement-based student typology could help deepen our understanding of the college student experience and college outcomes.
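To make the idea of an engagement-based typology concrete, here is a minimal sketch of deriving student types by clustering engagement scores. The data, feature names, and use of k-means are invented for illustration; Hu and McCormick's actual method and data differ.

```python
# Illustrative sketch (synthetic data): deriving an engagement-based
# student typology by clustering survey responses. The features, data,
# and clustering choice are hypothetical, not the authors' method.
import random

def kmeans(points, k, iters=50, seed=0):
    """Basic k-means over tuples of floats (one feature per dimension)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each student to the nearest center (squared distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute each center as the mean of its cluster.
        centers = [tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical (collaborative_learning, faculty_interaction) scores, 0-1.
students = [(0.9, 0.8), (0.85, 0.9), (0.2, 0.1), (0.15, 0.2),
            (0.8, 0.2), (0.9, 0.1), (0.1, 0.9), (0.2, 0.85)]
centers, clusters = kmeans(students, k=4)
print(len(centers), "student types")
```

Once types are assigned, each student's cluster label could serve as a categorical predictor of outcomes such as GPA or persistence in a regression model, as the study describes.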
Examining the relationship between student learning and persistence
Hu, S., McCormick, A. C., & Gonyea, R. M.
Innovative Higher Education, 37, 387–395, 2012.
Investigating social desirability bias in student self-report surveys
Miller, A. L.
Educational Research Quarterly, 36(1), 30-47, 2012.
The frequent use of student self-report surveys in higher education calls into question the possibility of social desirability having an unwanted influence on responses. This research explores the potential presence of social desirability bias with the National Survey of Student Engagement (NSSE), a widely used assessment of student behaviors. Correlations between a short social desirability scale and NSSE benchmarks, subscales, and selected items suggest that the majority of scores have no significant relationship with a measure of social desirability. A series of regression models controlling for demographic variables produce similar results. Effect sizes and estimates of explained variance are also discussed.
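The core check described above reduces to correlating a social desirability scale with survey scores. A minimal sketch with hypothetical data (the scale, scores, and values below are invented, not NSSE's):

```python
# Illustrative sketch (hypothetical data): does a short social desirability
# scale correlate with an engagement score? The scale and scores below are
# invented for illustration, not Miller's actual instruments.
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical per-student scores: summed 5-item social desirability scale
# and a 0-60 engagement benchmark score.
social_desirability = [3, 1, 4, 2, 5, 3, 2, 4, 1, 3]
engagement = [42, 38, 45, 51, 40, 47, 36, 44, 49, 41]

r = pearson_r(social_desirability, engagement)
print(f"Pearson r = {r:.3f}")
# A near-zero, nonsignificant r would suggest responses are not being
# inflated by a desire to answer in socially approved ways.
```

In practice the study also re-estimates these relationships in regression models controlling for demographics, which this sketch omits.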
Lessons from the field—Volume 2: Moving from data to action
National Survey of Student Engagement
Bloomington, IN: Center for Postsecondary Research, Indiana University School of Education, 2012.
In this publication we highlight approaches different types of institutions have taken to improve the undergraduate experience. Because NSSE focuses on student behavior and effective educational practice, colleges and universities have found many productive ways to use survey results: accreditation self-studies, benchmarking, curricular reform, faculty and staff development, grant writing, institutional research, retention, and state system comparisons.
The stories about data use illustrate various ways that assessment can be a worthwhile undertaking when meaningful data are generated and discussed with a wide campus audience, and results are used to inform efforts to improve educational effectiveness. Understanding how colleges and universities use results and achieve improvements in undergraduate education is important to advancing systemic improvement in higher education. The examples in this volume provide ample inspiration for encouraging institutions to move from collecting data to taking action.
Full version
Clearing the air about the use of self-reported gains in institutional research
Gonyea, R. M., & Miller, A.
New Directions for Institutional Research, 2011(150), 99–111, 2011.
Correlations between self-reported learning gains and direct, longitudinal measures that ostensibly correspond in content area are generally inadequate. This chapter clarifies that self-reported measures of learning are more properly used and interpreted as evidence of students' perceived learning and affective outcomes. In this context, the authors supply evidence that social desirability bias in such self-assessments does not constitute a significant concern. Recommendations for use of self-reported gains in research and institutional assessment are discussed.
It’s about time: What to make of reported declines in how much college students study
McCormick, A. C.
Liberal Education, 97(1), 30–39, 2011.
A substantial body of research affirms the commonsense notion that involvement in academic work and quality of effort pay off: the more students engage in educationally purposeful activities, the more they learn. An important element is how much time students invest in studying. Yet while time is important, it is increasingly clear that how students spend their study time also matters. Spending many hours memorizing facts in order to perform well on an exam may earn a good grade, but it is not likely to result in long-term retention or the ability to apply what was learned in novel situations. A recent longitudinal analysis of student performance on the open-ended performance task of the Collegiate Learning Assessment, administered to the same students at the beginning of the first year and at the end of the sophomore year, found that hours spent studying alone corresponded to improved performance, but hours spent studying with peers did not. While we should not ignore the importance of how study time is used, this article focuses on the simple question of how much full-time college students study, whether study time has declined, and if so, what may account for the decline.
Full version
Assessment for advancement
McCormick, A. C.
CASE Currents, 36(3), 11–12, 2010.
Perspectives from campus leaders on the current state of student learning outcomes assessment.
Kinzie, J.
Assessment Update, 22(5), 1–2, 14–15, 2010.
The assessment of student learning outcomes is of keen interest to the federal government, accrediting bodies, education associations, and policymakers. Colleges and universities have been under increasing pressure to demonstrate accountability for student learning and be more transparent about dimensions of educational quality. Although institutions are responding to these demands, it is not altogether clear where learning outcomes assessment ranks in importance on institutions' action agendas, or the extent to which colleges and universities are using assessment results to make real improvements in the quality of student learning. The National Institute for Learning Outcomes Assessment (NILOA) is a multiyear effort to understand and further the student learning outcomes agenda nationally. One of NILOA's primary activities is tracking the journey of higher education institutions responding to the challenge of outcomes assessment. To this end, NILOA conducts surveys, focus groups, and case studies to learn more about what colleges and universities are doing to assess student learning and how they are using the results. This paper highlights lessons from four focus group sessions with campus leaders (presidents, provosts, academic deans, and directors of institutional research from a variety of two- and four-year institutions) regarding their perspectives on the state of learning assessment practices on their campuses. The perceptions are considered in relation to findings from the 2009 NILOA survey report, More Than You Think, Less Than We Need: Learning Outcomes Assessment in Higher Education. The perspectives of campus leaders provide first-hand accounts of a range of student learning outcomes activities on campus and help contextualize results from the 2009 NILOA survey.
Focus group findings illustrate the extent to which assessment has taken hold on campus, explicate the role of accreditation and the responsibility of faculty in student learning outcomes assessment, and showcase how assessment has been furthered on campuses. The institutional examples of innovative assessment practices, particularly those that involve faculty in meaningful ways and lead to institutional improvements, and the promising ways that assessment has been woven into administrative structures and processes, are instructive for advancing understanding of what is happening on the ground at colleges and universities. The paper concludes by articulating questions and challenges raised by campus leaders including reservations about identifying and using assessment measures, issues of transparency and communicating results, and concerns about financing assessment. As the demand for greater emphasis on student learning outcomes assessment intensifies, it is important to document both the successes and challenges associated with campus efforts to respond. Campus leaders provide an important perspective on what is most likely to help assessment efforts grow and deepen in institutions.
Student engagement and a culture of assessment
Kinzie, J.
In G. Kramer & R. Swing (Eds.), Higher education assessments: Leadership matters. Lanham, MD: Rowman & Littlefield, 2010.
The author asserts that many campuses have made significant advances in assessment practices that drive improvements in student learning and success. This chapter introduces a framework for assessment based on student engagement and success, then highlights nine characteristics of campus assessment activities associated with improvements to student learning. For convenience, the characteristics are grouped according to their focus on (1) strong leadership, (2) inclusive involvement, or (3) outcomes-based program function. Most institutions still find it hard to use evidence for systematic improvement, and few report having well-developed assessment plans to sustain a culture of assessment. Fortunately, effective and inspirational models have been developed by the many institutions that have adopted a student engagement framework for assessing and improving the conditions for student success, leading to improvements in learning. In this chapter, the lessons learned from 20 educationally effective institutions are reexamined in light of information from additional institutions that have advanced a framework for student engagement and success.
Effectively involving faculty in the assessment of student engagement
Nelson Laird, T. F., Smallwood, R., Niskodé-Dossett, A. S., & Garver, A. K.
New Directions for Institutional Research, 2009(141), 71–81, 2009.
Lessons from the field—Volume 1: Using NSSE to assess and improve undergraduate education
National Survey of Student Engagement
Bloomington, IN: Center for Postsecondary Research, Indiana University School of Education, 2009.
Assessment is a worthwhile undertaking when meaningful data are generated, evidence-based improvement initiatives are thoroughly considered and discussed, and results are ultimately used to improve educational effectiveness. NSSE results are oriented toward such practical use. Each year, more campuses use their NSSE results in innovative ways to improve the undergraduate experience. In this publication we highlight the approaches different types of institutions have taken to move from data to action.
Full version
The use of engagement data in accreditation, planning, and assessment
Banta, T. W., Pike, G. R., & Hansen, M. J.
New Directions for Institutional Research, 2009(141, Special Issue), 21–34, 2009.
Using NSSE in institutional research
Gonyea, R. M., & Kuh, G. D. (Eds.)
New Directions for Institutional Research, 2009(141, Special Issue), 2009.
Full version
Collecting survey data for assessment: A practice brief based on BEAMS project outcomes
Kinzie, J.
Washington, DC: Institute for Higher Education Policy, 2008.
Full version
Increasing student participation in NSSE: Two success stories
Kinzie, J.
Assessment Update, 18(2), 4–6, 2006.
Full version
Nonresponse bias in student assessment surveys: A comparison of respondents and non-respondents of the National Survey of Student Engagement at an independent comprehensive Catholic university
McInnis, E. D.
2006.
Student success in college: Why it matters and what institutions can do about it
Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J.
2006.
Full version
Promoting student success: Creating conditions so every student can learn
Chickering, A. W., & Kuh, G. D.
Bloomington, IN: Indiana University Center for Postsecondary Research, 2005.
Accommodating diverse learning styles of students has long been espoused as a principle of good practice in undergraduate education. Much progress has been made during the past two decades in using active, collaborative, and problem-based learning, learning communities, student-faculty research, service learning, internships, and other pedagogical innovations to enrich student learning. Variable time blocks are more common--from three hours, to all day, to weekends, to six or eight weeks--to fit the desired outcomes, content, and processes. Peers tutor other students, deepening their own learning in the process. Increasingly sophisticated communication and information technologies provide students access to a broad range of print and visual resources and to an expanded range of human expertise. A wider range of assessment tools document what and how well students are learning. Despite all this activity, at too many schools these and other effective educational practices are underutilized. The suggestions offered here are drawn in large part from a study of 20 diverse four-year colleges and universities that have higher-than-predicted graduation rates and, through the National Survey of Student Engagement, demonstrated that they have effective practices for fostering success among students of differing abilities and aspirations. These institutions clearly communicate that they value high quality undergraduate teaching and learning. They have developed instructional approaches tailored to a wide range of student learning styles, ensuring that students engage with course content and interact in meaningful ways with faculty and peers, inside and outside the classroom.
Full version
Putting student engagement results to use: Lessons from the field
Kuh, G. D
Assessment Update, 17(1), 12–13, 2005.
Full version
Principles for assessing student engagement in the first year of college
Hayek, J. C., & Kuh, G. D.
Assessment Update, 16(2), 11–13, 2004.
Full version
The contributions of the research university to assessment and innovation in undergraduate education
Kuh, G. D.
In W. E. Becker & M. L. Andrews (Eds.), The scholarship of teaching and learning in higher education: The contributions of research universities. Bloomington, IN: Indiana University Press, 2004.
A longitudinal assessment of college student engagement in good practices in undergraduate education
Kojaltic, M., & Kuh, G. D.
Higher Education, 42, 351–371, 2001.
Faculty-student affairs collaboration on assessment: Lessons from the field
Kuh, G. D., & Banta, T. W.
About Campus, 4(6), 4-11, 2000.
Another look at the fourth edition of the CSEQ
Kuh, G. D.
Assessment Update, 11(2), 13, 16, 1999.
The effects of entering characteristics and instructional experiences on student satisfaction and degree completion: An application of the input-environment-outcome assessment model
House, J.
International Journal of Instructional Media, 26(4), 423–434, 1999.
Strengthening assessment for academic quality improvement
Ewell, P. T.
In M. W. Peterson, D. D. Dill, L. A. Mets, & Associates, Planning and management for a changing environment. San Francisco, CA: Jossey-Bass, 1997.
Assessment in practice: Putting principles to work on college campuses
Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (Eds.).
San Francisco, CA: Jossey-Bass, 1996.
Collaboration between general education and the major
Saint Joseph College
Assessment Update, 8(2), 1996.
Editor's notes: Bob Pace tells us what students do while in college
Banta, T. W.
Assessment Update, 8(1), 3, 13, 1996.
How standards-based high school assessment can affect admission to colleges and universities
Griffith, F. A.
Progress, Trends, and Practices in Higher Education, 8(1), 1–15, 1996.
Making a difference: Outcomes of a decade of assessment in higher education
Banta, T. W., & Associates
San Francisco, CA: Jossey-Bass, 1993.
Assessing the undergraduate experience
Pace, C. R.
Assessment Update, 2(3), 1, 2, 4, & 5, 1990.
Assessment measures
Pike, G. R.
Assessment Update, 2(1), 8-9, 1990.
Outcomes, assessment, and academic improvement: In search of usable knowledge
Ewell, P. T.
In J. Smart (Ed.), Higher education: Handbook of theory and research. New York, NY: Agathon, 1988.
Historical perspectives on student outcomes assessment with implications for the future
Pace, C. R.
NASPA Journal, 22(2), 10-18, 1984.
Successful student outcomes assessment: Six institutional case studies including the role of student affairs
Beeler, K. J., Benedict, L. & Hyman, R.
(Available from the National Association of Student Personnel Administrators, 1875 Connecticut Avenue NW, Suite 418, Washington, DC 20009).
The use of student engagement findings as a case of evidence-based practice
Kinzie, J.
New Directions for Higher Education, 2017(178), 47–56, 2017.
The chapter considers student engagement to discuss the use of assessment evidence to advance evidence-based practice and to illustrate a scholarship of practice.
Full version
Worth The Squeeze: What Learning Improvement Is and Why It Matters
Kern, J. A., Kinzie, J., & Fulcher, K. H.
Full version
Scholarly Papers
Bringing their perspective to campus: Students’ experiences with inclusive courses and diverse environments
Kinzie, J., & BrckaLorenz, A.
Association for the Study of Higher Education Annual Conference, Tampa, FL, 2018, November.
How much do students experience courses that emphasize sharing their own perspectives or respecting diverse ideas? This study uses data from a multi-institution survey to explore student experiences with inclusive courses and perceptions of institutional commitment to diversity, and discusses a dozen campus responses to institutional assessment results.
Full version
Typology of students: A view from student transition from high school to college
Mu, L., & Cole, J.
American Educational Research Association Annual Meeting, San Antonio, TX, 2017, May.
Several recent studies have identified distinct college student types. One limitation of past studies has been their reliance on one-time, cross-sectional assessments. As a result, we are left to ponder the stability or consistency of student behaviors as the academic year progresses. This study uses longitudinal data on student engagement to investigate the stability of a student engagement typology. Guided by behavioral consistency theory, it explores the supportive elements of educational settings in order to find those under which students' behavior-based types are more likely to change. Results showed four general student types based on engagement in a variety of activities. In higher education settings, most students maintained a consistent pattern of behaviors, while a small portion shifted from their high school engagement types. Students' background characteristics and institutional environment were associated with these shifts.
Full version
An alternative approach: Using survey panels to inform assessment
Sarraf, S., Fernandez, S., Houlemarde, M., & Wang, X.
Association for Institutional Research Annual Forum, Denver, CO, 2015, May.
As an experiment, eight mini-surveys based on selected items from the National Survey of Student Engagement (NSSE) were administered to 500 college students over an eight-week period. NSSE staff recruited participants from five diverse colleges and universities in order to investigate this alternative survey panel approach to see what impact it would have on various data quality indicators. Results indicate a dramatic increase in student participation rates and less missing data from those who responded.
Full version
Contextualizing student engagement effect sizes: An empirical analysis
Rocconi, L., & Gonyea, R. M.
Association for Institutional Research Annual Forum, Denver, CO, 2015, May.
The concept of effect size, a measure of the strength of association between two variables, plays a crucial role in assessment, institutional research, and scholarly inquiry, where it is common with large sample sizes to find small or even trivial relationships or differences that are statistically significant. Using the distributions of effect sizes from the results of 984 institutions that participated in the National Survey of Student Engagement (NSSE) in 2013 and 2014, we empirically derived new recommendations for the interpretation of effect sizes, grounded within the context of the survey. We argue for the adoption of new values for interpreting small, medium, and large effect sizes from statistical comparisons of NSSE Engagement Indicators, High-Impact Practices, and student engagement data more generally.
Full version
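The approach of grounding effect-size benchmarks in an observed distribution, rather than conventional rules of thumb, can be sketched briefly. The data, percentile choices, and resulting cutoffs below are hypothetical illustrations, not the thresholds the authors recommend.

```python
# Sketch (synthetic data): deriving empirical effect-size benchmarks from a
# distribution of institutional comparisons. All values here are invented;
# they are not NSSE results or the authors' recommended cutoffs.
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * statistics.variance(group_a)
                  + (nb - 1) * statistics.variance(group_b)) / (na + nb - 2)
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_var ** 0.5

# One institution-vs-comparison-group test on an engagement indicator.
d = cohens_d([48, 52, 50, 47, 55], [45, 49, 44, 50, 46])

# Hypothetical |d| values pooled across many such institutional tests.
observed = [0.02, 0.05, 0.08, 0.10, 0.12, 0.15, 0.18, 0.22, 0.27, 0.35]

# Empirical benchmarks: percentiles of the observed distribution stand in
# for the conventional 0.2 / 0.5 / 0.8 rules of thumb.
cuts = statistics.quantiles(observed, n=20)
small, medium, large = cuts[9], cuts[14], cuts[17]  # 50th, 75th, 90th
print(f"d = {d:.2f}; benchmarks: {small:.2f}, {medium:.2f}, {large:.2f}")
```

An institution's observed d is then judged against benchmarks typical of comparisons on the same instrument, which is the study's central argument.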
Survey lottery incentives and institutional response rates: An exploratory analysis
Sarraf, S., & Cole, J.S.
Association for Institutional Research Annual Forum, Orlando, FL, 2014, May.
Many institutional and educational researchers are well aware that response rates for assessment surveys have been declining over the past few decades (Dey, 1997; Laguilles, Williams, & Saunders, 2011). As a result, many researchers have noted that our ability to adequately assess student academic experiences, satisfaction, engagement, use of campus resources, and other important topics in higher education is at risk (Pike, 2008). Consequently, incentives are one tool that many institutional researchers have come to rely on to boost or hold steady their response rates for various campus student surveys. Though research regarding the efficacy of incentives to boost survey response rates in higher education is scant, the research that does exist suggests that incentives are an effective way to boost institutional response rates (Heerwegh, 2006; Laguilles, Williams, & Saunders, 2011). The purpose of this study is to investigate the efficacy of lottery incentives (the most frequently used incentive approach) to boost response rates for institutions using the National Survey of Student Engagement (NSSE).
Full version
Taking surveys with smartphones: A look at usage among college students
Sarraf, S., Brooks, J., & Cole, J.
American Association for Public Opinion Research Annual Conference, Anaheim, CA, 2014, May.
The widespread adoption of mobile technologies has dramatically impacted the landscape for survey researchers (Buskirk & Andrus, 2012), and those focusing on college student populations are no exception. The National Survey of Student Engagement (NSSE), one of the largest U.S. college survey assessment projects, annually surveys hundreds of thousands of undergraduate students at college and university campuses throughout the United States and Canada. Internal NSSE analyses show the number of smartphone respondents is increasing each year: in 2011, only about 4% of NSSE respondents used a smartphone, but by 2013 that figure had increased to 13%. Preliminary results from the 2014 administration suggest the percentage continues to increase, with roughly 18% of respondents using smartphones to complete the survey. Using 2013 NSSE data, the purpose of this study is to examine college student demographics and engagement results by smartphone respondent status. The results of this study will provide insights into the prevalence of college-aged survey respondents using smartphones and the impact this technology has on survey responses.
Full version
Faculty emphasis on diversity conversations and conversations with diverse others
BrckaLorenz, A., Nelson Laird, T., & Shaw, M.
AAC&U Modeling Equity, Engaging Difference Conference, Baltimore, MD, 2012, October.
Using data from the Faculty Survey of Student Engagement (FSSE), this study examines how
often faculty structure class sessions around diverse topics and how often faculty report students having serious conversations with diverse others in their courses. Findings suggest that faculty most often structure course sessions around economic and social inequalities and report students having the most conversations with people of differing economic or social backgrounds. Faculty members' gender and race matter in predicting these measures of diversity in the classroom, but disciplinary area was the strongest predictor. Implications for assessment and institutional research are discussed.
Full version
Digging deeper into institutional data: Enhancing campus assessment findings with the FSSE report builder
Cole, E. R., Nelson Laird, T. F., & Shaw, M. D.
Association for Institutional Research Annual Forum, New Orleans, LA, 2012, June.
Full version
Faculty emphasis on diversity topics and conversations with diverse others
Nelson Laird, T. F., Shaw, M. D., Cole, E. R., BrckaLorenz, A., & Cervera, Y.
Association for Institutional Research Annual Forum, New Orleans, LA, 2012, June.
Using data from the Faculty Survey of Student Engagement (FSSE), this study examines how
often faculty structure class sessions around diverse topics and how often faculty report students having serious conversations with diverse others in their courses. Findings suggest that faculty most often structure course sessions around economic and social inequalities and report students having the most conversations with people of differing economic or social backgrounds. Faculty members' gender and race matter in predicting these measures of diversity in the classroom, but disciplinary area was the strongest predictor. Implications for assessment and institutional research are discussed.
Full version
Relationship between faculty perceptions of institutional participation in assessment and faculty practices of assessment-related activities
Haywood, A. M., Shaw, M. D., Nelson Laird, T. F., & Cole, E. R.
American Educational Research Association Annual Meeting, New Orleans, LA, 2011, April.
Full version
Assessment for improvement: Faculty perceptions of institutional participation in assessment by field
Haywood, A. M., Shaw, M. D., & Nelson Laird, T. F.
Association for the Study of Higher Education Annual Conference, Indianapolis, IN, 2010, November.
Combining the National Survey of Student Engagement with student portfolio assessment
Stoering, J. M., & Lu, L.
Association for Institutional Research Annual Forum, Kansas City, MO, 2002, June.
Presentations
Assessing College Student Mental Health and Well-Being: Implications for Campus Support and Promising Practice
Kinzie, Jillian; BrckaLorenz, Allison; Chambers, Tony; Huber, Susan; Yuhas, Bridget
AACU Annual Meeting, Washington, DC, 2024, January.
Concern about college students' mental health and the need for institutions to provide more support are on the rise. This session will draw from three research projects: the Student Well-Being Institutional Support Survey (SWISS), the National Survey of Student Engagement (NSSE) Mental Health & Well-Being module, and the Center for Healthy Minds and Human Flourishing course. Combining these assessments expands understanding of student mental health, perceptions of support, and effective interventions. We will highlight the most current and important findings about college student mental health and well-being, discuss implications for colleges and universities and considerations for equity, and exchange ideas about campus interventions that show the most promise.
Full version
A Quantitative Review of Faculty Practices and Perceptions of the Scholarship of Teaching and Learning
Braught, Emily; BrckaLorenz, Allison
Scholarship of Teaching and Learning Summit, 2024.
How do faculty perceive and interact with the scholarship of teaching and learning? This session will review findings from the 2022 and 2023 Faculty Survey of Student Engagement (FSSE), exploring the extent to which classroom- and institutional-level assessment efforts are used to make improvements and hone teaching practices, the extent to which faculty collaborate and build community with one another to share teaching practices, and the extent to which external motivations influence how frequently faculty engage in practices related to the scholarship of teaching and learning.
Full version
Beyond Demographics: Incorporating Equitable and Inclusive Language about Student Identities in Surveys
Miller, Angela; BrckaLorenz, Allison; Kilgo, Cindy Ann; Priddie, Christen; Wenger, Kevin; Zhu, Yihan
Assessment Institute, Indianapolis, IN, 2023, October.
A new workgroup within the National Survey of Student Engagement (NSSE) focuses on Equity in Survey Design, Administration, Analysis, and Reporting (ESDAR). The workgroup has made changes to survey items for the 2023 administration. These revisions were aimed at more inclusive and equitable language, particularly related to items asking about student identities such as gender identity, sexual orientation, race/ethnicity, first-generation status, and Greek life participation. Attendees will learn about the rationale behind these revisions, and be asked to reflect on whether their own institutional assessments use equitable and inclusive language.
Full version
Faculty Members are Not the Problem: Improving Faculty Teaching Environments to Foster Teaching Excellence
BrckaLorenz, Allison; Nelson Laird, Tom
Assessment Institute, Indianapolis, IN, 2023, October.
Given challenges with technology, the pressures of the academy, political meddling in higher education, inequitable conditions, and students whose needs are complex and changing rapidly, faculty members find themselves struggling with workload, their own health issues, competing priorities, and how to be effective teachers in a challenging time. Using data from two large-scale multi-institution assessment projects, we invite you to examine with us aspects of faculty teaching environments that contribute to faculty members' success as educators. By using measures of, for example, institutional policies and processes, access to instructional resources, and institutional climates for diversity, we will illustrate how a better understanding of the teaching environment can improve faculty development efforts. Join us for an exchange of ideas about ways to foster environments that motivate teaching excellence and support faculty in both their work and personal lives.
Full version
Building on Tradition: Approaches to More Inclusive Data Analysis
BrckaLorenz, Allison; Hu, Tien-Ling
Association for Institutional Research Forum, Cleveland, OH, 2023, May.
Institutional research and assessment depends heavily on our ability to characterize the students we study into categories and on our inclination to generalize the results. Although this work is necessary for understanding student experiences, it does present challenges for critical and inclusive approaches to data analysis. In this session, we will discuss common issues and solutions associated with inclusive data analysis by investigating a series of data analysis examples that feature small sample sizes for marginalized students. We will discuss traditional variable-centered versus person-centered methodological approaches, strategies for creating groups to use in comparative analyses, challenges in quantitatively capturing aspects of identity, and tips for communicating the results, validity, and data quality of such analyses to broad audiences.
Full version
Gathering Evidence for an Assessment of Environments That Motivate Teaching Excellence
BrckaLorenz, Allison; Brandon, Josclynn; Hu, Tien-Ling; Priddie, Christen; Nelson Laird, Thomas F.
American Educational Research Association Annual Meeting, San Diego, CA, 2022, April.
Inequities and discrimination within the systems and structures of higher education prevent faculty from doing and receiving recognition for their best work as educators. The purpose of this study is to present the validation testing and an overview of results from a new project designed to help institutions understand the teaching environments in their local context and for researchers to understand teaching environments in higher education. The assessment instrument examined here guides our understanding of faculty needs, motivations, and supports that are necessary for healthy teaching environments and the well-being of diverse faculty. Findings from this study add to our knowledge of faculty teaching cultures and provide an example of how to collect validity evidence for climate assessment instruments.
Full version
Assessing Sense of Belonging for Student Success: New Findings from NSSE
BrckaLorenz, Allison; Lofton, Colleen; Kinzie, Jillian
Assessment Institute, 2021, October.
Sense of belonging influences student persistence and success. NSSE 2020 findings from 521 bachelor's-granting colleges and universities show most first-year students feel comfortable being themselves and feel valued and a part of the community at their institution, yet notable differences were found for traditionally marginalized subpopulations. This session will provide an overview of findings through an interactive discussion and publicly available data visualization. Facilitators will provide examples of how institutions have used their data to assess and impact belongingness. Participants will identify actions their institution can take to influence the sense of belonging on their campus for marginalized student populations.
Full version
NSSE's 3rd Decade: Highlighting New Emphases in Assessment and Student Engagement
BrckaLorenz, Allison; Cole, Jim; Gonyea, Robert; Kinzie, Jillian; McCormick, Alex; Sarraf, Shimon
Assessment Institute, 2021, October.
The National Survey of Student Engagement (NSSE) is excited to enter our 3rd decade of assessment to improve educational quality and student outcomes. This session will highlight NSSE's suite of surveys, including the Faculty Survey of Student Engagement (FSSE) and the Beginning College Survey of Student Engagement (BCSSE), and new emphases, including survey items on effective teaching and sense of belonging as well as data visualization tools. We'll also introduce enhancements such as Topical Modules to assess inclusiveness and cultural diversity, advising, and quality in online education and HIPs.
Full version
All NSSE session recordings from the 2020 Assessment Institute are now available for viewing.
Assessment Institute, 2020, October.
- 05A Plenary Session and Panel Discussion
Teresa Leyba Ruiz, Glendale Community College; Stephen P. Hundley, IUPUI; Keston H. Fulcher, James Madison University; Natasha Jankowski, National Institute for Learning Outcomes Assessment (NILOA) and University of Illinois Urbana-Champaign; Jillian Kinzie, Indiana University-Bloomington and NILOA; Verna F. Orr, University of Illinois Urbana-Champaign and NILOA; Hamsa Marikar, Watermark
- 07B Advancing Institutional Assessment: Lessons from Excellence in Assessment 2020 Designees
Jillian Kinzie, Indiana University-Bloomington and National Institute for Learning Outcomes Assessment (NILOA); Frank Hall, Northwestern State University of Louisiana; Kristen Springer Dreyfus, East Carolina University (ECU); and Rebecca Lewis and Diane Waryas Hughey, The University of Texas at Arlington
- 08D Dealing with Tough Moments: Assessing Faculty Preparation for Teaching Challenges
Kyle T. Fassett and Allison BrckaLorenz, Indiana University-Bloomington; and Sarah S. Hurtado, University of Denver
- 09O NSSE's 3rd Decade: Synthesizing Contributions and Highlighting New Emphases in Assessment and Student Engagement
Jillian Kinzie, Indiana University-Bloomington and National Institute for Learning Outcomes Assessment (NILOA); Robert Gonyea and Alexander McCormick, Indiana University-Bloomington
- 12A Assessing HIP Quality: Evidence from the Literature and Students' Experience
Jillian Kinzie, Indiana University-Bloomington and National Institute for Learning Outcomes Assessment (NILOA); Brendan Duggan, Robert Gonyea, Alexander McCormick, and Samantha Silberstein, Indiana University-Bloomington
- 13A Assessing the Faculty Role in High-Impact Practices
Kyle T. Fassett, Allison BrckaLorenz, and Thomas F. Nelson Laird, Indiana University-Bloomington
- 14J Assessment with the Improvement of Student Learning as the End Goal
Jillian Kinzie, Indiana University-Bloomington and National Institute for Learning Outcomes Assessment (NILOA); Kathleen Gorski, Waubonsee Community College; Natasha Jankowski, National Institute for Learning Outcomes Assessment (NILOA); and Monica Stitt-Bergh, University of Hawaiʻi at Mānoa
- 15C Developing Culturally Responsive Multiple Assessments of Student Learning in Diversity-Inclusion-Social Justice (DISJ) Core Courses and National Survey Results (CECE and NSSE) to Advance Campus Conversations
Jesse Mills, Carole Huston, Paula Krist, and Antonieta Mercado, University of San Diego
- 16B A Conceptual Framework and Strategies for Examining High-Impact Practices
Kyle T. Fassett, Indiana University-Bloomington
- 18A Equity and Inclusivity in the Assessment of High-Impact Practices
Heather Haeger, California State University, Monterey Bay; and Allison BrckaLorenz, Indiana University-Bloomington
- 20D Better Together: How Student Learning Outcomes Assessment and Faculty Development Can Partner to Strengthen Student Success
Pat Hutchings, National Institute for Learning Outcomes Assessment (NILOA) and Bay View Alliance (BVA); and Jillian Kinzie, Indiana University-Bloomington and National Institute for Learning Outcomes Assessment (NILOA)
Assessing the Faculty Role in High-Impact Practices
Fassett, Kyle T.; BrckaLorenz, Allison; Nelson Laird, Thomas F.
Assessment Institute, 2020, October.
High-impact practices are effective educational practices leveraged to improve student outcomes, and often faculty members are key to their facilitation. This session examines faculty roles in emphasizing students' participation and engaging students in these practices. We will share characteristics of faculty who encourage and partake in these activities with an emphasis on assessment practices for gathering more information about faculty experiences with high-impact practices. Attendees will have the opportunity to learn from one another through sharing their own challenges and successes in measuring faculty involvement in high-impact practices and how they cultivate a culture of high-impact experiences on campus.
Full version
Dealing with tough moments: Assessing faculty preparation for teaching challenges
Fassett, Kyle T.; BrckaLorenz, Allison; Hurtado, Sarah
Assessment Institute, 2020, October.
Faculty are increasingly placed in situations where they must navigate difficult teaching moments (student incivility, disclosure of sensitive information, controversial events, etc.) and challenging conversations with students (sexual assault, racism, mental health, etc.). As such, it is important to examine faculty preparation for managing such situations, what strategies they implement when they encounter these concerns, and what related training they wish they had received. Findings from a large-scale quantitative and qualitative study of teaching challenges will guide a discussion about assessing and supporting faculty efforts to navigate difficult teaching situations through professional development programming.
Full version
Equity and Inclusivity in the Assessment of High-Impact Practices
Haeger, Heather; BrckaLorenz, Allison
Assessment Institute, 2020, October.
As we seek to assess what works and what needs improvement in High-Impact Practices (HIPs), it is essential that we critically examine our assessment strategies and methodologies. Many assessment plans inadvertently overlook the experiences of diverse student populations and focus only on large, aggregate data that reflect the experiences of traditionally privileged, majority students. This session will focus on how to conduct more inclusive assessment, including specific strategies for creating more inclusive survey questions, assessing the experiences of small populations in order to improve them, applying more equitable quantitative methods, and using democratically engaged assessment strategies.
Full version
Survey Inclusivity: Centering Minoritized Groups in Survey Design
Priddie, Christen; BrckaLorenz, Allison
Assessment Institute, 2020, October.
The emergence of critical quantitative methodologies emphasizes the importance of using race-conscious approaches to highlight the centrality of race in student experiences, as a neutral approach can support a deficit framing. This session explores why it is important to center racially minoritized groups in survey design in order to move toward more equitable assessments of their experiences. Discussion will focus on an example of intentionally centering Black college student experiences in a quantitative study of collaborative learning and perceptions of campus climate with tips and strategies for participants to center racially minoritized voices in their own assessments.
Full version
What's next for student engagement and institutional assessment
Kinzie, J.; BrckaLorenz, A.; Gonyea, R.; Kirnbauer, T.; Sarraf, S.
Association for Institutional Research Annual Forum, 2020, May.
Over the past 20 years, the National Survey of Student Engagement (NSSE) has helped institutional researchers gain insight into institutional quality. Further, a shifting higher education landscape requires institutional researchers to reconsider how assessment is conducted. NSSE continues to search for innovative ways to understand student engagement and serve as a valuable assessment tool for institutions. In this session, we will focus on three areas that impact institutional assessment: changing student demographics, developments in teaching and learning, and innovative approaches to assessment. The session will conclude with an active discussion with participants about the trends and possibilities at their institutions.
Full version
Student engagement and institutional assessment: Current trends and future possibilities
McCormick, A., Kinzie, J., BrckaLorenz, A., Gonyea, R., Sarraf, S.
Association of American Colleges & Universities Annual Meeting, Washington, DC, 2020, January.
With the National Survey of Student Engagement marking its 20th year, we invite participants to explore the trends and possibilities of engagement as a lens for examining institutional quality. This session, conducted as a research town hall, will focus on three broad areas:
1. Changing demographics: What are the emerging demographics on your campus? How will new kinds of students challenge and shape what we know about engagement?
2. Developments in teaching and learning: Do current measures of engagement adequately cover what is essential to the improvement of student learning? What new forms of engagement should we assess?
3. Approaches to assessment: What trends should a large-scale survey assessment project consider over the next decade to facilitate evidence-informed improvement?
After briefly introducing each topic, the presenters will facilitate an active discussion with participants about the trends and possibilities at their institutions and in higher education overall.
Full version
Educational environments for faculty: Improving postsecondary teaching through assessment
Strickland, J., BrckaLorenz, A., Fassett, K., Nelson Laird, T.
Professional and Organizational Development Network Annual Conference, Pittsburgh, PA, 2019, November.
This session focuses on understanding the relationship between faculty members' educational environments and their teaching practices. Results from a large-scale, multi-institutional study give insight into these environments by documenting faculty sense of departmental belonging, collegial commitment to quality teaching, and access to resources to meet high standards. Participants will engage in a conversation about how to best assess educational environments, differences across faculty characteristics, and how to create momentum for change.
Full version
20 years of student engagement: Insights about students, assessment, and college quality
Kinzie, J., Gonyea, R., & McCormick, A.
Assessment Institute 2019, Indianapolis, IN, 2019, October.
In 2020 the National Survey of Student Engagement enters its third decade assessing the quality of undergraduate learning and success. In 20 years, the student engagement movement has surely changed our notions of quality in higher education. Most institutions now value a culture of evidence, promoting deep approaches to learning, developing high-impact practices, and tracking engagement indicators. This session reviews the most important findings about student engagement in the past two decades, and asks participants to consider what engagement will look like in the next decade. What is next for assessing quality in undergraduate education and collecting evidence for improvement?
Full version
Assessing environmental factors that promote quality collegiate teaching
BrckaLorenz, A., Nelson Laird, T., Fassett, K., Hiller, S., & Strickland, J.
Assessment Institute 2019, Indianapolis, IN, 2019, October.
As the need to improve undergraduate education intensifies, assessment of student and faculty practices should be complemented by information about the environmental conditions that help faculty members do their best work. This session focuses on understanding the relationship between faculty educational environments and their teaching practices. Results from a large-scale, multi-institutional study give insight into these environments by documenting faculty sense of departmental belonging, collegial commitment to quality teaching, and access to resources to meet high standards. Session participants will engage in a conversation about how to best assess educational environments, and how to create momentum for change.
Full version
Assessing the quality of undergraduate living arrangements: Relationships with engagement and persistence
Fosnacht, K., Gonyea, R., Fassett, K., & Graham, P.
Assessment Institute 2019, Indianapolis, IN, 2019, October.
This presentation overviews research findings from the National Survey of Student Engagement's living arrangements study. It discusses student persistence, the sophomore experience, roommate matching policies, and living-learning communities.
Full version
Challenges and benefits of cross-sectional assessment
Fassett, K. & BrckaLorenz, A.
Assessment Institute 2019, Indianapolis, IN, 2019, October.
Repeated assessments are intended to generate knowledge about changes occurring on campuses to inform decisions; however, data often lie dormant, not reaching their full potential beyond a single use. This session clarifies the differences between the uses of longitudinal studies and cross-sectional examinations and discusses strategies for using cross-sectional data to discover trends in student outcomes. Examples include investigations of teaching practices and student engagement over time, using both multi-institutional and single-institution data sets.
Full version
Getting beyond the label: What makes high-quality HIPs, how widespread are they, and who has access to them?
McCormick, A., Kinzie, J., Gonyea, R., Dugan, B., & Silberstein, S.
Assessment Institute 2019, Indianapolis, IN, 2019, October.
High-impact practices (HIPs) represent a core feature of a high-quality undergraduate education and are often hailed as life-changing events. The literature identifies a set of essential elements common across HIPs, yet to date most evidence about HIPs has been limited to student participation in designated HIPs, with scant empirical examination of their implementation. We report on a multi-institution study of students' exposure to these elements of quality in six HIPs (learning communities, service-learning, research with faculty, study abroad, internships and field experiences, and culminating senior experiences) to deepen understanding of HIP quality and which students have access to high-quality HIPs.
Full version
Getting lost at the crossing? Tips for assessing intersectional experiences
BrckaLorenz, A., Fassett, K., Kirnbauer, T., & Washington, S.
Assessment Institute 2019, Indianapolis, IN, 2019, October.
Faculty and administrators are often tasked with educating the whole student upon arrival at college, so it is important to understand ways to assess the whole student. This session will discuss factors to consider when quantitatively examining intersecting aspects of students' identities, student characteristics, and collegiate endeavors. Case studies will illustrate challenges and strategies for assessing the experiences of students with intersecting identities. Attendees will discuss their own challenges and solutions for intersectional analyses and leave with tangible takeaways for their work.
Full version
The next decade of HIPs: Increasing access, quality, and equity
Kinzie, J.
Assessment Institute 2019, Indianapolis, IN, 2019, October.
Many colleges and universities promote the value of high-impact practices (HIPs), such as service-learning, undergraduate research, internships, and study abroad, and evidence demonstrates that students benefit. Yet simply offering such activities does not guarantee high-quality learning or even participation. Over the past decade, we've learned the importance of intentionally designed HIPs that are delivered with fidelity but shaped to the context, with a defined purpose and inclusive pedagogy. This session briefly reviews the HIP landscape, introduces new findings about HIP quality and equity, and suggests practical approaches to scaling HIPs and ensuring their quality.
Full version
Innovations in teaching: A flipped classroom narrative
Fassett, K., Strickland, J., & BrckaLorenz, A.
Society for Teaching and Learning in Higher Education Annual Conference, Winnipeg, Manitoba, Canada, 2019, June.
The session will offer insights from hundreds of faculty teaching at 18 four-year colleges and universities who have applied flipped classroom techniques in their selected courses. Using data collected by the 2018 Faculty Survey of Student Engagement, presenters will share findings on the types of courses that faculty tend to flip as well as ways in which they structure the delivery of their course content. Session participants will also learn about the challenges and benefits faculty experienced in flipping a course and the reasons why they turned to this pedagogical approach. The general purpose of this session is to inspire thoughtful and strategic planning for faculty and offer an example of how assessment professionals may gauge institutional support for innovative teaching practices.
Full version
Celebrating NSSE's 20th: Making the most of student engagement data
Gonyea, R. M., & Kinzie, J.
Association for Institutional Research Annual Forum, Denver, CO, 2019, May.
In this session, Bob Gonyea and Jillian Kinzie share NSSE's achievements, highlight effective institutional reporting and data uses, and preview plans for assessment in the project's next several years.
Full version
Crises and considerations for assessment
Fassett, K., & BrckaLorenz, A.
Association for Institutional Research Annual Forum, Denver, CO, 2019, May.
Studies of crises have shown large-scale negative effects on many aspects of higher education institutions. However, there has been limited discussion or research about how crises influence the operation of institutional research. Given the heightened climate on campuses over the last several years, it is important to consider how crises alter our work. This discussion group begins a conversation about how to prepare for, endure, and reflect on crises, and about ways to analyze data when a crisis creates discontinuity in assessment practices.
Full version
Entering students' experience: BCSSE for first-year and transfer students
Cole, J., & Kinzie, J.
Annual Conference on the First-Year Experience, Las Vegas, NV, 2019, February.
Beginning College Survey of Student Engagement (BCSSE) can now be used to survey your first-year, transfer, and older students. Since 2007, nearly 900,000 entering first-year students at more than 500 institutions have completed BCSSE. The updated web survey now includes questions specifically for incoming transfer and older students. This session will describe how data about entering first-year, transfer, and older students provide comprehensive information about your students' experiences. Institutions use BCSSE for academic advising, retention models, faculty and staff development, and other assessment needs. This session will present the new survey, revised reports, and details regarding fall and winter administrations.
Full version
Assessing faculty experiences teaching a flipped course
Ribera, A., BrckaLorenz, A., Fassett, K., & Strickland, J.
Assessment Institute, Indianapolis, IN, 2018, October.
The session explores flipped classrooms as an innovative pedagogical practice. Facilitators in this session will offer insights from hundreds of faculty who have applied flipped classroom techniques in their courses. Presenters will share findings on the types of courses that faculty tend to flip as well as ways in which they structure the delivery of their course content. Session participants will also learn about the challenges and benefits faculty experienced in flipping a course and why they turned to this pedagogical approach. Discussion will focus on how assessment professionals may gauge institutional support for innovative teaching practices.
Full version
Assessing for diversity: Evidence from NSSE’s Inclusiveness and Engagement with Cultural Diversity and Global Learning modules
Kinzie, J., & McCormick, A. C.
Assessment Institute, Indianapolis, IN, 2018, October.
The assessment of inclusivity, cultural responsiveness, and global learning is a current imperative for higher education. The National Survey of Student Engagement (NSSE) recently added two new Topical Modules asking students more about inclusive educational practices and perceptions of their global learning experiences. This session highlights findings from these question sets, examines common items for course-based learning, explores how results vary by institution and student characteristics and what they suggest about global learning practice and inclusivity, and includes a discussion about campuses' use of these findings to create environments that support all students and leverage the educational benefits of diversity and internationalization.
Full version
Support by any other name: Disaggregating supportive environments for faculty
Priddie, C., Silberstein, S., & BrckaLorenz, A.
Assessment Institute, Indianapolis, IN, 2018, October.
This session aims to provide a deeper understanding of the importance of disaggregating data to improve campus environments for minoritized faculty members. Responses from faculty members at approximately 30 institutions who participated in the Inclusiveness and Engagement with Cultural Diversity topical module of the Faculty Survey of Student Engagement will be used to examine how identity and discipline influence differing perspectives of supportive environments. Participants will have the opportunity to learn about approaches to working with disaggregated data and discuss ways in which supportive environments can be improved for different faculty populations.
Full version
Using a typology of faculty to assess undergraduate education and plan for faculty development
BrckaLorenz, A., Fassett, K., & Nelson Laird, T.
Assessment Institute, Indianapolis, IN, 2018, October.
In this session, participants will learn about the relationships between a typology of faculty members and measures of effective educational practice. The typology comes from faculty responses on the time they spend on teaching activities; research, creative, or scholarly activities; and service activities from over 24,000 faculty at 154 institutions that participated in the 2017 administration of the Faculty Survey of Student Engagement (FSSE). After an interactive presentation of findings, participants will discuss the implications for assessing undergraduate education and planning for faculty development at their campuses.
Full version
Assessing inclusiveness and engagement in cultural diversity and global learning: Lessons from NSSE’s Topical Modules
Kinzie, J., Cavallo, J., & Kenesson, S.
Higher Education Data Sharing Annual Conference, Spokane, WA, 2018, June.
The assessment of inclusivity, cultural responsiveness, and global learning is a current imperative for higher education. The National Survey of Student Engagement (NSSE) added two new modules asking students more about inclusive educational practices and perceptions of their global learning experiences. Despite articulating goals to advance globalization and diversity, institutions have sometimes fallen short in the ways they have enacted these goals. Further, students' perceptions of institutional commitment to these goals vary, and this in turn influences their experience. Therefore, it is important to understand the specific ways in which institutions enact their commitments and what influences students' perceptions of these commitments. This session highlights findings from these question sets, examines how results vary by student characteristics, and considers what results suggest about global learning practice and inclusivity. Discussion will then focus on campuses' use of these findings to create environments that support all students and leverage the educational benefits of diversity and internationalization, including, for example, making the case for a more integrated campus diversity plan and a more comprehensive campus climate survey, and spurring further analyses of how students' perceptions of an institution's emphasis on diversity and a supportive environment affect student success outcomes (e.g., academic standing, retention).
Full version
Assessing small populations: Recognizing everyone counts in your counts
BrckaLorenz, A., & Hurtado, S.
Student Affairs Assessment and Research Conference, Columbus, OH, 2018, June.
Quantitative and survey research depends heavily on large sample sizes, but a focus on the "average student" in quantitative analyses often hides diverse voices. Participants in this session will discuss common issues and solutions associated with giving voice to small populations of college students (e.g., gender variant, multiracial, LGBQ+). Participants will discuss administration issues related to small populations such as increasing response rates, identifying special subpopulations, and writing more inclusive survey questions. Tips for disaggregating, responsibly aggregating, and choosing inclusive comparative information will be provided. Additionally, participants will discuss strategies for analyzing and communicating about the results from small populations as well as approaches for communicating about the validity and data quality from small sample sizes.
Full version
Revisiting the connection between high-impact practices and student activism
Dugan, B., & Morgan, D.
Civic Learning and Democratic Engagement Meeting, Anaheim, CA, 2018, June.
High-impact practices (HIPs) are often viewed as primary ways to help colleges and universities achieve a range of educational outcomes, including preparing students to participate in democracy. Utilizing new data from the NSSE, this session will help educators explore the connection between HIPs and student activism toward enhancing and updating their understanding of the relationship between these concepts.
Full version
Maximizing survey data for outreach, assessment, programming, and beyond
Miller, A. L., & Dumford, A. D.
Association for Institutional Research Annual Forum, Orlando, FL, 2018, May.
This presentation provides a variety of real-life examples of how institutions have used survey data collected from students, faculty, and alumni within multiple contexts. Examples are drawn from institutions participating in the National Survey of Student Engagement (NSSE), the Beginning College Survey of Student Engagement (BCSSE), the Faculty Survey of Student Engagement (FSSE), and the Strategic National Arts Alumni Project (SNAAP). The types of data use cover numerous categories: sharing on campus; recruitment; academic and career advising; publicity, alumni relations, and donor outreach; planning, assessment, and accreditation; program and curricular change; and advocacy and public policy. Attendees will learn about ways that they can optimize the use of available survey data for many different audiences, allowing the institutional research office to serve as a bridge that connects other stakeholders with available data.
Full version
Ticking away the moments: Assessing faculty roles with time on task
BrckaLorenz, A., Nelson Laird, T., Fassett, K., & Yuhas, B.
Association for Institutional Research Annual Forum, Orlando, FL, 2018, May.
More frequent calls for accountability in higher education have led to increased scrutiny on what students are doing and learning while in college. Because faculty are important contributors to the student experience, the ability to realistically analyze how faculty spend their time engaging students in learning is a key component in being able to answer these calls for accountability. The purpose of this presentation will be to examine and discuss how faculty time on task can be used to enhance a wide variety of conversations about faculty roles, development, contributions, and productivity. A new method of analyzing faculty productivity will be presented, and participants will discuss how such an assessment of faculty time can be useful in discussions about institution mission and goals, faculty roles at the institution, faculty professional development, and faculty tenure and promotion.
Full version
Transparent quality: Framing and building a psychometric portfolio
Paulsen, J., & BrckaLorenz, A.
Association for Institutional Research Annual Forum, Orlando, FL, 2018, May.
High-quality data and assessment instruments have become essential for institutional researchers taking a data-driven approach to informing decision-making and strategic planning. Instruments and the data they collect can be studied for different aspects of validity and reliability, as well as for the procedures and standards used to reduce error and bias and increase the rigor of the data. This presentation will focus on a framework for operationalizing and organizing a wide variety of studies to investigate data quality. Participants in this session will see how a large-scale quantitative survey project designed and created a psychometric portfolio with studies designed to make survey instrument and data quality transparent so that higher education leaders, researchers, and professionals can trust the results. Challenges and potential solutions, including strategies for conducting studies of data quality with limited time and resources, will be discussed.
Full version
Emerging research on queer-spectrum and trans-spectrum students in higher education
Greathouse, M., BrckaLorenz, A., Hoban, M., Rankin, S., & Stolzenberg, E.
American Educational Research Association Annual Meeting, New York, NY, 2018, April.
Queer-spectrum and trans-spectrum students remain a significantly underserved population within higher education, despite the presence of significant disparities across measures of campus climate, academic engagement, and overall health. This paper explores the campus climate, overall health, and academic engagement of queer-spectrum and trans-spectrum undergraduate students attending four-year colleges and universities in the US through an analysis of seven national data sets, including the 2017 data sets of the National Survey of Student Engagement (Center for Postsecondary Research, Indiana University Bloomington), the 2016 Undergraduate Student Experience at the Research University Survey (SERU-AAU Consortium, Center for Studies in Higher Education, University of California-Berkeley and University of Minnesota Twin Cities), the 2016 American College Health Association--National College Health Assessment, and the 2016 data sets of four surveys conducted by the Cooperative Institutional Research Program, including The Freshman Survey (TFS), the Your First College Year Survey (YFCY), the Diverse Learning Environments Survey (DLE), and the College Senior Survey (CSS) (University of California-Los Angeles, Higher Education Research Institute).
Full version
Assessing inclusiveness and engagement with cultural diversity: Assuring success for all
Kinzie, J., McCormick, A., Gonyea, R., & BrckaLorenz, A.
Association of American Colleges and Universities Annual Meeting, Washington, DC, 2018, January.
Institutional support for diversity, inclusivity, and cultural responsiveness represents an imperative for higher education given demographic projections and the needs of a pluralist society. In 2017, the National Survey of Student Engagement (NSSE) added an optional question set asking students more about inclusive teaching practices in courses, intercultural learning, and perceptions of their institution's cultural responsiveness. This session highlights findings from this item set, discusses the relationship between these activities and other effective educational practices, examines how these relationships vary between traditionally marginalized students and more privileged students as well as by major field, and includes a discussion of the opportunities and challenges educators face as they seek to improve inclusion, engagement with diversity, and cultural responsiveness. Discussion includes how campus leaders can use these findings to create environments that more fully support students of all backgrounds, leverage the educational benefits of diversity, and promote transformative learning outcomes.
Full version
Assessing diversity inclusivity in college courses: Updates and trends
Nelson Laird, T. F., Hurtado, S. S., & Yuhas, B. K.
Assessment Institute, Indianapolis, IN, 2017, October.
Using results from multiple administrations of the Faculty Survey of Student Engagement (FSSE), participants in this session will examine how courses include diversity, what faculty and course characteristics predict that inclusion, and whether results have varied over time. The results come from survey items based on a comprehensive framework describing how nine course elements (e.g., purpose, content, assessment) vary in their inclusion of diversity. Session participants will learn about the framework and results and also will engage with the facilitators to discuss the implications of the results for those working to assess the inclusion of diversity across the curriculum.
Full version
Engagement insights: Applying NSSE to student affairs assessment
Kinzie, J., Ribera, A., & Hurtado, S.
Assessment Institute, Indianapolis, IN, 2017, October.
Student affairs is under pressure to improve student success and demonstrate the effectiveness of programs and contributions to student outcomes. One practical approach to address this issue is for student affairs assessment professionals to take advantage of available data and assessment resources such as the National Survey of Student Engagement (NSSE). In this session, participants will learn about recent findings about student engagement, persistence, and student learning relevant to student affairs; will practice applying existing student engagement data to inform strategic goals and initiatives related to the co-curriculum; and will exchange ideas about effective approaches to using NSSE data in student affairs.
Full version
Exploring disciplinary differences in global engagement and learning
Kinzie, J., McCormick, A. C., & Nelson Laird, T. F.
AAC&U 2017 Global Engagement and Social Responsibility Conference, New Orleans, LA, 2017, October.
Session facilitators will describe disciplinary differences in global engagement and learning (GEL) by sharing results from NSSE's Global Learning Topical Module. Through active exchange (anticipating results, engaging with presented results, and discussing with others in the session), participants will learn about 1) students' exposure to global and international topics; 2) students' engagement with global issues in and outside of the classroom; and 3) students' perceptions of how much their institution facilitated their development along several dimensions of global and intercultural competence. Participants will also discuss their reactions to the findings and share what they are doing on their campuses to assess GEL and how findings from their assessment efforts inform accreditation activities. Presenters will end the session by sharing information and resources NSSE staff have compiled to aid in the assessment of global engagement and intercultural competence.
Full version
What does an engaging campus look like? The role of surveys in the assessment of student engagement
Gonyea, R. M.
Association for the Promotion of Campus Activities Annual Staff & Student Leadership on Broadway Experience, New York, NY, 2017, July.
If you walked onto the most engaging campus, what would you see? The contribution of out-of-class experiences to student engagement cannot be overstated. Interest in creating the conditions that enhance student learning and success is at an all-time high. Today's student affairs professional knows how to use observable evidence to effectively plan, implement, assess, and improve outcomes. Student engagement surveys provide some of that evidence for institutions wishing to make student achievement, satisfaction, persistence, and learning a priority.
Full version
Applying NSSE findings to improve student persistence
Kinzie, J.
NASPA Assessment and Persistence Conference, Orlando, FL, 2017, June.
In 2016, the National Survey of Student Engagement (NSSE) released findings about engagement practices influencing retention and graduation as well as results about the importance of learning support to first-year retention. This session will focus on these recent NSSE findings in an interactive presentation that invites participants to consider results in relation to their persistence practices as well as apply NSSE institutional results. We will also exchange ideas about assessment approaches to dig deeper on critical issues for student success.
Full version
How residence life professionals can use engagement data
Hurtado, S. S., Graham, P. A., & Gonyea, R. M.
ACUHO-I Annual Conference, Providence, RI, 2017, June.
Student affairs professionals (including residence life) are often expected to incorporate assessment and evidence-based practices in their work. One approach is to take advantage of existing data and assessment resources such as from the National Survey of Student Engagement (NSSE). This session introduces NSSE's administration, data, and reports, and demonstrates how staff can use engagement data to promote improvements within residence life.
Full version
Paying attention to often ignored small subpopulations in assessment work
BrckaLorenz, A., & Nelson Laird, T. F.
Association of American Colleges and Universities Annual Meeting, San Francisco, CA, 2017, January.
A more diverse society has led to a more diverse college-going population and faculty body, and the need to restore public trust in higher education is especially important for subpopulations that have traditionally been marginalized within the higher education system. Often these groups represent small proportions of an overall population, which can present a variety of challenges when trying to assess their experiences. This session explores the challenges and possible solutions for those working toward improving the experiences of small subpopulations. The session will consist of highly interactive discussions focusing on the value of inclusivity in restoring public trust in higher education, reflections on assessing the experiences of small subpopulations, and creating plans for further understanding the experiences of small subpopulations for the purpose of inclusive improvement.
Full version
Faculty use of rubrics: An examination across multiple institutions
Nelson Laird, T. F., Zilvinskis, J., & Graham, P. A.
Assessment Institute, Indianapolis, IN, 2016, October.
Assessment professionals identify rubrics as key tools in measuring student learning; however, the field of higher education lacks a clear picture of how much faculty use these tools as well as the ways rubrics are developed and used. For assessment professionals and faculty who work to improve undergraduate education, better understanding rubric development and use should enhance their ability to assist faculty members and ultimately improve teaching and learning on campuses. Relying on data from the Faculty Survey of Student Engagement (FSSE), we will describe rubric implementation and development across more than 20 institutions and discuss the implications of our findings.
Full version
Using NSSE results to inform campus plans to expand high-impact practices and assess impact
Kinzie, J., & Zilvinskis, J.
Assessment Institute, Indianapolis, IN, 2016, October.
Many campuses are planning to increase students' participation in high-impact practices (HIPs). This session will explore how National Survey of Student Engagement (NSSE) institutional reports and annual results can be used to inform institutional efforts to plan, design, and assess HIPs, and will include considerations regarding entering students' expectations for HIPs, differences in participation by student characteristics, and patterns of participation by major. Working through a case study, we will discuss how NSSE results can inform campus design and evaluation of HIPs.
Full version
Gender identity and sexual orientation: Survey challenges and lessons learned
BrckaLorenz, A., Clark, J., & Hurtado, S.
Association for Institutional Research Annual Forum, New Orleans, LA, 2016, June.
Research shows there are differences in the college experience for students from underrepresented backgrounds, including non-heterosexual and gender variant students. This is due in part to experiences of discrimination and negative campus climate for these students. Participants in this session will learn about and discuss the assessment of and conversations about gender identity and sexual orientation on other campuses, and the challenges and potential solutions for writing more inclusive survey questions about complex identities. Challenges and potential strategies for surveying, disseminating results, and talking about difficult or sensitive topics on college campuses will also be discussed. Finally, participants will learn about the engagement, perceptions of campus support, and satisfaction, of students with varying gender identities and sexual orientations from a longitudinal, large-scale, multi-institution survey of students at four-year colleges and universities.
Full version
Graduate student surveys: Assessment landscape, challenges, and solutions
BrckaLorenz, A., Yuhas, B., & Nelson Laird, T.
Association for Institutional Research Annual Forum, New Orleans, LA, 2016, June.
Most assessments of the graduate student experience are institution-based and largely focused on satisfaction exit surveys. Very few of these surveys touch on the experiences of graduate students as instructors or teaching assistants. Further lacking are trends from large-scale surveys or national data on the experiences of graduate students, which is especially concerning given the increasing number of graduate students entering the classroom to teach undergraduate students. Participants in this session will discuss how the assessment of graduate student experiences varies on different campuses, as well as examine the challenges and potential solutions associated with assessing these experiences. Finally, participants will learn about one large-scale survey of graduate student experiences, see a selection of results gathered from two years of administration of this survey, and hear about how institutions have used these results for graduate student professional development purposes.
Full version
Using NSSE as a catalyst for improvement: Lessons from the field
Kinzie, J., Owens, S., & Du, F.
Association for Institutional Research Annual Forum, New Orleans, LA, 2016, June.
One of the more challenging phases of assessment is taking action on results. This session will explore the latest field-tested lessons from nearly two dozen institutions that have successfully used the National Survey of Student Engagement (NSSE) to improve undergraduate education. Representatives from two institutions will discuss their use of data to improve the first-year experience, and to engage departments in enhancing student learning by creating dashboard displays, infographics, and customized reports. The session will provide an opportunity to learn about approaches employed by institutions that have made effective use of results, and to discuss proven strategies for taking action.
Full version
Patterns of effective teaching practice for general education and non-general education courses
BrckaLorenz, A., & Nelson Laird, T. F.
AAC&U General Education & Assessment Conference, New Orleans, LA, 2016, February.
With over a decade of data collection and hundreds of institutional participants in the Faculty Survey of Student Engagement (FSSE), much can be learned about the engaging educational practices within general education courses at a variety of institution types and educational contexts. In this session, facilitators will use FSSE data to compare the degree to which instructors of general education courses and non-general education courses emphasize various forms of student engagement. Goals of the session include examining these comparisons within different campus or disciplinary contexts, discussing the goals of general education in promoting student engagement, and reflecting on opportunities and challenges in seeking to improve or examine student engagement within general education courses.
Full version
Are seniors ready for the "real world"? Transitions, plans, and differences by major field
Miller, A. L., & Dumford, A. D.
Assessment Institute, Indianapolis, IN, 2015, October.
A recent focus in higher education has been the lack of preparedness that graduates face upon entering the workforce. Coupled with criticisms of low income levels in certain major fields, this has institutions looking to reconcile skill development and career advising. Utilizing new 2015 module questions from the National Survey of Student Engagement (NSSE), this presentation provides findings concerning the career plans of graduating seniors and their readiness to use a variety of skills and abilities. Several trends are also revealed when looking at the results by major field, suggesting the need for some curricular revisions and enhanced career services.
Full version
Assessing the experiences and practices of faculty and graduate students who teach
BrckaLorenz, A., Nelson Laird, T., & Ribera, A.
Assessment Institute, Indianapolis, IN, 2015, October.
Assessing the experiences and teaching practices of faculty and graduate students can be particularly challenging. This session examines the assessment of faculty and graduate students who teach undergraduates by sharing the experiences and challenges of institutions participating in large-scale surveys of faculty and graduate students' teaching practices. Participants will be encouraged to share their ideas, challenges, and solutions associated with assessing faculty and graduate students who teach. Ideas from the presenters as well as from session participants will be used to create brief action plans for improving the assessment and experiences of faculty and graduate students who teach.
Full version
Examining student leadership as a high-impact practice
Gonyea, R. M., & Zilvinskis, J.
Assessment Institute, Indianapolis, IN, 2015, October.
Assessment professionals are often asked to measure the quality of co-curricular activities. For example, when a student holds a formal leadership role in a student organization, what does that look like? How does that experience relate to learning and development? High-impact practices (HIPs), characterized by their intensity, collaboration, and effectiveness, have gained attention in the assessment world. This session explores the quality of student leadership experiences and what it might take to label them "high-impact." The presenters will identify populations that are more or less likely to participate in HIP-level leadership and will recommend how educators can enhance leadership experiences.
Full version
NSSE's new reports and tools for exploring your data
Gonyea, R. M.
Assessment Institute, Indianapolis, IN, 2015, October.
After a general review of new measures on the updated NSSE (National Survey of Student Engagement), this session guides users through NSSE's many useful reports and online tools that facilitate evidence-based assessment.
Full version
Getting to use: What stimulates and impedes use of student engagement results?
Kinzie, J., McCormick, A., Olsen, D., Blaich, C., & Wise, K.
Association for Institutional Research Annual Forum, Denver, CO, 2015, May.
The ultimate goal of assessment projects, including the National Survey of Student Engagement (NSSE), is not to gather data. It's to catalyze improvement in undergraduate education. Yet, moving from data to campus action is challenging. This session addresses the challenges of data use, blending expert panelist insights with focused audience discussion about what stimulates and impedes action. With the updated NSSE in mind, panelists and the audience consider broad topics about using evidence, including sharing results, anticipating evidence use, striving for perfect data, involving students, and planning for action, and also discuss what promotes effective data use.
Full version
Making the most of NSSE: A detailed overview of survey updates, customization options, reports, and assessment applications
Ribera, A., Rocconi, L., & Sarraf, S.
Association for Institutional Research Annual Forum, Denver, CO, 2015, May.
This workshop will help extend institutional research professionals' use of the updated survey and include a review of survey content, new customization options, reporting, and assessment opportunities. The goal of this workshop is to help IR professionals make a seamless transition to using and maximizing the benefits of the updated NSSE, and to exchange ideas about approaches to using student engagement results.
Full version
Assessing global learning to improve student learning and educational practice
Braskamp, L., Kinzie, J., & Reason, R.
Association of American Colleges & Universities Annual Meeting, Washington, DC, 2015, January.
What college experiences are most influential in fostering elements of global learning? How can educators create a campus ethos and learning opportunities that encourage students' global and holistic learning? Authors of three national assessment tools, the Global Perspective Inventory (GPI), the National Survey of Student Engagement (NSSE), and the Personal and Social Responsibility Inventory (PSRI), will present assessment strategies that connect student experiences, engagement, and campus climate with specific student learning outcomes, especially those related to global learning and global citizenship. They will highlight evidence from the use of these tools that focuses on environmental conditions, such as student experiences and perceptions of the campus community, that enhance global learning. They will engage the audience in considering the kind of evidence that would help them adapt curricular and co-curricular activities so that more students develop a deeper understanding of global cultures, developments, and interconnections, across a variety of majors and career goals.
Full version
The updated NSSE: Fresh opportunities to engage faculty in assessment results
Dueweke, A., Hutchings, P., Kinzie, J., & McCormick, A.
Association of American Colleges & Universities Annual Meeting, Washington, DC, 2015, January.
The updated National Survey of Student Engagement (NSSE) provides greater specificity on measures that matter for improving student learning. Yet, too often results reside at the institution level, demonstrating grand measures of educational quality, and only occasionally getting to faculty to influence teaching and learning practice. The promise of assessment depends on growing and deepening faculty involvement and use of results. This session explores the question: "What do NSSE results mean for faculty?" Panelists and participants will address the topic and discuss ways to leverage student engagement results to inform instruction and efforts to enhance high-impact practices, guide faculty development initiatives, and connect to the scholarship of teaching and learning and projects to improve educational quality.
Full version
Assessing faculty members' and graduate student instructors' engagement in and views about professional development
Harris, J., Nelson Laird, T., & BrckaLorenz, A.
Assessment Institute, Indianapolis, IN, 2014, October.
This session aims to document current uses and needs regarding professional development for senior faculty, new faculty, and graduate student instructors (GSIs). Findings from faculty members at approximately twenty institutions that participated in the Faculty Survey of Student Engagement (FSSE) and from GSIs at eight institutions that participated in the pilot of FSSE for Graduate Student Instructors (FSSE-G) are utilized to identify impactful methods of professional development as well as potential areas for improvement. The goal of the session is to help participants understand ways they may assess faculty and GSI experiences with professional development in order to foster improvement.
Full version
Exploring high-impact practices using NSSE data, reports, and modules
Kinzie, J., & Ribera, A.K.
Assessment Institute, Indianapolis, IN, 2014, October.
Full version
Assessment Administrators Anonymous: 12 steps for involving faculty in assessment
Kinzie, J., & Lindsay, N.
AAC&U General Education & Assessment Conference, Portland, OR, 2014, February.
Participants will learn strategies for enhancing faculty engagement in assessment on their campuses, including approaches to overcoming barriers to faculty involvement and meaningful incentives for faculty engagement. Realizing the promise of assessment depends on growing and deepening faculty involvement. The need is particularly acute in the assessment of general education, the area of undergraduate education that can lack faculty ownership. This session will explore the dynamics of faculty involvement in assessment and identify twelve steps for increasing faculty engagement. Using effective approaches revealed in National Institute for Learning Outcomes Assessment (NILOA) case studies and practiced at institutions that have increased faculty involvement, presenters will spark audience discussion and encourage participants to consider approaches to apply on their own campuses.
Full version
Advancing assessment in student affairs: Emphasizing learning, creating partnerships, and using evidence to improve
Kinzie, J.
Assessment Institute, Indianapolis, IN, 2013, October.
Full version
Assessing involvement in faculty development
Nelson Laird, T., BrckaLorenz, A., & Peck, L.
Assessment Institute, Indianapolis, IN, 2013, October.
Full version
Stimulating dialogue and improvement in high-impact practices using new NSSE reports
Kinzie, J., & Ribera, A. K.
Assessment Institute, Indianapolis, IN, 2013, October.
Full version
Assessing student engagement with an updated NSSE: New possibilities
Kinzie, J.
NASPA Assessment & Persistence Conference, Denver, CO, 2013, June.
Full version
The NSSE report builder: An online tool for assessing student engagement
Fosnacht, K.
Association for Institutional Research Annual Forum, Long Beach, CA, 2013, May.
The National Survey of Student Engagement (NSSE) Report Builder is an online interactive tool that allows users to create custom reports derived from NSSE data by selecting from a variety of student and institutional characteristics. The session demonstrates how the Report Builder can aid in the assessment of student engagement through its ability to compare both students and institutions. Participants learn how to analyze their institutions' NSSE data with the Report Builder and to create their own personalized reports.
Accountability and learning: Integrating NSSE and outcomes assessment to inform student affairs practice
Kinzie, J., Cammarata, M., & Romano, C.
American College Personnel Association Annual Convention, Las Vegas, NV, 2013, March.
Full version
Feasible, scalable, and measurable: Information literacy assessment and the National Survey of Student Engagement
Boruff-Jones, P., Donovan, C., & Fosnacht, K.
American Library Association Annual Conference, Anaheim, CA, 2012, June.
Full version
Assessing engagement in the first year: Lessons from BCSSE and NSSE
Cole, J. S., & Kinzie, J.
Assessment Institute, Indianapolis, IN, 2011, October.
Full version
Faculty perceptions of institutional assessment and participation in classroom
Nelson Laird, T. F., Ribera, T., Shaw, M. D., Haywood, A., & Cole, E. R.
Assessment Institute, Indianapolis, IN, 2011, October.
Full version
Using NSSE data in accreditation and quality improvement plans
Kinzie, J.
Assessment Institute, Indianapolis, IN, 2011, October.
Full version
Linking institutional assessment and the scholarship of teaching and learning
Nelson Laird, T. F., McCormick, A. C., & Gale, R. A.
Association of American Colleges & Universities Annual Meeting, San Francisco, CA, 2011, January.
Full version
Capped off: Assessing college capstone courses
Kinzie, J., McCormick, A. C., & Nelson Laird, T. F.
Assessment Institute, Indianapolis, IN, 2010, October.
Full version
Developing civic engagement skills on college campuses: A multi-campus assessment
BrckaLorenz, A., Cervera, Y., Garver, A., & Sarraf, S.
Assessment Institute, Indianapolis, IN, 2010, October.
Full version
Disciplinary variation in the effects of teaching general education courses: Implications for assessment and faculty development
Nelson Laird, T. F., & Garver, A. K.
Assessment Institute, 2010, October.
Full version
Tracking the impact of assessment: Studying evidence-based improvement in colleges and universities
Kinzie, J.
Association for the Study of Higher Education Annual Conference, Vancouver, British Columbia, Canada, 2009, November.
Full version
Assessing and enhancing student engagement and success
Kinzie, J., & McCormick, A.
Assessment Institute, IUPUI, Indianapolis, IN, 2009, October.
Full version
Defining and using peer fields for inter- and intra-institutional assessment
Nelson Laird, T. F., Shaw, M. D., Haywood, A. M., & McCormick, A. C.
Assessment Institute, IUPUI, Indianapolis, IN, 2009, October.
Full version
Using NSSE results to foster collaboration on assessment and retention
Kinzie, J.
NASPA International Assessment & Retention Conference, New Orleans, LA, 2009, June.
Full version
Mapping the assessment skills and knowledge (ASK) standards to NSSE
Buckley, J. A. & Kinzie, J.
American College Personnel Association Annual Convention, Washington, DC, 2009, March.
Full version
Assessing and strengthening general education using NSSE: Lessons from the field
Kinzie, J., Burney, J., Boon, R., & Jonson, J.
AAC&U General Education & Assessment Conference, Baltimore, MD, 2009, February.
Full version
How faculty chose to improve their teaching across disciplinary areas
Garver, A. K., & Nelson Laird, T. F.
Assessment Institute, IUPUI, Indianapolis, IN, 2008, October.
Full version
Using NSSE to assess and enhance student engagement and student success: Lessons from the field
Kinzie, J., & Pennipede, B.
Assessment Institute, IUPUI, Indianapolis, IN, 2008, October.
Full version
Assessing student engagement in high-impact practices
Kinzie, J., & Evenbeck, S.
NASPA International Assessment & Retention Conference, Scottsdale, AZ, 2008, June.
Full version
Enhancing student success: Using NSSE and BCSSE data to shape student engagement
Kinzie, J., & Matveev, A.
NASPA International Assessment & Retention Conference, Scottsdale, AZ, 2008, June.
Full version
Student affairs assessment: Using intentional strategies to promote student success
Niskod, A. S., Bureau, D., & Nelson Laird, T. F.
American College Personnel Association Annual Convention, Atlanta, GA, 2008, April.
Full version
Assessing general education learning outcomes: NSSE benchmarks and institutional practice
Kinzie, J.
AAC&U General Education & Assessment Conference, Boston, MA, 2008, February.
Full version
Effective educational practices and essential learning outcomes in general education courses: Differences by discipline
Nelson Laird, T. F., McCormick, A. C., & Chamberlain, T. A.
AAC&U General Education & Assessment Conference, Boston, MA, 2008, February.
Full version
Using NSSE to enhance student engagement and success: Lessons from the field
Kinzie, J., & Kuh, G. D.
Assessment Institute, IUPUI, Indianapolis, IN, 2007, November.
Full version
Bringing assessment results to the faculty
Nelson Laird, T. F., Kinzie, J., & Chamberlain, T. A.
Professional & Organizational Development Conference, Pittsburgh, PA, 2007, October.
Full version
A more comprehensive look at first year engagement: A longitudinal assessment approach
Gonyea, R. M., Nelson Laird, T. F., & Cole, J. S.
NASPA International Assessment & Retention Conference, St. Louis, MO, 2007, June.
Full version
Assessing faculty to better understand student engagement
Nelson Laird, T. F., & Garver, A.
NASPA International Assessment & Retention Conference, St. Louis, MO, 2007, June.
Full version
Engaging distance learners: Lessons learned from the National Survey of Student Engagement
Chen, P. D.
NASPA International Assessment & Retention Conference, St. Louis, MO, 2007, June.
Mind the gap: Assessing expectations and experiences in the first year of college
Kinzie, J.
First-Year Assessment Institute, Savannah, GA, 2007, June.
Full version
Now what? A facilitator's guide for using NSSE data
Kinzie, J.
NASPA International Assessment & Retention Conference, St. Louis, MO, 2007, June.
Full version
What you and your institution can do to promote change and student success
Kinzie, J., Schuh, J., & Whitt, E.
AAC&U General Education & Assessment Conference, 2007, March.
Full version
Getting faculty involved in the student engagement conversation: The faculty survey of student engagement
Nelson Laird, T. F., Buckley, J., & Palmer, M.
Assessment Institute, IUPUI, Indianapolis, IN, 2006, October.
Nonresponse effect in large-scale student survey: Lessons learned from the National Survey of Student Engagement
Chen, P. D.
NASPA International Assessment & Retention Conference, Phoenix, AZ, 2006, June.
Taking stock of what matters to students' success
Kuh, G. D.
NASPA International Assessment & Retention Conference, Atlanta, GA, 2005, June.
Full version
Using NSSE results to chart new territory in institutional assessment and educational effectiveness
Kinzie, J., & Springer, R.
NASPA International Assessment & Retention Conference, Atlanta, GA, 2005, June.
Full version
Using NSSE to understand students' experiences: Making the most of data in assessment
Kinzie, J.
SUNY Assessment Initiative, 2005, April.
Full version
Deep assessments of student success and educational effectiveness
Kuh, G. D., & Kinzie, J.
AAHE National Conference on Higher Education, Atlanta, GA, 2005, March.
Living-learning programs and residential colleges
Kuh, G. D.
Living-Learning Programs and Residential Colleges, IUPUI, Indianapolis, IN, 2004, November.
Using student engagement data for improvement and accountability
Kuh, G. D., & Gonyea, R. M.
Assessment Institute, IUPUI, Indianapolis, IN, 2004, October.
Learning from NSSE: An approach to assessing and improving the first year experience
Kinzie, J.
Summer Institute on First-Year Assessment, Asheville, NC, 2003, July.
Full version
Courageous questions and reflective practice: Approaches to academic assessment that are do-able and worth doing
Barger, A., Dillingham, A. E., Gormly, A. V., Kinzie, J., Kuh, G. D., & Rawlings, D. H.
Association of American Colleges & Universities Annual Meeting, 2003, January.
National Survey of Student Engagement
Hayek, J.
Policy Center on the First Year of College Summer Assessment Institute, Asheville, NC, 2002, July.
NSSE: Participants, potential users, and colleagues
Kuh, G. D.
American Association for Higher Education Assessment Forum, Boston, MA, 2002, June.
Using results from the National Survey of Student Engagement for institutional improvement
Bresciani, M., Kuh, G., & Troxel, W.
American Association of Higher Education Assessment Forum, Boston, MA, 2002, June.
The general education we design and the general education students experience: Confronting questions of quality in student engagement and accomplishment
Kuh, G. D.
AAC&U General Education & Assessment Conference, Dallas, TX, 2002, February.
Assessment in practice: Implementing National Survey of Student Engagement results on campus to enhance the first-year experience
Bleak, C., Huber, B., & O'Day, P.
First-Year Experience Conference-West, San Francisco, CA, 2002, January.
Assessing first-year students and programs: No silver bullet
Swing, R.
2001, November.
NSSE/CSEQ luncheon presentation
Gonyea, R.
NASPA Assessment & Research Workshop for Student Affairs Professionals, Providence, RI, 2001, October.
AAHE research forum: Building an agenda for the scholarship of assessment
Kuh, G. D.
American Association for Higher Education Assessment Conference, Denver, CO, 2001, June.
Using NSSE to improve effective educational practice
Kuh, G. D.
American Association for Higher Education Assessment Conference, Denver, CO, 2001, June.
Using results from the National Survey of Student Engagement for assessment and institutional improvement
Banta, T., Kuh, G., Pike, G., & Smith, E.
American Association for Higher Education Assessment Conference, Denver, CO, 2001, June.
Using student engagement data to foster collaboration on assessment
Gonyea, R., & Kuh, G.
American Association for Higher Education Assessment Conference, Denver, CO, 2001, June.
Assessing student engagement in the first year of college: Integrating surveys and using findings to inform policy decisions
Gonyea, R., & Ouimet, J. A.
First-Year Assessment National Forum, Houston, TX, 2001, February.
The National Survey of Student Engagement: An update on NSSE 2000
Kuh, G., Ewell, P., Hayek, J., Gonyea, R., & Muffo, J.
American Association of Higher Education Assessment Conference, Charlotte, NC, 2000, June.
What does the future hold for assessment?
Kuh, G. D.
Assessment Institute, IUPUI, Indianapolis, IN, 1999, November.
The National Survey of Student Engagement
Kuh, G. D.
American Association for Higher Education Assessment Conference, Denver, CO, 1999, June.
Assessment in practice: Implementing National Survey of Student Engagement results on campus
George Kuh (NSSE), Debbie Heida (Wittenberg University), Patrick O'Day (NSSE), George Wallman (North Dakota State University)
National Association of Student Personnel Administrators (NASPA) 2002 National Conference, Boston, MA.
Getting faculty involved in the student engagement conversation: The faculty survey of student engagement
Nelson Laird, T. F., Johnson, S. D., Schwarz, M. J., & Niskod, A.
Assessment Institute, Indianapolis, IN.
Full version
Annual Results
Enhancing the Quality of High-Impact Practices at Middle Georgia State University
In Engagement Insights: Survey Findings on the Quality of Undergraduate Education—Annual results 2018, 11.
Full version
Use of Rubrics Common Among Faculty
In Engagement insights: Survey findings on the quality of undergraduate education—Annual results 2016, 11.
Full version
BCSSE and FSSE
In Assessment for improvement: Tracking student engagement over time—Annual results 2009, 21 - 22.
Full version
Webinars
Using NSSE Data in Diversity, Equity, and Inclusion Assessment Practices
Christen Priddie and Cindy Ann Kilgo
April 19, 2022.
Recording
Your NSSE Institutional Report 2021: Step-by-Step
Bob Gonyea and Jillian Kinzie
September 23, 2021.
Recording
Highlighting NSSE 2021: New Offerings in a Year Like No Other
Alexander C. McCormick, Jillian Kinzie and Jennifer Brooks
September 30, 2020.
Recording
Learning & Assessment in Student Affairs: The NSSE Way
Dajanae Palmer, Project Associate & Samantha Silberstein, Project Associate
April 30, 2019.
Recording
Using NSSE Data in Student Affairs
Jillian Kinzie, Associate Director of the Center for Postsecondary Research, and Sarah Hurtado, Project Associate, NSSE Institute.
December 1, 2016.
Recording
Using NSSE data in accreditation
Jillian Kinzie, NSSE Institute Associate Director
November 8, 2011.
Recording
Using NSSE data in student affairs
Jillian Kinzie, Associate Director, NSSE Institute, and Tony Ribera, NSSE Institute Project Associate
March 24, 2011.
Recording
Improving student response rates
Tony Ribera and Brian McGowan
January 25, 2011.
Recording
Program and department level assessment
Jillian Kinzie
May 11, 2010.
Recording
Ideas for encouraging student participation in NSSE
Tiffani Butler and Tony Ribera
January 26, 2010.
Recording
Driving data down: Using NSSE results in department, school, and major-level assessment activities
Alex McCormick and Allison BrckaLorenz
September 15, 2009.
Recording
Using high-impact activities to maximize student gains
Todd Chamberlain
June 23, 2009.
Recording