High-impact practices (HIPs) have been used in higher education for nearly two decades and encompass practices that are effective for a wide range of students, particularly students from historically underrepresented groups. HIPs help first-year students engage in deep-level approaches to learning that increase the retention, integration, and transfer of knowledge (Kuh, 2008). These practices can take many forms depending on student body characteristics and the priorities and strategic initiatives of the university (Kezar & Holcombe, 2017). HIPs can be embedded throughout coursework to support student learning and engagement by enhancing existing instructional practices; however, universities must continually and strategically analyze their use of these practices to ensure that ‘impact’ stays at the forefront.
Ten ‘high impact’ practices have been identified by the Association of American Colleges and Universities (AAC&U). For the most positive impact on students, universities should work continuously to define, refine, and scale which HIPs are implemented and how. Implementation science can support this process by allowing the refinement of processes, procedures, and/or other conditions that promote or inhibit the transfer, adoption, and use of these practices (McKay, 2017). The ‘Plan-Do-Study-Act’ (PDSA) model is one example of an implementation science inquiry cycle; it allows data on HIPs to be collected across coursework and examined for patterns and trends that identify strengths and areas for improvement (see Figure 1). The cycle is intended to help university leaders think systematically and make informed decisions based on data gathered from testing a change tied to specific goals (Blase et al., 2011; Carnegie Foundation for the Advancement of Teaching, n.d.). Understanding how these processes intertwine plays a vital role in the rigor and success of HIP implementation and informs the efficacy of practice (Peterson-Ahmad et al., 2021).
As Institutions of Higher Education (IHEs) use the PDSA cycle to continuously improve courses and their use of HIPs, strategic alignment with students’ individualized needs must remain at the forefront. Intentional data collection is a critical factor in planning the use of HIPs, as the findings of the PDSA cycle allow informed decisions to be made with fidelity, based on the data gathered (Shakman et al., 2017). Examples of potential data sources are shown below in Table 1.
Table 1. Examples of HIPs and potential data sources for the PDSA process.

| High-impact practice (HIP) | Example of the HIP in a university setting | Data sources collected for the PDSA process |
| --- | --- | --- |
| First-year student experience | Enrolling first-year students in seminars designed to help them navigate college life. | First-year student surveys about the college experience. |
| Common intellectual experiences | Organizing core classes around a common theme that provides experiences for course integration and application. | Focus-group and discussion-based data from students and faculty. |
| Learning communities | Forming cohorts of students that explore topics across different disciplines. | Student assignments showcasing learning from an assigned topic or reading from the semester. |
| Writing-intensive courses | Engaging students in a key writing assignment. | Student writing assignments that provide insight into levels of writing proficiency. |
| Collaborative assignments/projects | Integrating collaborative student groups into coursework. | Student reflections on the experience of working collaboratively. |
| Undergraduate research | Engaging students in the research process with a faculty member. | Student research presentations at which students receive feedback from peers and faculty. |
| Diversity/global learning | Allowing students to explore different cultures and world views. | Student written assignments, student-led research, or student reflections. |
| Service learning or community-based learning | Developing course activities that engage students with community partners. | Student-created community-based projects scored by faculty against the course’s learning objectives and rubric. |
| Capstone course project | Completing a culminating project that applies the knowledge and skills learned in the course and/or program. | Grade-based rubrics that define proficiency levels for course projects based on the course’s learning objectives, and/or student presentations of the final capstone project. |
| ePortfolios | Collecting student work over time in an electronic format. | Technology proficiency scales and/or grade-based rubrics that define proficiency levels based on the course’s learning objectives. |
The fidelity of programs such as HIPs must be monitored for accuracy and consistency of delivery to ensure that components are provided reliably across settings and individuals. Using the data collection measures identified above can support fidelity; however, multiple data points should be used so that a thorough and accurate assessment can be made and the implementation and impact of HIPs can be continuously improved. These data points drive the decision-making process and ensure that HIPs continue to serve students’ specific needs in the most impactful way.
Dr. Maria B. Peterson-Ahmad is an associate professor in the Teacher Education Department at Texas Woman’s University. She has numerous years of experience in K-12 public school settings teaching students in general and special education. Her research focuses on building strategic systems of teacher effectiveness, instructional coaching, technology in teaching, and the inclusion of students with mild to moderate disabilities in the general education inclusive classroom setting.
Dr. Toni Franklin is an associate professor of special education at the University of West Georgia. She spent several years teaching in K-12 schools in general and special education. Her research agenda focuses on the preparation of teachers to provide effective instruction and support to diverse students in an inclusive setting.
Angelica Addo, MEd, is a special education doctoral candidate at Texas Woman’s University. She has years of teaching experience in K-12 general and special education public school settings. Her research interests are in student-centered intervention strategies for students with disabilities and the implementation of interventions with fidelity.
References
Blase, K. A., Fixsen, D. L., & Duda, M. (2011). Implementation science: Building the bridge between science and practice [PowerPoint slides]. https://fpg.unc.edu/sites/fpg.unc.edu/files/resources/presentations-and-webinars/FPG-Blase-Fixen-Duda-Implementation-Science-02-02-2011.pdf
Carnegie Foundation for the Advancement of Teaching. (n.d.). PDSA (Plan-Do-Study-Act). Retrieved November 1, 2023.
Felten, P., & Clayton, P. H. (2011). Service-learning. New Directions for Teaching and Learning, 2011(128), 75-84. https://doi.org/10.1002/tl.470
Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Association of American Colleges and Universities.
McKay, S. (2017). Language learning defined by time and place: A framework for next generation designs. In J. E. Diaz-Vera (Ed.), Left to my own devices: Learner autonomy and mobile assisted language learning (Innovation and Leadership in English Language Teaching). Emerald Group Publishing Limited.
Peterson-Ahmad, M. B., Keeley, R., & Floyd, K. (2021). Translating special education research to practice. In B. Hott, F. Brigham, & C. Peltier (Eds.), Research methods in special education (pp. 277-291). Slack.
Shakman, K., Bailey, N., & Breslow, J. (2017). A primer for continuous improvement in schools and districts [White paper]. Education Development Center. https://www.edc.org/sites/default/files/uploads/primer_for_continuous_improvement.pdf