Helping Students Who Are Performing Poorly

Unfortunately, all too often performance on the first exam predicts performance throughout the course, especially for students who do poorly on that first test. Faculty and institutions provide an array of supports for these students, including review sessions, time with tutors, extra practice problems, and additional office hours, but too often it's the students who are already doing well who take advantage of these extra learning opportunities. Reaching the students who actually need the help is a challenging proposition.

But here's an intervention, analyzed empirically, that did have a significant effect on the performance of students who did poorly on the first exam in two different courses: an introductory physics course that was part of an engineering physics program and an introductory oceanography survey course. Students in the physics course who scored in the bottom quartile on the first exam, and students in the oceanography course who failed it, received a personalized email from the instructor if they self-reported having studied more than the class median (six hours) and still performed poorly. The assumption was that this cohort had used inefficient study strategies and could benefit the most from the intervention. One email, sent to part of the cohort, indicated that the instructors were concerned about the students' performance and would like to meet with them. A second email, sent to the rest of the cohort, again conveyed the instructor's concern and contained specific study advice but no invitation to meet.

For the students who came to see the instructor (and not all of them did), the 15- to 25-minute meeting started with a discussion of how they had studied for the exam. Most said they tried to memorize everything because they figured that it was all important. “Many students have difficulty figuring out what’s important to learn, particularly in an unfamiliar domain, and their interpretation of what’s important differs from the instructors’ views.” (p. 82)

“Our intervention encouraged students to take a few specific actions dealing with deeper processing of the course material through self-testing, particularly as defined by the specific course learning goals.” (p. 77) The instructors showed students how the exam questions were based on course goals and recommended that students actively test themselves on the goals relevant to the content covered on the next exam. Students should develop their own understanding of the goals, coming up with explanations that made sense to them. They should match clicker questions, practice questions, and homework problems to the goals and answer the questions or work the problems before looking at the answers. (These approaches to studying had been recommended to all students prior to the first exam, and this was the study advice contained in the email sent to students who were not invited to meet with the instructor.)

“Both the meeting and email intervention groups significantly increased their average scores on Midterm 2 over Midterm 1 compared with those not in the intervention groups.” (p. 79) For some students the amount of improvement was “remarkable,” with one student improving from 49 percent to 91 percent and another from 16 percent to 80 percent. But maybe the students would have improved anyway: perhaps the first exam was their wake-up call, or they learned how to take the instructors' exams. A regression analysis revealed that “receiving an email from an instructor … produced the same increase in exam performance as simply knowing one had failed the first exam.” (p. 80) However, the meeting intervention produced a gain larger than regression toward the mean alone would predict.

The researchers also surveyed these students—online in the case of the oceanography course and with individual interviews in the physics course. Eighty percent of the meeting intervention students in oceanography reported changing their study habits “a lot” compared with 25 percent of the email students and 15 percent of the nonintervention students. Seventy percent of the physics students reported that they were testing themselves, targeting the learning goals, consulting the book, and attending review sessions. Of special note was the fact that students in oceanography reported slightly less study time for the second exam (10 hours instead of 11). “These results imply that ‘studying harder’ was not generally responsible for the improvement shown by meeting intervention students.” (p. 81)

“We argue that the intervention described here worked largely because, in both these courses, serious effort … was made to establish clear and explicit learning goals and to ensure that all course elements (class time, homework, exams) were well aligned with these goals.” (p. 82) The number of students involved in this study was small, but even so it's an interesting model that made a significant difference for some of the students most in need of help.

Reference: Deslauriers, L., Harris, S. E., Lane, E., & Wieman, C. E. (2012). Transforming the lowest-performing students: An intervention that worked. Journal of College Science Teaching, 41(6), 76–84.

Reprinted from The Teaching Professor, 27.10 (2013): 4. © Magna Publications. All rights reserved.