Student-Written Exams Increase Student Involvement

What? Students writing their own exams? Yes, that’s exactly what these marketing faculty members had their students do. “The Student-Written Exam method is an open book and notes take-home exam in which each student writes and answers his or her own multiple-choice and short essay questions” (p. 32).

It’s an interesting idea that arose out of the authors’ desire to increase student involvement in learning and self-evaluation, minimize cheating, decrease exam stress, and make exam experiences more meaningful, among other goals. It’s an approach that can be used online or in class.

Even though students take all sorts of exams and quizzes across their college careers, most never get the chance to write one, and, as these authors point out, they need support in order to do so. The authors have prepared a detailed set of guidelines to accompany the exam-writing experience (the article includes a Web address for these guidelines). The guidelines clearly identify what content the student exam is to cover. They list chapter learning objectives to help students create questions on important content. They offer advice on writing multiple-choice and short essay questions and illustrate the advice with examples. They share Bloom’s taxonomy and encourage students to write challenging questions, again illustrating with examples. They include logistical details such as when the exam is due and how it should be formatted. And they include the grading rubric. In addition to the guidelines, the authors offer to review one multiple-choice and one short essay question when students start writing them.

This approach has one other unique feature: an exam feedback session. Students come to it with their completed exam and, on a separate sheet, one short-answer question and its answer. For an hour, student questions and answers are discussed. At the end of the session, students have 10 minutes to review and make changes to their answers.

Exams are graded on how well the set of questions covers the chapter learning objectives, how challenging the questions are, and the accuracy of the answers. In their classes of 25 to 35 students, the faculty did not find that the approach increased their exam grading time. And there was one unexpected grading benefit: “Our experience with this method also showed that it was less tedious to grade different questions than to read and grade the same answers multiple times” (p. 34).

The authors report that, in addition to being an evaluation experience through which students learn the content, the approach accomplishes several other learning goals. It encourages students to take responsibility for their own learning and evaluation. It allows students to word questions and answers in ways that are meaningful to them. Writing questions and answers causes students to engage with the content more deeply. Students reported that this assessment experience was less stressful, and it’s an approach that pretty well takes care of the cheating problem.

One of the biggest challenges of the approach is helping students learn how to write good questions. They don’t always see good examples on the exams they regularly take. And writing good test questions is hard, even for teachers. But here too, there is significant learning potential. Students are really being taught how to ask good questions, and that is an invaluable skill.

For instructors, the challenge is giving up control over the difficulty and content of the questions that appear on students’ exams. Despite the support provided, students didn’t always write challenging questions or cover all the topics the teachers felt should be covered. However, the trade-off was increased motivation. “It forced me to take more action and initiative while studying,” one student wrote. “Thinking of questions was a different way of learning” (p. 34). The authors also note that they found it informative to see what questions and problems students decided to put on their exams.

We tend to get stuck in ruts and narrow thinking when it comes to how we assess student knowledge. We have our favorite assessment approaches, which we use regularly. An exam alternative like this illustrates the viability of ideas that we may not have considered.

Reference: Corrigan, H., & Craciun, G. (2013). Asking the right questions: Using Student-Written Exams as an innovative approach to learning and evaluation. Marketing Education Review, 23(1), 31–35.

Reprinted from The Teaching Professor, 28.4 (2014): 3. © Magna Publications. All rights reserved.

Comments

  1. Bernard Gauthier

Many years ago, an undergraduate course I took gave us the opportunity to create an exam, much as this article describes. I remember the time I put into reviewing the materials and developing two questions that would allow me to summarize it all. I was proud of those questions (and the answers I developed) in a way that few other courses or exams ever matched. That I remember this episode so clearly some 30 years later is also evidence of the impact it had.

  2. Melissa Hudler

I really like this idea, but I wonder how it would work in a literature course, where much student learning has to be showcased via essays because of the subjective and interpretive nature of literary analysis. I can imagine that the use of critical thinking would increase, as it would be more challenging for students to create complex questions based on subjective material. The caveat about having to teach students how to formulate good questions reminded me of the Jeopardy-style tests I gave when I taught high school, in which I provided the answers and they had to come up with the correct question. This might be an effective way to ease students into this activity: give a Jeopardy-style test first, and then analyze their answers (which would be questions) for elements of effective and ineffective test questions. Hmmm... might have to give it a try!
