Flipping Assessment: Making Assessment a Learning Experience

If you’re a regular reader of this blog, you’re already aware that flipped instruction has become the latest trend in higher education classrooms. And for good reason. As it was first articulated by Bergmann and Sams, flipped instruction personalizes education by “redirecting attention away from the teacher and putting attention on the learner and learning.” As it has evolved, the idea of flipped instruction has moved beyond alternative information delivery to strategies for engaging students in higher-level learning outcomes. Instead of one-way communication, instructors use collaborative learning strategies and push passive students to become problem solvers by synthesizing information instead of merely receiving it. More recently on this blog, Honeycutt and Garrett referred to the FLIP as “Focusing on your Learners by Involving them in the Process” of learning during class, and Honeycutt has even developed assessments appropriate for flipped instruction. What’s been left out of the conversation about flipped classrooms, however, is why and how we might also need to flip assessment practices themselves.

The bottom line in flipped instruction is actively engaging students in higher-level learning during class. Although many instructors see assessment as a separate part of the learning cycle—a part that doesn’t typically involve students—there are ways to shift the focus of assessment from the instructor to the student as well as involve students in the process, thereby flipping assessment by making it a learning strategy. Here are a few suggestions for flipping assessments:

  • Create assignment/course rubrics with students. This strategy allows students to provide input on the standards by which they will be graded as well as promotes a deeper understanding of what the standards mean. Instructors and students involved in the discussion during the co-creation of rubrics standardize their concept of quality work, giving students a clearer understanding of what they are being asked to do and the level at which they should be performing. Inclusion in the creation of rubrics also motivates students to participate more fully in the learning process.
  • Have students fill in evidence of learning on their assignment/course rubric. Give students a modified rubric that articulates the highest achievement level and leaves a blank space for them to write in evidence of their learning. This flipped assessment strategy enables students to reflect on their learning and take an active role in the grading process by directing the instructor’s attention to their achievements. Instead of passively “receiving” a grade, students actively guide the instructor in assessing their work in a particular context, one that the students articulate for the instructor. This method, coupled with the previous one, allows students to participate in authentic assessment situations like those they might face in job performance reviews as current or future employees.
  • Grade with students in grading conferences. This unconventional strategy, much like flipped classrooms, actively engages students in learning during assessment. Having students sit with instructors while they grade takes the mystery out of how assignments are assessed, and it enables students to actively question, clarify, and understand why they are assigned their grades. Their involvement in the grading process also allows instructors to see where students misunderstand points on the rubrics or in classroom instruction. Grading becomes a collaborative activity in which both instructor and student learn, unlike the one-way communication inherent in conventional grading.

When I have used flipped assessments in my writing courses, students have responded positively. After participating in grading conferences, students reported that this grading experience was more personal, important, and valued, and that they felt more confident in revising their work. Students also felt that the grading standards were clear and fair as a result of co-creating and discussing the course rubric.

Students engaged in these flipped assessment strategies are reflective learners who generate evidence for their own assessments. They can take charge of how and why they learn, a major tenet of flipped instruction itself, or at least have a voice in that process. In this way, the energy of assessing their work shifts away from the instructor and toward the students, enhancing their learning in the process. Flipped assessment features a collaborative process where information flows between students and instructors instead of only one way. Finally, students are involved in the full process of learning, including the integral element of assessment, by their synthesis of standards and analysis of their own work. This is a powerful moment where pedagogy and personal/professional practices come together. When we flip our classrooms to be more focused on student learning and student goals, and when we consequently flip our assessment practices to foster agency in our students and help them develop the skills they need for providing evidence of their learning, then we’re mentoring them; we’re walking them through the processes that we, as teachers, need to enact daily.

References
Bergmann, Jonathan, and Aaron Sams. Flip Your Classroom: Reach Every Student in Every Class Every Day. Eugene, OR: International Society for Technology in Education, 2012.

Honeycutt, Barbi, and Jennifer Garrett. “Expanding the Definition of a Flipped Learning Environment.” Faculty Focus. January 31, 2014. Accessed April 8, 2015. https://qa.facultyfocus.com/articles/instructional-design/expanding-definition-flipped-learning-environment/

Susan Spangler is an associate professor of English at the State University of New York at Fredonia.

This Post Has 13 Comments

  1. Jaff Lawrence

    Great. I believe that, if well implemented, flipped instruction can reduce the massive failure we are experiencing in Mathematics.

    Susan, thank you so much for this article.

  2. Arturo Vazquez

    Dr. Spangler, Hi Susan,
    Do we have examples from different levels, say from a freshman Psychology 100 course up to 200-level sophomores? In my experience, and in the eyes of other colleagues, at this level we should have more control groups to prove that this is really good for all students. If you think I will be able to do this in my classroom, please show me the examples and the evidence that it really does what it is supposed to do; otherwise I could be called a flipper without the facts! You are an authority in the field, and I trust that what you are saying about assessment and flipping assessment in the flipped classroom really works, but there should be some examples: actual results, studies, and research designed with control groups and all the good science, like psychology. I trust you know where I am coming from, but just in case my dean asks what I am doing in the flipped classroom and how I know that it is really working, and whether the students are really getting it, compelling truth and evidence will convince my dean and, of course, the colleagues who say it is a bunch of BS. Thanks in advance, from a super-skeptical, next-level first-year professor of Psychology!

    1. Susan Spangler

      Hi Arturo.
      I used these methods with students in a first-year writing course and with a teaching methods course. My IRB did not allow for a control group, but I think you could certainly handle different sections differently. I used anonymous surveys to learn what students were thinking about these methods, and I trust that they told me the truth when they said that they learned a lot from these assessment methods. This is a "high impact" strategy, something that your dean may be looking for. You should read more about flipping your classroom to be ready for discussions with administrators.

  3. Len Olszewski

    I'm interested in the maximum class size this approach would work in. How much time would be necessary for each student to accomplish the goals of this methodology? Do you apply this to all students in a class, or just those needing the most help?

    1. Susan Spangler

      Hi Len.
      Our writing courses are capped at 20, and that's when I have graded with students the most. Co-creating rubrics could work with any size class, as could having students fill out their evidence on rubrics. The methods help all students in the class. Grading with students on a volunteer basis is certainly worth trying, or you might grade with everyone once and then see if anyone wants to continue.

  4. tengrrl

    Hi, Susan,
    My question is like Len's. I'm never sure how to scale the class creation of rubrics. I have enough trouble getting feedback to students without every class having a different rubric. How do you make this idea work on a 4/4 load?
    traci

  5. Susan Spangler

    Hi Traci.
    Great question. If you are teaching the same writing course, then I would probably give students a skeleton of the rubric with the course goals already outlined and then let them fill in the gaps. If you used a Google Doc, then everyone could have input from multiple sections of the course to create a common rubric. If you're teaching different courses, then you probably have slightly different rubrics because you have slightly different course goals. And of course you don't have to use all of these methods for every class. I would be interested in hearing how people experiment with trying different flipped assessment methods with different courses to see how the results compare.

    1. tengrrl

      These ideas make sense. I was thinking of multiple sections of the same course. I don't see myself teaching three sections of tech writing with three different assessment models. I'd be constantly confused. Starting with some basics that everyone negotiates could work though. I like the grading conferences idea the most. I had been heading in that direction, after doing mini-conferences for two terms now as students work on their drafts. Grading conferences seem like the obvious next step.

  6. Jackie Cesnik

    This is an interesting article.
    When I started in training and assessment, we had the time to meet individually with students to show them their assessments and discuss the results; this enabled us to identify students' strengths and weaknesses and know where their learning gaps were.
    Unfortunately, as a contract trainer, my casualised employment framework does not allow for this. I think this contributes to the tick-and-flick form of training and does not serve students well on their learning path.
