“In this article, we describe an easily adoptable and adaptable model for a one-credit capstone course that we designed to assess goals at the programmatic and institutional levels.” (p. 523) That’s what the authors claim in the article referenced below, and that’s what they deliver. The capstone course they write about is the culmination of a degree in political science at a public university.
The course is designed to assess the acquisition of three skills—critical thinking, written communication, and oral communication—across the 10 courses that make up the political science major. The course also aspires to “expose students to a holistic review of political science as a discipline, reviewing the broader themes that link the various subfields together; and allow students to reflect on their experience in the major and consider future applications of the major’s themes and skills to a variety of civic and professional contexts.” (p. 524)
A variety of innovative assignments are used to accomplish these goals. The primary activity is a simulated academic conference. Students select a paper they have written for one of their political science courses and prepare it for presentation in this course. The instructor organizes the papers into panels that are presented during four of the eight weeks of the course. The instructor chairs these panels and facilitates a wide-ranging discussion of the papers. The goal of the discussion is to raise questions that pertain to central issues within the field such as power, citizenship, accountability, and legitimacy. The papers are also used to assess critical thinking and writing skills. Each paper is assessed by the instructor and two randomly selected students. All three reviewers use a detailed rubric contained in the article. The instructor uses another rubric (also contained in the article) to assess the oral communication skills displayed in this presentation and in the learning-through-teaching activity described below.
Three other activities contribute additional assessment data. In a course mapping exercise, students rate each of the 10 major courses in terms of how well it enhanced the four key learning goals expressed in the departmental mission statement: critical thinking, written communication, oral communication, and understanding the discipline. Students also complete an open-ended exit survey “that asks them to anonymously and candidly evaluate the strengths and weaknesses of the program and faculty, and to make recommendations for future development.” (p. 525)
Finally, students complete a learning-through-teaching activity. For this activity, a pair of students makes a 30-minute presentation and facilitates a discussion of it with groups of about 10 beginning students enrolled in a large 100-level American government course. Students may do the presentation on a topic of their choosing, but it must contain substantive content and engage students in discussion.
Besides describing the course design, the authors share the assessment results it produced and, in doing so, demonstrate the value of the data a course like this can generate. For example, assessments of students’ critical thinking, writing, and oral communication skills showed that communication skills were consistently lower than departmental expectations. In reviewing the course mapping data, the authors discovered that students perceived only two of their major courses as enhancing their communication skills.
The authors write candidly: “The results from assessment through the capstone have illuminated both programmatic strengths and weaknesses. Maintaining the status quo on strengths is an easy task. However, taking action to address the weaknesses is a more significant undertaking.” (p. 527) To redress the oral communication deficiency, faculty members agreed to include more oral exercises in their courses, although content and class size make this difficult. It was this feedback that encouraged the development and implementation of the learning-through-teaching activity in the capstone. Departmental faculty also decided to piggyback onto a recent university general education requirement for a public speaking course.
There has been considerable faculty resistance to programmatic assessment. How will the data be collected, and how will they be used? Those concerns are legitimate, but an article like this shows that data can be collected through viable processes and then used to benefit students, faculty, and the program. “Using results generated by the capstone, our department is building a culture of assessment that facilitates across-the-board programmatic enhancement and boosts student learning opportunities.” (p. 528) As the authors note in the opening quote, this is an adoptable and adaptable course design model—and, we would add, not just for political science degree programs.
Reference: Sum, P. E., and Light, S. A. (2010). Assessing student learning outcomes and documenting success through a capstone course. PS: Political Science and Politics, 43(3), 523–531.
Excerpted from The Teaching Professor, 25.6 (2011): 7.