University of Pittsburgh

December 4, 2014

Teaching@Pitt: Assessing students’ higher level thinking


Designing an exam that lets students demonstrate critical thinking and that you can grade within a reasonable amount of time can be challenging. Critical thinking, also referred to as higher level thinking, involves skills that students are likely to transfer beyond the classroom, such as analyzing, drawing conclusions or defending arguments.

Many faculty choose to use essay exams to measure critical thinking skills because they assume that students can better convey complex ideas in writing. There is a common assumption that multiple-choice tests that can be graded by a Scantron machine are more appropriate when measuring lower level skills that require students to demonstrate memorization or only a basic understanding.

However, multiple-choice exams can provide a viable alternative for measuring higher level thinking skills. In fact, they have some advantages over essay exams. The fatigue that results from grading numerous essays makes it difficult for instructors to evaluate each one consistently. While rubrics can make grading written exams more efficient, the sheer volume of exams still can be overwhelming. One humanities instructor, for example, confided that of the several end-of-term essay questions she assigns, she grades only one per student because she cannot possibly tackle all of the work. At the same time, she insisted that written exams are the only way to determine whether her students have acquired the skills she intended to teach.

Grading fatigue is compounded when students are not prepared to respond to the essay questions. Most instructors who have assigned essays are familiar with the dreaded “brain dump” that occurs when students spew every idea that occurs to them onto a page, hoping that some morsel of information will hit a target for partial credit.

Another weakness of essays is that it can be difficult to distinguish the quality of the writing from the level of thinking of the writer. For example, a faculty member in the social sciences enjoyed organizing each lecture as a story, presenting the main concepts and then building to a crescendo while emphasizing the difficult choices that players made at the height of political drama. His objective during exams was to place students in similar scenarios and ask them to justify the choices that they would make. Instead, close reading revealed that many students merely parroted, however eloquently, the decisions of the actors from the class examples, demonstrating a basic understanding of the ideas but not the critical thinking he expected.

An alternative to essay and basic multiple-choice exams is a blended approach that uses primarily multiple-choice questions along with a few short-answer questions that challenge students to demonstrate higher level thinking. These exams take more time to develop, but they are more efficient to grade. Here are steps you can use to develop this type of test:

1. Develop a case, scenario, chart or problem that is different from those presented in class, but requires use of the same skills to resolve.

2. Prepare multiple-choice questions that address the skills leading up to the resolution of the problem. These questions should assess whether your students have the basic understanding of the issues, concepts or process needed to make further decisions. The incorrect answers that serve as distractors can reflect errors in thinking that students typically display. These lower level questions make it easier to assess whether students have the prerequisite knowledge to address the problem.

3. Present plausible options for the final question; those choices should reflect the types of responses that students might argue if the question were presented in class. Tell students to select the best answer.

4. After students have chosen their answer to the last question, ask them to justify in three sentences why they chose it, or to explain why that approach is better than the other options. Asking students to explain their reasoning within a limited number of sentences requires them to summarize rather complex ideas, and prevents a randomly selected correct answer from earning full credit.

The short explanations from students are a key component of this approach to assessment. Reading students’ explanations will give you a solid understanding of their ability to think critically about your topic. Compared to the amount of time required to grade essays, this blended approach to exams can significantly reduce the time you spend grading.

Carol Washburn is a senior instructional designer at CIDDE.