By MARTY LEVINE
An impending finals week like no other has given Pitt faculty members the chance to step back and reassess the shape and purpose of their final exams.
Mike Bridges, director of the University Center for Teaching and Learning’s Teaching Commons, hopes that faculty, given the chance, will consider moving away from single, high-stakes final exams that account for a majority of the class grade.
Instead of placing the emphasis on getting students to repeat raw information they have absorbed across a semester (or on the night before the test), exams are best when they concentrate on the application, analysis and synthesis of information, he suggests.
The center offers a number of alternative test-formulation ideas for remote assessments, including:
Open-book exams. Consider allowing the use of notes or open books, since real-world applications of knowledge depend on “knowing what resources to use and in what circumstances,” he says, while basic information, such as statistics and formulas, is readily available to everyone on the Internet anyway.
Since asking for essay answers can be impractical for a large course, consider asking students for brief written answers concerning lessons imparted by the class, which demonstrate their learning process.
Require students to explain how they arrived at the answer to one of the test questions — the one they felt was the toughest.
Ask students to explain which question on the test was most applicable to the career they intend to pursue.
Include a question with an erroneous answer and ask students to explain where the process of arriving at that answer may have gone wrong, and how to fix it.
Ask students to explain how some part of the course changed their way of thinking about its subject.
Ask students to take an exam question and explain how the knowledge it requires applies to practical work in the field.
Maintaining academic integrity
“How do we administer exams and do it in an online environment and do it with rigor and academic integrity?” Bridges asks.
Since the pandemic sent universities all over the country into online-only classrooms in the spring, units, departments and individual faculty members at Pitt have been bombarded by sales pitches from companies selling online test-proctoring software and services, he says.
Remote-proctoring options can include identity verification, ranging from matching test takers’ faces to their Pitt ID cards with facial recognition software, to employing keystroke algorithms that compare the typing pattern collected from students early in the course to the one used on testing day. “The claim is, it’s almost like a fingerprint,” Bridges says.
Other remote-proctoring companies offer to:
Lock down students’ browsers to keep them from using the Internet during a test.
Record students’ keystrokes to make sure they aren’t emailing a friend for answers.
Record all the activity on their desktops to see if chats or Internet research are happening during a test.
Use a webcam to scan the student’s environment in 360 degrees to see, say, a board on the wall where information is posted or notes and books open on a desk.
Record the test-taking session to detect eye and lip movements that may indicate communication with someone else during the test.
Some companies even offer live proctoring, employing a person to sit and watch students on screen as they take their tests.
Because no test-proctoring software is licensed through Pitt IT, the University has no data on how many departments or individual faculty might be using such tools, says Brady Lutsko, spokesman for that department.
But Bridges says that remote proctoring offers no solution “that fits all the various needs across campus” and that it “does not eliminate cheating.” He believes it is more of a distraction than a help, putting an extra burden on students: Do I appear to be cheating, even if I’m not? What data are they collecting on me, who will see it, and how will it be stored and accessed?
Having taught for many years, Bridges says he has observed students holding their heads in their hands for a few moments while taking a test, talking to themselves, reading exam questions aloud, or looking up or off to the side to think — all behaviors that might raise a false red flag in proctoring software.
“We know that some of our students have challenges with technology” during tests, he adds, and proctoring software may add extra complications. “The challenge is to think: when is it appropriate and necessary?”
The bottom line? Students cheat when they are overwhelmed, unsupported and under-resourced, Bridges says. Testing isn’t about trying to catch cheating. It is about demonstrating understanding.
Marty Levine is a staff writer for the University Times. Reach him at email@example.com or 412-758-4859.