
May 16, 2002

Turning the tables: How the evaluation process works

Deep in the recesses of the Cathedral's ground floor, four Pitt staffers and their boss, surrounded by half-stuffed campus envelopes, long for the day this month when the Mailing Services delivery truck will collect their term-long bookkeeping efforts.

On the first day of each term, the Office of Measurement and Evaluation of Teaching (OMET) mails some 3,000 teaching evaluation request forms to faculty members. Additional school-specific forms are sent to deans' offices for distribution to adjunct and part-time faculty on one-year appointments.

That's just the beginning. Between then and the post-semester pick-up of the tabulated evaluation results lie hiring, training and scheduling student proctors; checking the course numbers and other information on the forms for accuracy; tabulating the data from the evaluation questionnaires; stuffing envelopes with school-specific explanatory materials; making sure the confidential forms are addressed correctly; and, finally, ensuring that final grades are submitted to the Registrar's office before the forms are released.

Results are returned confidentially to faculty only after final grades have been submitted to the registrar, so that grading cannot be influenced by the evaluations.

Although the process consumes much of the OMET staff's time, the office's director emphasizes that it is not all her office does.

"We provide many services to the University community, including standardized test administration, test scoring and research consultation, in addition to overseeing the student evaluation of teaching process," says Carol E. Baker, OMET's director.

Since most Pitt classes are scheduled between 11 a.m. and 2 p.m. on Tuesdays and Thursdays, it is difficult to coordinate student proctors, who have class schedules of their own, so that faculty can get their preferred evaluation times, Baker said.

To help ensure confidentiality, OMET sends student proctors to distribute the evaluation forms. Proctors make sure the faculty member leaves the classroom and bird-dog the students while they fill out the forms. The proctor then collects the forms, seals them in an envelope labeled with the course information and returns the envelope to OMET.

OMET then compiles the survey results and summarizes them in a computer printout that includes the percentage of students who responded to each question, the mean (the class's average response) and a breakdown, by question, of the responses into the (typically five) rating categories.
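As a rough sketch only, not OMET's actual software, the per-question summary described above might be computed along these lines, assuming a five-point rating scale and blanks (None) for students who skip a question:

# Hypothetical illustration of the per-question summary; the function name,
# scale and input format are assumptions, not OMET's real tabulation system.
from collections import Counter

def summarize_question(responses, scale=(1, 2, 3, 4, 5)):
    """Return response rate, mean and a per-category breakdown."""
    answered = [r for r in responses if r is not None]
    response_rate = len(answered) / len(responses) if responses else 0.0
    mean = sum(answered) / len(answered) if answered else None
    counts = Counter(answered)
    breakdown = {category: counts.get(category, 0) for category in scale}
    return response_rate, mean, breakdown

# Example: eight enrolled students, one of whom left the question blank.
rate, mean, breakdown = summarize_question([5, 4, 4, 3, 5, None, 2, 4])
print(f"Responded: {rate:.0%}  Mean: {mean:.2f}  Breakdown: {breakdown}")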

OMET staff attach to the summary forms the students' open-ended comments and the results of any additional questions requested by the faculty member.

The report OMET sends to the instructor also contains discipline-specific comparative data. For example, in the Arts and Sciences, OMET provides comparative data based on a random sampling of fall term 1997 student evaluations of 157 undergraduate classes taught by tenured and tenure-stream faculty representing the natural sciences, social sciences and humanities.

"This gives instructors another point of reference," Baker said. "Also, we're responsible for compiling such 'norming data' for the schools, which we try to update periodically."

Classes with fewer than four students are not evaluated by OMET. Classes with four to six students are given an alternative, shorter form, which combines a reduced set of standard questions with the open-ended comments form, and OMET does not calculate a class mean in the results.
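The class-size rules amount to a simple threshold check. The sketch below only illustrates that logic as described in the article; the function name and labels are invented for the example:

# Hypothetical illustration of the class-size thresholds described above.
def evaluation_plan(enrollment):
    if enrollment < 4:
        return "not evaluated"
    if enrollment <= 6:
        # Shorter form: fewer standard questions plus open-ended comments,
        # and no class mean is reported.
        return "short form, no class mean"
    return "standard form with full summary"

for size in (3, 5, 30):
    print(size, "->", evaluation_plan(size))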

The office also handles the evaluation process for three of the regional campuses (Johnstown uses the Educational Testing Service), although, because of mailing costs, the results are mailed to a central regional campus location for distribution rather than to individual faculty addresses, as they are on the Pittsburgh campus.

–Peter Hart

