
June 11, 1998

LETTERS

To the editor:

Professor Nathan Hershey made several excellent points regarding student evaluations during his interview with the University Times (May 14, 1998).

I would like to add another one. A crucial variable that is apt to affect students' evaluations of the professor and the course is students' notion of just what constitutes a "good" teacher and a "good" course. For example, if a student's conception of a "good" professor is one who uses a straight lecture format relatively devoid of discussion between students and the teacher, professors using such a format are very likely to receive high evaluations. Professors whose teaching style emphasizes discussion, experiential exercises, and a Socratic approach aren't likely to receive favorable evaluations from students expecting and preferring a straight lecture format.

Another significant variable that affects student evaluations is students' conceptions/definitions of the course characteristics featured in the questions. Responses to how "organized" a course was, for instance, depend BOTH on students' definition of "organized" and on their personal expectations and preferences. Those who define "organized" in terms of a rigidly structured course, where the professor follows the assigned readings in a strict fashion and seldom deviates from the topics and time-frames specified on the syllabus, are apt to give the professor high marks for course "organization." Conversely, professors whose philosophy of teaching leads them to structure their courses in a more flexible fashion, allowing for varying degrees of spontaneity based upon students' differential interest in topics, relevant current events and the availability of guest speakers, are not likely to receive high evaluations from students who define "organization" in the way previously cited. Unfortunately, the student evaluation instrument used does not even attempt to determine students' expectations, preferred teaching style and conceptions of key terms used in the questions. Data on these important variables would significantly improve the interpretation of student responses.

Michael Klausner

Associate Professor of Sociology

Bradford Campus

-----

Carol E. Baker, director of the Office of Measurement and Evaluation of Teaching (OMET), responds:

Professor Hershey's and Dr. Klausner's comments provide the opportunity for me to remind faculty of the importance of not presenting the data generated by your student opinion of teaching surveys to others without an accompanying description. It is important to combine the summary printout you receive from OMET with your own summary describing the nature of the course, the approach to teaching you used, the level of students, the required/elective nature of the course, and any other information that will enable decision-makers to examine the student perceptions in context. One of the advantages of the system in place at Pitt for collecting students' opinions of teaching is that you are given the opportunity to add items that will provide you with information about student reaction to your specific teaching approach and data about student expectations, learning styles and/or preferences. Items can be selected from the additional item bank or you can write your own. You can tailor your survey to provide feedback that will be most useful to you.

This flexibility is rare among large universities like ours, where over 3,000 classes are evaluated each term. Results from a number of research studies indicate that when faculty and students are asked about the characteristics of good teaching, a common core of factors is identified.

Instruments have been developed in most of the schools within this University by committees of faculty and students who were careful to include items that represent these common factors and also reflect the unique nature of the curriculum. We must take as much care when interpreting our results for others.

With respect to examining the data returned to you, although class means are presented on a number of the instruments in use across the University, you can gain a more complete picture of student perceptions by also examining the variability in your ratings, as Professor Hershey did. Some variability is to be expected across students, but if ratings are very disparate we indicate that on your printout so that you can discuss possible reasons in your descriptive statement. The responses to the open-ended questions may also help explain the variability you see in the ratings and can be extremely useful for you in planning future courses.

Since students' opinions provide only one source of information about your teaching, it is encouraging that the Senate Education Policies Committee annually requests an update about both student and peer review of teaching policies in each of the academic responsibility centers. As peers are used more regularly to provide data about those aspects of teaching that they are in the best position to evaluate, you will obtain additional feedback about your teaching.

In the recent norming of the revised Arts and Sciences student opinion of teaching questionnaire, data from a random sample of classes indicated that 82% of the students probably or definitely would recommend the course to other students and 83% probably or definitely would recommend the instructor to other students. Believing as I do that students' opinions are valuable and valid, I am impressed and encouraged by these results!
