University of Pittsburgh

September 27, 2012

Student opinion of teaching surveys going online

Early indications show that Pitt faculty are embracing the transition to online student opinion of teaching surveys.

Last spring’s bomb threats forced an early introduction to online surveys for some faculty who had to use them when classes were canceled or students had left the campus.

This academic year, faculty have the option to schedule an in-class paper survey or have the same survey sent to students via an emailed link.

Next fall, however, all the surveys will be administered online. The decision is the result of a pilot study and a review of other schools’ experiences with online surveys.

Nancy Reilly, director of the Office of Measurement and Evaluation of Teaching (OMET), said requests for the online surveys this term are far outpacing requests for the in-class version.

OMET typically gets about 3,000 survey requests in the fall term from Pittsburgh campus faculty. Reilly said that, as of Sept. 24, faculty had requested 2,308 surveys. Of those, 1,653 (72 percent) are for online surveys and 655 (28 percent) are for in-class surveys.

This term’s deadline to request an in-class survey is Oct. 5. The portal through which faculty can request online surveys will remain open through Dec. 2, she said.

Only the method of delivering the surveys is changing, Reilly said. Each school has its own forms; the electronic surveys are the same as their paper counterparts.

Faculty will continue to receive results electronically as a PDF file. The report will be the same, but now responses to the open-ended questions will be typed rather than handwritten, she said.


In addition to being more cost-effective than the paper surveys, the electronic surveys help conserve teaching time. Rather than interrupting a class to have a proctor administer the survey, students receive an emailed link. They can respond at their convenience via computer, smartphone or other mobile device.


Reilly said many institutions have been using online surveys for about a decade.

Two years ago, Provost Patricia Beeson requested that the Advisory Council on Instructional Excellence (ACIE) study the issue of moving the student opinion surveys online. Based on ACIE’s recommendation, a pilot study was conducted last fall.

The pilot covered 88 tenured instructors in chemistry, economics, English, history of art and architecture, psychology, engineering, education and the Graduate School of Public and International Affairs.

A total of 133 courses were surveyed online; 43 of the courses had been taught by the same instructor the previous academic year. In addition, 10 Pitt Greensburg faculty members who were teaching multiple sections of a course had one section surveyed online and another surveyed on paper.


Analysis by OMET staff and psychology faculty member Christian Schunn found no significant difference in the average teacher ratings or the amount of written feedback between the paper and online surveys. However, the response was much lower for the online surveys: For the Oakland courses in the pilot, the response rate was 75 percent for paper surveys compared to 53 percent for the online surveys. The disparity was even greater — 76 percent vs. 39 percent — in a preliminary study from the 2010-11 academic year.

In its recommendation to the provost, ACIE did caution that lower response rates in small classes could increase variability in the mean ratings.

In order to get 100 percent participation in the paper surveys, all students needed to be in the classroom when the survey was administered, which was not always the case, Reilly pointed out.

Feedback from faculty involved in the pilot showed that 41.6 percent were either somewhat or very dissatisfied with the online response rate, while 44.6 percent were somewhat or very satisfied. Nearly 14 percent were neither satisfied nor dissatisfied.

In the pilot, most faculty who commented found little difference in the quality (52.4 percent) and quantity (54.8 percent) of feedback from the open-ended questions done online versus on paper.

Of those who perceived a difference, 29 percent found the quantity of feedback to be somewhat or much greater online; 34.9 percent found the quality of feedback to be somewhat or much higher online.

Reilly said the surveys have greater face validity if higher numbers of students reply, but that even if only one student responds, his or her opinion should count.

Low numbers of replies could be interpreted in several ways, Reilly said. Perhaps the students are contented. Replies tend to come from students with strong opinions — both positive and negative — rather than from those in the middle.

Or the lack of replies may indicate students simply aren’t interested in offering feedback about the teacher. In that case, “Why the apathy?” becomes the question, Reilly said.

Encouraging participation

Reilly said weekly reminders for non-responders are sent during the three-week response window. A spike in responses typically follows each reminder, so they are effective, Reilly said.

To further boost responses, Reilly said OMET plans to place ads in The Pitt News to reinforce the message that students’ opinions are important. Faculty can add their own reminders, she said.

Research has shown that seemingly simple actions such as telling students that their feedback is valued and helpful to their professors can motivate a higher response to the online surveys.

“Students want to help,” she said, adding that the vast majority of students answer the questions thoughtfully and that flippant or smart-aleck responses are rare.

Interestingly, while students’ comments tend to be more reflective when they answer on their own time rather than at a set time in class, the studies found the online respondents don’t tend to provide longer answers to the open-ended survey questions.


Reilly said a conservative estimate for initial savings (in student labor, paper and supplies) would be about $35,000 a year. Cost reductions will be higher, she said, after the surveys are transitioned entirely to the online system.

According to the ACIE report, the University spends $80,000-$90,000 annually to administer the paper-based evaluations, with about $60,000 of that amount attributed to the cost of labor. Shifting to the online surveys will enable staff time to be redirected toward other OMET functions.


The study reports and ACIE recommendations are posted at

—Kimberly K. Barlow


Filed under: Feature, Volume 45 Issue 3
