University of Pittsburgh

January 24, 2013

Online course evaluations get mixed results

The first semester of optional online course evaluations produced mixed results: faculty in most Pitt courses switched from paper to digital assessments, but only about half of Pitt students completed the new digital forms when they were offered.

The results were no surprise: a Pitt pilot study produced similar numbers, matching the experiences of other universities that have made the switch, said psychology faculty member Christian Schunn, who co-chaired the Advisory Council on Instructional Excellence (ACIE) that recommended Pitt move to digital evaluations. ACIE already is consulting with Pitt faculty and administrators on ways to improve the student response rate for fall 2013, when only digital evaluations will be offered.

“Our charge from the provost is to keep an eye on it and make some mid-course corrections if necessary,” said Schunn.

According to Nancy Reilly, director of the Office of Measurement and Evaluation of Teaching (OMET), 70 percent of fall 2012 courses were evaluated by electronic means.

On the Pittsburgh campus, 2,723 classes used online surveys versus 1,002 classes using paper.

Of the regional campuses, only faculty at Bradford favored paper over digital evaluations, using 177 traditional forms and 85 online forms. In classes at the other campuses, online evaluations were chosen more often: 133 to 59 at Greensburg, 284 to 66 at Johnstown and 65 to 40 at Titusville.

Across all campuses this fall, Schunn said students responded to 53,620, or 51 percent, of 104,350 possible online evaluations. Previously at Pitt and at other campuses using only paper evaluations, the student response rate typically has been around 70 percent.

Pitt did not track whether students answered more or fewer questions on the online forms than they had on the paper forms; Reilly said that would have been too labor intensive.

Schunn said ACIE wanted to see what influenced whether students completed the online survey. So during the fall evaluation period, students received emailed reminders about filling out the online forms; ads in The Pitt News and notices on the Pitt web site were used to provide additional reminders. Then ACIE tracked bumps in daily student participation rates that corresponded to each promotional method. The “pester emails” had a large effect on student compliance, whereas the newspaper ads and web site notices had virtually no effect, Schunn said.

ACIE also asked faculty who had chosen digital evaluations whether they had encouraged students to go online and complete the forms. Of the 600 faculty who answered the survey, most said they gave students no reminders. Those who did give in-class verbal reminders had more students completing the online evaluations, ACIE’s study discovered.

“We found that both talking about the importance of the surveys and reminding people to do it were useful,” said Schunn. “But the effect of doing that was more useful for graduate classes than for undergraduate classes.”

Other factors also affect online evaluation rates. Large lecture classes, which historically have lower attendance and thus fewer completed paper evaluations, might see a bump in responses when the forms move online, while smaller classes with high attendance might see participation drop when the forms are offered exclusively online.

“We are still discovering what kind of additional strategies we will do to try to bring up response rates,” Schunn said.

ACIE currently is trying to determine what fits into the varying evaluation rules and traditions across different Pitt schools. In some schools, Schunn pointed out, evaluations are optional for faculty, while other schools require evaluations. In some schools, only the instructor sees the evaluation results, while in others the results are available publicly. Some schools use the evaluation as one tool among many assessments, whereas other schools use the class evaluation as one of their most important faculty appraisal methods, Schunn said.

Department chairs and deans also were surveyed by ACIE. “The extent to which there was concern, it was almost universally about the response rates,” said Schunn.

Reilly noted that there are many advantages to the online surveys. “Students can do them at their leisure. It does not take up class time. We did go green. And there are money savings.”

Last September she estimated that savings in student proctor labor, as well as in supplies, would equal $35,000 of the approximately $85,000 spent each year to administer paper evaluations. She called that estimate “kind of conservative. We will know better when it is all said and done.”

Overall, she believes, the process went well. Her office currently is receiving more than four times as many requests for digital evaluations as for paper forms, she said.

“But that could change — it’s still early in the term,” she cautioned.

—Marty Levine

*

Faculty can continue to request online evaluations until April 15; the deadline to request paper-based in-class evaluations is Feb. 8.

• To request an online evaluation: Log in to my.pitt.edu, click the My Resources menu and select OMET Survey Request. After selecting the OMET link, a window will pop up with a list of your classes. Choose the class you would like to be surveyed. After verifying your class information, submit your request.

• To request an in-class evaluation: Follow the same steps as above. However, before submitting your request, click on Paper Based Survey Option. This will take you back to a list of classes. Choose your class again and a calendar will appear. Choose three possible dates and times for your survey, then submit your request. Instructors choosing an in-class evaluation process will receive a confirmation letter at their on-campus address with the date, time and location of the proctor-administered evaluation.

