University of Pittsburgh

November 20, 2008

Teaching Excellence Fair: What worked? What didn't?

Pitt’s eighth annual Teaching Excellence Fair, held Nov. 5, included presentations from winners of 2007-08 innovation in education grants and conversations on teaching methods and techniques with faculty, as well as workshops and technology demonstrations led by Center for Instructional Development and Distance Education (CIDDE) staff. The event was sponsored by the Provost’s Advisory Council on Instructional Excellence and coordinated by CIDDE.

This year, many of the sessions were recorded as webcasts, which are posted online at

“I tell faculty the one most important thing you can do to enhance your teaching is to reflect on the session afterwards,” said CIDDE associate director Joanne Nicoll at the teaching fair.

“Think about what worked and what didn’t work. Make some notes, think about what you can do differently the next time. Maybe you are trying a different teaching method or strategy or activity in your classroom. How did it go? Don’t just give up if it didn’t go well. Ask yourself what can you do better the next time.”

That was one of the strategies Nicoll discussed in her forum “Ways to Enhance Your Teaching.” The goals of the session were for graduate students and faculty who are early in their careers to compare the advantages of various teaching practices and to identify the practice or practices best suited to the instructor’s course goals.

Nicoll recommended:

• Employing systematic ratings by students as supplements to the University-wide Office of Measurement and Evaluation of Teaching (OMET) end-of-term student evaluations.

• Using “master teachers” or senior faculty working closely with less-experienced instructors.

• Analyzing in-class teaching videotapes.

• Attending workshops that explore various methods of instruction.

• Using systems for faculty to assess their own strengths and areas that need improving.

Student ratings

“As a faculty member you don’t get that OMET rating until the term is over. So what is more helpful is informal classroom assessment that you can do more frequently during the term, for example, a mid-course review,” Nicoll said. “You can do it any time and it’s anonymous,” so students are comfortable with being honest, she said.

“It can be feedback on content or teaching methods. Maybe you’re trying some collaborative learning, some group learning and you don’t know whether students are liking it or not or whether they’re learning or not. This can be a good way of gauging that.”

Nicoll cited some ways to solicit student feedback during the term, including:

• The one-minute ungraded paper, where students are asked to identify briefly the most significant things they have learned, the major questions they have and how well the instructor is teaching the material.

“You can do this at the end of a class by asking students to summarize what they learned in that class,” Nicoll said. “Or, you can focus on a homework assignment or a lecture to see if it was effective.”

• Changing the prompt from time to time to elicit a variety of responses. For example, ask: What was the most illuminating example or the most surprising information or the most disturbing idea in today’s class?

Over-using this technique can make students think it’s a gimmick or a pro forma exercise in polling, Nicoll cautioned.

• The misconception/preconception check, whose objectives are to determine what misconceptions students hold that might stifle their learning and how deeply embedded those misconceptions are.

A simple questionnaire can elicit this information, Nicoll said, but it’s best to have an experienced colleague review the questionnaire so it doesn’t sound patronizing, threatening or obvious.

The technique is especially useful in courses that deal with controversial or sensitive issues, because it can bring to light stereotypes and other notions that inhibit learning, she said.

• The muddiest point check, that is, asking students what they find least clear or most confusing about a particular lesson or topic.

This technique is well-suited to large, lower-division classes to determine how widespread particular muddy points are, Nicoll said. The instructor can sort the student responses into groups of related points and respond to or clarify them in the next class meeting.

Experienced instructor as mentor

“Many departments, when new faculty come in, assign a mentor to a new faculty member, not only for content, but also for teaching,” Nicoll said. “We expect faculty here to be good teachers, so it’s useful to have someone to help them by doing some formative peer review.”

That peer review process includes the mentor and instructor meeting to discuss the course and syllabus; identifying the goals of the class; observing the class, and taking concrete notes of classroom events, she said.

“You don’t just show up and observe someone. You need to find out beforehand what the goals of the class are; you need to have a discussion about the syllabus, about the desired learning outcomes. You need to take concrete notes so you can give specific feedback: identify strengths, recommend some enhancements, create an action plan,” Nicoll said.

CIDDE has developed guidelines to enhance the peer review process that are available at

If no mentor is available, instructional designers at CIDDE are trained to fill that role, she added.

Videotaping of classroom teaching

“CIDDE provides videotaping services to graduate student teachers,” Nicoll said. “Our TAs and our TFs say this is the most important thing to them; it’s how they best learn. We videotape a class, then an experienced teaching fellow at CIDDE helps them analyze that tape for what’s working and what’s not.”

The analysis is crucial, she said. “Probably you won’t gain as much just watching yourself. But to have someone else watch it with you, someone you trust, a colleague or a CIDDE instructional designer who would meet with you ahead of time, is more beneficial.”

CIDDE instructional designers have a graduate-level background in learning theory — how people learn — and instructional theory — how to help people learn, Nicoll noted.

Workshops on teaching methods

CIDDE hosts a number of instructional development workshops. “From the educational research, these workshops are a good beginning, but they’re only a beginning. They can give you some new ideas about how to do different things in your classroom. But it’s important to have support as you go on to use different methods and strategies,” Nicoll pointed out. “You can get that support here at Pitt with our instructional designers.”

Examples of teaching workshops include: “Developing Teaching Portfolios,” “Interactive Teaching and Learning” and “Best Practices in Online Teaching.” (See the CIDDE web site for more information.)

Systems to assess your own skills

“These systems can give you some direction and focus, can help you review your syllabus — what’s working, what’s not — and review your OMET forms,” Nicoll said.

“When I review a faculty member’s OMET ratings, I don’t look at numbers. What I want to see is the open-ended responses that students make, such as ‘instructor gave a lot of useful examples’ or ‘instructor did not give good examples’; ‘instructor was not available enough outside of class’; ‘instructor showed enthusiasm for the subject’ or ‘instructor lacked enthusiasm,’” she said.

“Then I do a content analysis. I take them question by question and look at the comments. The strengths, the good things are going to bubble up to the top, and areas that need to be enhanced will bubble up, too. So a content analysis is one way to make sense out of those open-ended comments. You’ll get a couple of absolutely clear directions from a few of your students of what to do.”

Nicoll cautioned against putting too much weight on the “outlying” comments.

“They didn’t like anything you did; they don’t like the way you wear your hair, the clothes you wear. Forget those,” she said. “Look at the positive things and try to do more of them. Then look at some of the areas for improvement that the students suggest: There were no transitions; lecture information didn’t come across; teacher never lets us do anything; there was no engagement.”

To counter that last objection, CIDDE personnel will help instructors use more active learning and other classroom organizational techniques, Nicoll said. “You are the experts and you have to remember you’re dealing with naive learners. The more organization and structure you can provide, the more easily they will learn and the more effective you’ll be as a teacher.”

Other ways in which CIDDE instructional designers can help faculty include assisting with course development and revision.

“You’ll be more effective in the classroom if you have solid planning and incorporate ways to engage your students,” Nicoll said. “One thing I wanted to get across is that teaching enhancement and course development go together: You can’t separate the two.”

Instructional designers also can help develop new teaching methods and strategies, observe faculty in class and give feedback about how well a new strategy is working, Nicoll said.

“Observing and talking about it afterwards is called clinical supervision. It’s a collegial relationship. We have buy-in. We can work with you over an entire term or week by week. Or, we’re available over the phone or in an email, to give you advice if you wanted to try, for example, a new wiki,” a collection of web pages designed to allow students to contribute or modify content, she said.

“Our philosophy at CIDDE is we construct instructional technology based on the learning outcomes for the students: creating objectives, learning activities and assessment. How are you going to measure how well the activities are meeting the objectives? These are all correlated. Is a wiki really going to work here? Let’s not do a wiki if it’s not going to meet the instructional needs and goals.”

—Peter Hart

Filed under: Feature, Volume 41 Issue 7
