Thursday, February 10, 2011

My teacher is trustworthy, loyal, helpful, friendly, courteous...

I first encountered student evaluations at Oberlin 37 years ago last fall. At the end of the semester, a few of my professors passed out forms that they had designed themselves, in which they invited their students to critique their performance. I don't remember what the questions were, but they required thoughtful, paragraph-long answers. The professors handed them out and collected them themselves. That's because they intended to read them immediately and use them, if possible, to improve their teaching. Whether anybody else ever saw them I have no idea.

Fast forward to my first full-time teaching job eleven years later. I was at a large state university on the west coast. Student evaluations were now required in all classes. The form asked a series of multiple choice questions that were answered by filling in a "bubble" with a no. 2 pencil. They asked things like "The instructor is well prepared for each class session." A. Strongly agree. B. Somewhat agree. C. Agree. D. Moderately agree. E. Faintly agree. F. Neither agree nor disagree. G. Faintly disagree. H. Moderately disagree. I. Somewhat disagree. J. Strongly disagree. K. No opinion. They were handed out and collected by a student volunteer, while the professor left the room.

Students had about 10 minutes in which to answer about 30 such questions, and, if they had any time left, they could also respond to a few questions on the back of the form that required written answers. Most didn't bother, having been taxed to the maximum by having to decide whether they "somewhat agreed" or only "faintly agreed" with the statement that the instructor used clearly established criteria to evaluate their work. The results of this exercise in hasty judgment, based on criteria that were neither clear nor established, were tabulated in a computer printout, which went into each instructor's permanent file. These results constituted evidence of "teaching effectiveness." Careers hung in the balance.

The experience I describe is, of course, familiar to anybody who has either attended or taught at any American college or university during the last three decades. Answers to multiple choice questions on a computerized form are used to determine whether A is a better teacher than B, because A "begins and ends class on time," while B doesn't.

I was therefore deeply gratified to hear someone (I didn't catch the name) point out on NPR the other morning that the universal dependence on these forms has had one rather obvious, and predictable, result. The scores that students give their professors directly correlate with the scores the professors give them. In even simpler language, the better the grades a professor gives, and the less work he or she requires, the better the student evaluations that professor will receive.

Given how obvious this is, and how long people have had to think about it, I am a little taken aback by the surprise occasioned by recent studies suggesting that most American college students are not getting much out of their education. After all, if they were challenged to write 20 pages a semester and work three hours outside of class for every hour they spent in class, and if they were given C's on their papers when they were convinced they deserved A's, their professors would be fired.

Now, of course, we are all accustomed to evaluation fatigue. We cannot even call Sears to arrange to have a dishwasher installed without being invited to take a survey about whether we found our conversation with the service representative "excellent," "good," "average," "below average" or "poor." I have spoken to enough people working in service jobs to know that there is only one acceptable answer to such questions. If the 30-second conversation we had with them is rated anything less than "excellent," heads will roll. I just hope I haven't gotten anybody fired by refusing to take the survey. The fact of the matter is, I've already answered that question. Or maybe it's already answered me.

4 comments:

  1. I have long thought that the sort of student evaluation most commonly used in colleges and universities isn't worth much. In the calm excellence of one's 19-year-old wisdom, how is one really meant to judge one's professor?

    Oh, OK. I have also often compared this to asking the lamb chop just how it wants to be cooked.

When I was an undergrad, I routinely gave every professor the highest possible marks, just to demonstrate my contempt for this sort of evaluation. If you are a student who is reading this, I strongly urge you to do the same. Skew the database and mess 'em all over. Do it twice, if you can figure out how.

  2. This sort of evaluation process, if used in isolation and if done without prior consultation with the individual faculty, is a badge of unseriousness for the administration.

I created and used student eval forms for my various courses before our Office of Curricular Affairs did theirs. Major challenges are (1) participation and (2) avoiding the beauty contest. For (1), I stapled the eval onto the back of the exam, and as students turned in the exam, I'd note whether they put the eval in a separate pile. If not, I'd ask them personally to please turn one in. Yes, some were blank, but I usually got ~70% participation. For (2), I asked not only for the "rank from 1-10" but also for statements of strengths and suggestions for improvement.

    Now, our Office of Curricular Affairs does this online. Each year, I approve the wording of all questions in advance.

Now, the part that is unserious is the over-reliance on student feedback without faculty peer review. This abdicates faculty responsibility for curriculum content. Only if faculty sit in on each other's lectures and review exams and syllabi can they exercise informed judgment about teaching. But this is hard work and time-consuming. Much easier is taking the student temperature and using that as a proxy.

I have garnered a couple of teaching awards in my school, but each year I'll get a couple of critical comments. Out of a class of 175, even Mother Teresa would get a negative comment from at least one or two students. But 30-40% negative student reviews is a symptom IMO, and 90+% negative reviews is a guide to action.

  3. "I am a little taken aback by the surprise occasioned by recent studies suggesting that most American college students are not getting much out of their education."

    I haven't read the actual study. Have you?

    I have only read summaries like this:
    http://thechoice.blogs.nytimes.com/2011/01/17/academically-adrift/

    "It is worth noting that in measuring broad analytic and problem-solving skills, the exam does not assess how much students concentrating in particular majors — physics or psychology, for example — have learned in their respective fields of study."

Well, OK, then. What are we to make of the critique that 45 percent "demonstrated no significant gains in critical thinking, analytical reasoning, and written communications during the first two years of college"?

    Could a valid interpretation be that 45% of students who survive the first two years already possessed decent critical thinking, analytical reasoning and written communications skills when they were selected for admission? Remember, this isn't 45% of freshmen.

    I dunno. I believe college is what you make of it. Plenty of students can and do skate by.

For employers, a college credential in many fields actually doesn't guarantee training to the task. It is a badge saying that this person is a 'finisher.' To graduate college, you have to take and pass many long courses. Whether or not they are all demanding, you still have to reach the end with a passing grade, over and over again, with different material and different evaluators. Many college students were admitted with that aptitude already in hand.

Joel, it sounds like you're using the student evaluations appropriately. We also do a pretty good job of observing each other's classes at Baylor, and the write-ups from these visits count at least as much as the student evaluations.

    At the big state university I mentioned, OTOH, nobody ever came to observe my classes; the computerized student evaluations were the only measure of teaching quality they used. I really do believe that such policies have encouraged grade inflation and devalued the college experience for many people.
