Some faculty agree course evaluations don’t provide enough feedback.
Toward the end of each semester, students are inevitably badgered by emails reminding them to do one thing: fill out course evaluations. While these notices can be a little tiresome, student course evaluations are a crucial tool both faculty and administrators use to assess teaching performance.
Course evaluations at Loyola are online surveys students fill out anonymously to review both their courses and instructors from the semester. The majority of the questions are multiple choice scales asking students to rate things like technology used in the class, overall effectiveness and constructive feedback from the instructor, with the opportunity to leave their own comments as well.
Deborah Goodman, a lecturer in dance and recording secretary of the College of Arts and Sciences' Faculty Forward union, said peer reviews, in which another faculty member sits in and observes a class, aren't rigorously conducted, leaving student course evaluations as many instructors' only form of feedback.
“That’s problematic because then those teachers don’t always get the options to grow professionally, which is what you get from peer review,” Goodman said. “It also sometimes has held teachers back when it comes to promotions and appointments.”
According to Goodman, full-time faculty are reviewed annually by their departments, while part-time faculty are evaluated only once, after they're initially hired. Even then, these evaluations often aren't conducted at all, something she said Faculty Forward is currently trying to address with the university in its negotiations.
Full-time faculty are “ordinarily” reviewed annually, with this review including a self-evaluation and a performance evaluation by either the department chair or academic supervisor, according to page 40 of the Faculty Handbook. The practices for reviewing part-time faculty are different between departments, according to page 41.
In 2019 and 2020, over 80 tenured professors were bought out by the university and left, The Phoenix previously reported. Goodman said the resulting shortage of tenured faculty has also contributed to the decrease in peer reviews.
“Tenured faculty are the people who are supposed to be doing this kind of service with the university, and there are just so few, so they’ve really fallen,” Goodman said. “The non-tenured faculty take up the work, it’s like an extra burden, but people take it up.”
Eric Chan-Tin, a computer science professor and vice president of Loyola's chapter of the American Association of University Professors, said that while practices vary by department, he undergoes no mandatory peer observation. Chan-Tin was recently promoted from associate professor, and as part of the promotion process he submitted his student course evaluation results, which he said was optional.
“So we get evaluated on three, at least I get evaluated on three aspects: Teaching, research and service,” Chan-Tin said. “So the teaching part I did include my student course evaluations, but I don’t know how much the committee that reviews it — how much they evaluate it, how much weight they put on it.”
Jacob Leveton, a part-time instructor of art history, visual culture and the humanities, began at Loyola last January. After he started, Leveton was peer reviewed by his faculty supervisor, who sat in on one of his classes. He said the feedback from the peer review was excellent but that student course evaluations are also vitally important.
In addition to the student course evaluations and the first peer review, Leveton held his own mid-semester review with students during his first year at Loyola. He said as someone driven not just by good teaching but good mission-driven teaching, he felt the only way he could align his teaching with student experience was through an informal mid-semester review.
“I spent a lot of time last summer reading through those evaluations and built what ended up being like a hugely successful introduction to art and visual culture course for [the Department of Fine and Performing Arts],” Leveton said. “That really arose from a wish to think rigorously with what students were giving and to really get outside of what I’m interested in to really center and uphold what students were interested in.”
While student course evaluations do provide valuable feedback to instructors, they're also subject to student biases. Despite the evaluations' widespread use across higher education, their tendency to be biased against women and people of color is well documented.
Goodman said she doesn’t think faculty hate the student course evaluations but does think they’re still a little problematic for faculty of color and women.
“I’ve heard people say what they’re reading are biases and not something about the class or useful information,” Goodman said. “And this is where it’s like it would be nice if students understood, when you fill out the evaluations, this is the natural way that biases play in. Just be mindful when you’re doing this, are you playing into any of these biases?”
Research has shown students tend to be more critical the worse their grade is, potentially turning the evaluations into simply a means for exacting revenge on tougher teachers.
A minimum of five responses is required before results are released to an instructor, according to Goodman and Chan-Tin. Chan-Tin acknowledged it's difficult to please every student in a class because of differing learning and teaching styles, and Goodman said that while she appreciates the feedback, the only students who comment tend to be those who either loved or hated the class.
“If you have 50 students, probably one or two might not like you,” Chan-Tin said. “Might be because they don’t like your teaching style, they might not like you personally, you know, which, I mean, again you cannot please everybody so that’s fine. But if you have 48 students that don’t like your teaching, that means it’s you and not the students.”
Vice Provost for Faculty Affairs Markeda Newell told The Phoenix in an email statement the university appreciates the time students take to fill out the evaluations as they are integral to providing faculty constructive feedback and evaluating their performance.
“We are all committed to ensuring faculty are supported in their teaching, and course evaluations provide insight into the excellent teaching that is occurring across the University as well as the types of support faculty need in order to provide students with a rich learning experience,” Newell wrote.
Goodman said it's a problem that student course evaluations are often the only way many teachers are evaluated, and she said the university recognizes this. Chan-Tin said student course evaluations are the main metric in many departments, but other measures like peer observation can provide important feedback students can't.
For now, Goodman described the results of her semesterly evaluations as “a source of joy,” while Leveton stands behind his student course evaluation record and said students should feel the time they spend filling out the surveys is valued.
Hunter Minné wrote his first article for The Phoenix during his first week as a first-year at Loyola. Now in his fourth year on staff, the Atlanta native is studying journalism, political science and environmental communication alongside his work at the paper. For fun he yells at geese.