Transitioning from a paper course evaluation system to an online one is an effective way for higher education institutions to streamline their student feedback processes. However, like any change, the move raises questions and concerns around change management and transition. Explorance experts recently spoke with panelists from Rutgers – The State University of New Jersey and Washington College, who offered insights into their institutions' experiences transitioning online with the Blue course evaluation software.
Associate Director for Instructional Technologies
Christina Bifulco, Ed.D., Associate Director for Teaching and Learning Analytics
Matt Kibler, Ed.D., Assistant Vice President for Institutional Effectiveness and Analytics
How did the paper process function at your institutions?
Both institutions had a similar paper process. At Washington College, a staff member would spend about two weeks printing evaluations, preparing packets, and distributing them to faculty. During the last two weeks of the semester, faculty would have a student administer the course evaluations in class.
“Basically, the two weeks packaging the envelopes and four weeks scanning and fixing about 75% errant pen marks, changing comment boxes and things like that was all the analyst did,” said Matt Kibler, Assistant Vice President for Institutional Effectiveness and Analytics at Washington College. “That’s what it looked like to format reports, about 6-7 weeks to process when you include the time for administering them.”
Rutgers’ process was comparable. As a much larger institution, it relied on a small army of student, part-time, and full-time staff to prepare around 100,000 forms for 8,000-10,000 courses across the entire university.
“We were more like 2-4 months,” added Monica Devanas, Director, Teaching Evaluation and Faculty Development at Rutgers. “Getting all those packets back and arranged in numerical order, cleaning them out, scanning them ourselves, stuffing them back in, printing a statistical report, stuffing that back in, and shipping them back to their departments. That was how we started with paper, and it seems so painful now.”
What were the factors leading to an online course evaluation system?
The two institutions had similar requirements: both were looking for time savings and greater efficiency as they sought an online course evaluation solution.
Joseph Delaney from Rutgers elaborated. “We had a small number of online courses that needed online surveys. And for a while, they ran their own systems, so they weren’t using the centralized system. But we needed a way to accommodate those, and as more courses went online, we had to come up with a solution for that.”
Washington College was also looking for greater reliability; on the paper forms themselves, students’ handwritten comments were sometimes difficult to read.
“We had students who either forgot to hand in or lost the entire packet of course evaluations,” said Matt Kibler. “We had faculty members who would go through the semester and had forgotten that they were supposed to do it, so they just didn’t.”
What were the obstacles transitioning to an online system?
Both Washington College and Rutgers faced opposition from faculty when attempting to adopt an online course evaluation system. There were concerns that faculty would have less control over how evaluations were administered. There were also fears that there would be a significant drop in response rates.
“Response rates were a big issue at Washington College for convincing the faculty,” Matt Kibler continued. “On paper, we were getting a 90% response rate, and what I was able to prove with data is that with the online option, we were still getting about 80% response rates from the methods faculty members were using.”
Faculty at Washington College were also concerned that students would be more inclined to leave negative comments on an online evaluation form.
At Rutgers, Joseph Delaney observed that “there was a sense that the paper system was just a better process in some intangible way. The students could communicate more effectively on paper rather than online, or they would respond in a more appropriate way if it were on paper.”
What are the benefits of having the Blue evaluation system?
Both institutions noted the speed at which evaluations can now be processed: instead of taking weeks or months to analyze, evaluations are processed the day after the registrar’s office finalizes grades. Both Washington College and Rutgers found it easy to share information across departments and with faculty. Blue also delivered stronger analytics and reporting than their previous systems, improved the reliability of the data, and simplified the process overall.
“The speed of processing, of getting information back to the faculty, that was the primary benefit, and that was immediate to everyone,” said Joseph Delaney. “Where we went from 3 to 4 months of processing the paper survey, we are now getting the results back to the faculty the day after the semester ended.”
Joseph Delaney continued. “We went from a mediocre 70-75% of courses using a survey at all to having everybody able to use a survey. Every student got a chance to vote, not just those that appeared in class on that one day.”
At Washington College, “faculty can read the comments, and students can write as many comments as they want,” said Matt Kibler. “That has been a huge plus because our Tenure and Promotion Committee use the comments directly for the tenure process. So that’s really key. We like that a lot.”