Case study

Loughborough University Closes the Feedback Loop with Blue Course Evaluations

School:

Loughborough University

Location:

Loughborough, UK

Details:

~17,800 students

Solution:

Blue Course Evaluation Software

Challenge:

The University needed to replace its paper-based module evaluation process so that it could respond to, and act on, student feedback data more effectively.

"Prior to the introduction of Blue, course representatives and School Presidents used to question where the reports go to and what happens next"

— Dr Sarah Williamson, Assistant Director of the Centre for Academic Practice


Key benefits:

  • Improved turnaround time through online system
  • Module leaders can quickly take action on student feedback
  • Through greater depth of data, staff have information to evidence their teaching practice
  • Senior leaders can access institutional data immediately
  • Students are clearer on how their feedback is being used

Ranked 5th on the 2019 Times Higher Education (THE) ‘Table of Tables’, and awarded Gold, the top accolade, in the national Teaching Excellence and Student Outcomes Framework (TEF), Loughborough University has consistently been recognised for its teaching quality and student experience.

Loughborough University has developed and implemented a ‘closing the loop’ process following the introduction of the Blue course evaluation software in the 2018-19 academic year – meaning that module leaders can respond to student feedback more effectively.

The University first established module feedback in the mid-1990s, an approach which sees students evaluate individual teaching staff as well as the module itself, but the surveys had remained paper-based ever since. In 2015-16, the University worked in partnership with Loughborough Students’ Union to explore options for tackling the issue of ‘turnaround time’ in receiving data from these surveys. Dr Sarah Williamson, Assistant Director of the Centre for Academic Practice, was asked to advise on these options.

“Essentially, we needed to get data and act on it – and the paper-based approach took forever to get processed,” Dr Williamson said. “We were keen to move to an online system but wanted to see if response rates would drop off first, so we ran a project through Moodle initially.

“We found that although response rates did drop, there was a noticeable increase in the quality of the qualitative comments and depth of feedback. The Moodle pilot accelerated, other academic schools got on board, and we realised that we needed a bespoke system. Once we had gathered evidence as to why a new module evaluation system was needed, and how it would support other surveys being conducted across the University, we sought a partner.”

Blue’s all-in-one evaluation system provides the support they need

After a competitive tender exercise, the University entered into a partnership with Explorance and implemented its Blue course evaluation system and Data Integrity Gateway (DIG) add-on, utilising a network of feedback administrators across the institution to help sow the seeds for module feedback in academic schools. “We are a small central team, only two members of staff, so it was important to have access to a wider and experienced support network,” Dr Williamson explained. “Also, because we evaluate staff who teach on modules – but do not currently have a central record of all contributors to a module – we needed an all-in-one evaluation system. Without it, there would be spreadsheets everywhere. Blue, and the way DIG is set up, has made this work – it has given us demographic breakdowns of the data, provided a wealth of information from students, and also supported us in closing the feedback loop.”

“We are a small central team, only two members of staff, so it was important to have access to a wider and experienced support network”

— Dr Sarah Williamson, Assistant Director of the Centre for Academic Practice

The ‘closing the loop’ process requires module leaders to respond to all module feedback. They access reports through the University’s VLE, which include quantitative data for module and individual staff questions, plus all free-text comments. They have to let students know what seems to be working particularly well with the current approach and why, or how the module might need to be improved based on the suggestions that arise from their feedback. “Prior to the introduction of Blue, course representatives and School Presidents used to question where the reports go to and what happens next,” Dr Williamson said. “Now our students comment on how they are able to give feedback and see results, which is important for the University. Being transparent and open with students about their feedback, and what we do with it, has always been part of our promise to them. However, the approach we are taking now celebrates what is good, as well as highlighting areas which need to be addressed. It also means that we can draw on and share best practices with colleagues.”

Full confidence in Blue means full-scale module evaluation

Loughborough used to survey modules every three years, unless there had been a significant change to a module or it was a brand-new module, meaning “only a third and certainly no more than 50% of modules” would be covered each year. “Now we survey 100% of modules, and because of our confidence in the system, we introduced full-scale module evaluation from the off,” she said. “We surveyed nearly 2,000 modules in our first year and received 42,500 comments in the process. We have a standard question set, with variations for teaching types, all underpinned by a code of practice. Senior managers, including our Pro Vice-Chancellor (Teaching), can access data immediately. Actions can be taken and, because turnaround time is so quick, there is time to address things. Before, it could take four months to get the data.”

Dr Williamson concluded: “Prior to our partnership with Explorance, we were unable to see any trends or patterns across our academic schools, response rates in academic schools were too low and too variable to do anything with, and all we had was individual spreadsheets for each module. Through Blue, students are happy to have greater access to feedback and to be receiving responses, and staff now have information to evidence their teaching practice or help them know where they need to improve. Partnership with students is really important to us, and we hope that the more we can demonstrate that we are closing the loop and that their feedback is being acted on, the more individuals will want to engage with surveys.”

