Six Ways Universities Can Increase Engagement in Student Feedback Evaluations
Written by Explorance.

The Explorance Impact Tour 2025 made its way to London on 22 January, themed ‘Increasing Engagement in Student Feedback Evaluations’.
Around 90 delegates registered for the one-day event, reflecting strong sector interest in its insights into learning excellence, teaching effectiveness, and the student experience.
Here are six takeaways from the event:
Only Ask for Feedback if You Will Take Action
“We shouldn’t be asking students for feedback unless there is a plan to respond to it.” This statement from Tom Lowe, Assistant Head of School (Student Experience) and Chair of the RAISE Network at the University of Westminster, was the most memorable line of the day.
Tom, who has edited A Handbook for Student Engagement in Higher Education: Theory into Practice and Advancing Student Engagement in Higher Education: Reflection, Critique and Challenge, also outlined the ideal feedback loop for the student voice section of the National Student Survey (NSS), and his ‘live’ QAA sector study, the Audit of Student Representation and Voice Practices Project, which covers module-level evaluation practices.
But his opening keynote comment on the purpose of student feedback clearly resonated with fellow speakers and delegates alike. Discussion throughout the event often returned to this thought-provoking perspective.
Staff Buy-in is Critical for Good Response Rates
Having flown into the UK for the event, speakers from the University of Minnesota highlighted their approach to finding “champions for the cause”, involving administrators, institutional staff, and support staff, and explained how engagement at each level drives participation in the course feedback process.
Encouraging instructors to value student input, considering what staff need, and creating processes that support colleagues in achieving better response rates were also among the lived experiences shared with attendees.
Meanwhile, Coventry University outlined how strong visibility and ownership of survey promotion and response rates drive staff engagement with evaluations, which in turn supports good response rates. The University of Strathclyde shared its ‘Seven steps to making sense of class evaluation data’, staff support materials, and an annual cycle of continually refreshed CPD. Promoting staff responsibility for, and engagement with, the returning data was another common thread across the other presenting institutions.
Engage your Students in the Process
As part of Minnesota’s ‘Guide to improving response rates’, providing time in lectures and seminars to complete evaluations, reminding students when the data collection window is closing, and introducing class-wide incentives are among the steps identified.
The merits of introducing rewards and incentives to improve student response rates were explored across many presentations. Questions were raised, for example, on whether payment changes the ethos of the opportunity, or which students wish to engage. However, advice on giving students dedicated time to complete the survey was a consistent theme. Fundamentally, seeking to understand what students need, where they need more support, and what their pain points are supports engagement, and in turn good response rates.
Westminster’s Lowe reflected more holistically on citizens’ engagement in wider society, arguing that principles of speed (engaging faster), cost (engaging more cheaply), and convenience (engaging with less movement, or combining multiple engagements at once) should also be considerations for student feedback evaluations.
Consolidate Student Survey Activity to Maximise Levels of Engagement
Tom also spoke about the challenge of “student voice en masse”, and how the scale of institutional approaches to evaluations today differs from his own time as a student and Students’ Union representative.
The University of St Gallen focused on the combined influence of short surveys, lecturer buy-in, programme buy-in, and student buy-in on response rates and student engagement. A case-study presentation from Liverpool John Moores University highlighted its cultural shift: moving course-level surveys into Explorance Blue, changing survey windows to align with the NSS, and piloting a ‘Student Voice Season’. The university has also adopted Explorance MLY, joining a growing number of institutions in the UK and beyond.
Another topic of discussion was hearing from all students, and recognising and responding to hidden barriers to participation. At the same time, the presenting institutions raised the importance of researchers making each survey’s purpose clear and well defined.
Implement a “Tight” Feedback Loop to Promote a Positive, Supporting Culture
An expert panel discussion agreed on the importance of transparency in closing the feedback loop, and on the need for that loop to be as “tight” as possible: to demonstrate responsiveness, share responses and actions in real time or at key points.
The panellists said that being seen to act on feedback also demonstrates a commitment to students and helps maintain staff engagement in the process. Discussion also turned to creating a culture around evaluations that promotes them as “supporting action planning rather than negative ethos tools.”
This could extend to sharing key themes emerging from feedback with the teams who lead staff development, as part of wider academic development and learning and teaching practice. Nevertheless, providing better mechanisms for closing the feedback loop with students appeared to remain a ‘work in progress’ for most.
See Students as Partners in Learning
A closing keynote from Ailsa Crum, formerly Director of Membership, Quality Enhancement and Standards at QAA, concluded that students generally value a personalised and collaborative relationship with their university.
With many opportunities to capture the student voice, Ailsa highlighted the trend towards more dynamic ways of hearing views, via surveys and joint exploration. On student engagement, she posed the questions ‘Why should they care about this survey?’ and ‘Can you give them reasons to care?’. But discussion ultimately returned to clarity of purpose and to demonstrating what will be done with the responses received:
- What is it you’re trying to find out?
- What are you going to do with the information you gather?
- How are you going to report on it?
- What format will you use for sharing the outcomes?
- Are there decisions that will be affected by the survey outcomes?
Sign up for Explorance’s Upcoming Events and Make a Lasting Impact