What role does module evaluation feedback play in the “value for money” challenge?

Written by John Atherton.

The political mist surrounding the UK Higher Education sector – in fact, the whole country with a General Election looming on 12th December – means there is huge uncertainty for everyone right now and planning for the future is very difficult.

What universities do know, regardless of who ends up in No.10 next month, is that the challenges around student voice, student engagement and student satisfaction remain and will only rise up Vice-Chancellors’ lengthy priority lists.

Last month the Office for Students (OfS) published its “value for money” strategy, with the primary measure based on the perceptions of students and graduates. In practice, this will mean surveying students and graduates about their views on value for money, alongside datasets on graduate earnings. The approach has drawn criticism from some Vice-Chancellors, but the idea of polling students seems set.

We also know that the OfS plans to extend the UK’s National Student Survey (NSS) to cover all undergraduates over the next two years. It is thought this will address concerns that students are asked to complete the survey when doing their final assessments – during which they might have limited time to respond or might not feel positively about their institutions.

What this move would do, of course, is give a voice to all students and provide a much richer picture of the student academic experience. On the flip side, universities have expressed concern about the burden it would create in gathering data from all students, every year, with fears around response rates plummeting and data being unusable. There will be a consultation on the survey questions in spring 2020.

What we are noticing is an upsurge in UK universities looking to get to grips with the issue of student voice. The NSS asks students whether they have had opportunities to give feedback and whether that feedback has been acted on – and the Teaching Excellence and Student Outcomes Framework (TEF), which provides a resource for students to judge teaching quality in universities, draws on NSS data. So there are massive strategic drivers for this now.

As highlighted in our report, The Student Voice, published this year, module evaluation surveys are recognised by senior leaders as playing a strategically important role in supporting the bigger picture, providing institutions with the opportunity to respond to any issues and concerns before the NSS is completed. They also enable a valuable opportunity for individuals, departments, faculties and universities as a whole to reflect on their teaching practice and the student experience within that.

This month we have held two Feedback Matters workshops, at the University of Leeds and the University of Strathclyde. Both highlighted similar areas of discussion and challenge. The Leeds workshop focused on the role of student voice within decision-making and how the University equips students with the skills to provide valuable feedback. The Strathclyde workshop covered this too, but also went into some depth on the logistics: timing, curation of question banks, corresponding policy for module evaluation, and the development/design and use of aggregate reporting to maximise the benefit from the data gathered. More workshops are planned over the coming months.

Through our work we gather stories, experiences, best practices and lessons learned about student voice initiatives from our user community. This is directly helping a growing number of universities in the UK to improve teaching and learning, and gain insight into the student experience to support projects relating to NSS, TEF and now the “value for money” challenge.

John Atherton is Higher Education Director (UK & Ireland) at Explorance
