
Driving Student Engagement in Module Evaluations at Glasgow Caledonian University: Pro Vice-Chancellor Perspectives

Written by Explorance.


In the latest of a regular series of conversations with Pro Vice-Chancellors at Explorance Blue customer universities, we met with Professor Alastair Robertson, PVC Learning and Teaching at Glasgow Caledonian University (GCU), to discuss his institution’s approach to capturing the student voice.

On driving student engagement at Glasgow Caledonian University 

Explorance: The student voice is increasingly being recognized as a powerful tool within Higher Education Institutions. What steps is Glasgow Caledonian University taking to ensure that the student voice is not only heard but also leveraged to drive meaningful impact on teaching and learning effectiveness?

Alastair: We are absolutely committed to listening to the student voice, and we have a strong, well-established partnership with our student body. We are currently refreshing our Student Partnership Agreement (which is jointly developed by the University and the Student Association), and this is critical for us in informing change, improving the quality of the student experience, and bringing an enhancement-led approach to teaching and learning. Module Evaluation Questionnaires (MEQs), distributed through Explorance Blue, are an important part of how we capture student feedback, combined with multiple other approaches to facilitate meaningful exchange.

Explorance: Closing the feedback loop is an important part of responding effectively to the student voice – students need to see what actions have been taken as a result of their feedback, whether from the University or at the academic programme/module level.

Alastair: It is human nature that if you buy into and invest time in giving feedback, you expect to see the outcome. A recent internal audit at the University reviewed our systems and processes for gathering student feedback and reported that closing the feedback loop has provided an effective mechanism to communicate actions back to students. 

Communication and engagement with students in relation to MEQs were also found to be robust. Students receive formal communications prior to the launch of the evaluation, in addition to informal reminders and in-class discussions with module leaders. The audit also reported how GCU utilises social media, through its ‘Student Life’ account, to engage students in the feedback process. This was echoed by student representatives, who felt that communication and engagement related to student feedback have been sufficient.

On increasing response rates on module evaluations 

Explorance: What measures do you use to ensure that response rates are representative of the student population?

Alastair: Increasing response rates on MEQs is a strategic objective for the University. In trimester B of 2021-22, we experienced a reduction in student response rates, which we think was partly due to the general weariness in the sector, and indeed society, at that time because of the pandemic. Here in Scotland, a lot of study was still online because pandemic restrictions eased later than in England. Staff need to build time into class, whether it takes place in person or online, for students to complete their surveys – if they leave it to students to do so in their own time, response rates are inevitably lower.

We are planning a campaign with Explorance for the 2022-23 academic year targeting MEQ survey uptake amongst our students to drive our enhancement objectives. We want to get response rates up to 50% for internal surveys; these were just 30% in 2021-22, down from 38% in 2020-21. As part of this, we have already adapted some of the survey scales where needed to provide clarity for our students. For example, on ‘Feedback on my work has been timely’, we have qualified the meaning to ‘has been timely in line with the University assessment policy’.

Explorance: How do you ensure that what you’re measuring is useful in terms of both institutional strategic goals and national standards?

Alastair: At GCU, we use NSS scales in our internal module evaluation surveys. MEQs help us to understand the sentiment behind the NSS questions and give us a deeper understanding of the NSS. We did very well in NSS 2020 (85%), then dropped to 78% in 2021 before rising to 79% in 2022. At my previous institution, we also used module evaluation surveys to educate students on how to provide constructive feedback. At GCU, we are now considering providing further guidance, for example a template of presentation slides to brief students on our surveys, covering everything from how to access them practically to the importance of constructive feedback and the wider benefits of module surveys.

On staff engagement in module evaluations 

Explorance: Staff buy-in and engagement in MEQs are critical because the questionnaires are useful for assessing staff performance. However, evaluations are often perceived negatively by staff. What steps does GCU take to engage staff in MEQs? What are the results?

Alastair: At GCU, we have emphasised that the surveys are for enhancement purposes only. Other universities in the sector have been quite explicit that these are also useful for assessing staff performance.

Positive staff evaluations are very motivating for colleagues. They see that students are recognising their efforts, especially when staff are sold on the enhancement agenda and understand that results are used for development purposes only, rather than as a university stick to beat them with. We look at results across the piece so we can see aspects that staff are struggling with and common classroom challenges that have arisen through module survey feedback. As a result, we can provide appropriate development support.

Some staff have questioned survey results with lower response rates; however, for enhancement purposes the data can still provide very helpful insights and can be triangulated against other sources of evidence. We also like to look at longitudinal trends to see whether results are trending upwards or downwards over more than one year, rather than, say, reflecting just a cohort effect. This analysis is possible through our data dashboard and will be picked up by managers and heads of departments.

The recent internal audit of student feedback also highlighted that closing the loop has been largely well-received by staff. It endorses the approach we are taking and reinforces the importance of closing the feedback loop for student engagement.

On the institutional impact of listening to the student voice 

Explorance: How have MEQs and specifically the student voice helped with strategic initiatives at the institutional level? 

Alastair: We focus on enhancement, with a clear connection between MEQs and our quality processes and a clear expectation of the importance of the student voice.

Key themes arising from student feedback in any given trimester’s surveys are escalated to senior management and discussed at appropriate forums across the University, including the Executive Board, the Academic Policy and Practice Committee, School Governance committees, and the Learning Enhancement sub-committee. Summary reports, produced by our Strategy Planning and Business Intelligence team, are sent to these forums each trimester and set out the themes arising from MEQ feedback. The reports include key data at the institutional level, benchmarking between schools and modules, comprehensive comparisons with previous years’ results, and response rates. Interviews undertaken with staff and student representatives as part of the internal audit of student feedback found the reporting to be of a high standard and valued the insights it provides.

One of the biggest institutional impacts of our approach to student feedback and the use of MEQs has been around the NSS questions on Learning Community: 

  • “I feel part of a community of staff and students”. 
  •  “I have had the right opportunities to work with other students as part of my course”. 

Following our performance on these scales in our 2020-21 internal module survey, we made a concerted effort to provide staff with guidance on socialisation and to share good practice. As a result, the score went up by 17 points in our 2021-22 internal survey. It was a real strategic effort, and the reward can be seen in the developments introduced by our central administration team as well as by the schools.

On future developments at Glasgow Caledonian University 

Explorance: So, what does the future of student feedback look like for GCU? What are you looking forward to based on what you’ve learned so far?  

Alastair: GCU has worked with Explorance for several years, pre-dating my arrival in April 2020. We had run internal surveys for some time, largely paper-based, but there was real interest in Explorance given the functionality it affords and how it supports our university community. In 10 years, we should not still be having conversations about the value of MEQ surveys and whether or not to do them. Advances in AI and related technologies are going to have a significant impact in enabling more sophisticated analysis and understanding of qualitative comments. With enhanced data analytics and the ability to pull in data from multiple sources, we may also see more sophisticated Pulse surveys that can be run at any time chosen by staff.

Read the first in this series: “Capturing the Student Voice at the University of Bristol: Pro Vice-Chancellor Perspectives.”

 

Download your copy of The Student Voice: How can UK universities ensure that module evaluation feedback leads to continuous improvement across their institution? 


