University of Worcester
Approximately 10,000 students
Historic reliance on paper surveys, limited reporting and closing-the-loop capabilities, and manual, resource-intensive data management processes, all of which became even more apparent during the Covid lockdown.
- Higher response rates.
- Improved reporting and closing-the-loop capability.
- Greater ownership for academic schools in survey management.
- Flexibility for a range of institutional surveys.
- Integration with multiple university platforms.
Moving away from paper-based surveys
The University of Worcester, which is ranked in the top five in the UK for Quality Education in Times Higher Education’s University Impact Rankings, delivers degree programmes to over 10,000 students across nine academic schools. In summer 2021, the University appointed Explorance to provide student module evaluation systems and associated services, after the Covid-19 pandemic “crystallised the need to have a strong online presence in surveys”.
“Like a lot of institutions, we had multiple survey platforms, with a main platform that was historically very paper-based, with limited access to data and a lot of manual spreadsheets and CSV uploads,” said the University’s Student Surveys Manager Carolyn Moir. “We did not have a lot of live data, and therefore not a lot of ownership from users and students, because the surveys and reports were not very visible. We also did not really have any closing-the-loop capability, with only very simplistic reporting in place.”
“We were already looking at reviewing our survey platform when Covid hit, so pretty much overnight we needed to go from two academic schools using some online elements to all of our schools going online. It exposed some real challenges around access and participation in the survey process, so this started to shape my thinking about what we wanted to do next in terms of reviewing our platform and how we wanted to approach survey management in the future. From there we put together our wish list.”
Improving the user experience: better communication with staff and students
The University issued a tender through the APUC (Advanced Procurement for Universities & Colleges) framework – and top of mind was a commitment to improving the user experience and the value that brings for people.
“Our goals were to move to a much more seamless and integrated solution to give academic schools a lot more input in terms of managing their surveys, ownership of the process, and ultimately to have better data,” Carolyn explained. “More accurate data going in, therefore better data coming out, and improving results reporting. The other thing we wanted to do was improve communication with staff and students on surveys and focus on our closing-the-loop capability. Previously this was very manual and completely outside the survey system, so it relied on people doing something and feedback getting back to the students. We wanted that single point of truth, so people know that if they want the information on their surveys, they can see exactly where to go, as it is all held in one place.”
The intention was to move away from this “largely invisible process” to something a lot more transparent and easier to manage. She said: “In order to do that we had to integrate the chosen solution into our student record management system SITS, VLE Blackboard Ultra, and portal solution MyDay. We also wanted to create a feedback dashboard so that we can have centralised reporting and access to that data over time. Explorance’s Blue does all of this, and they were our first choice partner to go with.”
Rising response rates and improved results reporting
An early feedback programme level survey in October 2021 was soon followed by an apprenticeship employer survey a month later, and then 450 module evaluations were completed before January 2022. Through these surveys the University’s internal reporting structure was set up, meaning reports were made available as appropriate to the level of staff roles, and course leaders could respond more quickly.
“An immediate impact was that VLE students were able to access the survey straight away,” Carolyn said. “Before, we had to rely on students clicking on a link in their email, and you could not share the links with anyone as they were all individualised. So with the first survey we saw an improvement in response rates and picked up 1,000 more responses than we had in the previous year. Then in our bigger project we got a 32% response rate, which was better than we were getting in the old system for online modules, and within that about a quarter of the modules were getting a 50%+ response rate.”
As of February 2023, further progress had been made. “Our response rate to our early feedback survey increased from 32% to 36%, our first-semester module evaluations were at 42% against 32% last year, and our semester one closing-the-loop rate to date had reached 60%,” she revealed. “We are also running our module evaluation pilot with a couple of schools (using a single survey across the course rather than one for each module), which has a 30% overall response rate so far, with one pilot group achieving a 52% response rate. Again, the value of Blue is it allows us to try these things out fairly easily.”
Carolyn added: “From a staff perspective we are now able to provide results directly through Blue at individual course level, rolled up at school level and then at overall university level, and we are able to provide those very quickly. Module leaders have ownership in terms of adding their own targeted questions through question personalisation, so they feel they are getting some valuable data for their own purposes as well as the general module questions. Overall, the survey is much easier to set up, and it is simpler to manage triggers, such as specific questions for certain students (e.g. nurses or apprentices), and set up those filters.”
Developing closing the loop capability
Carolyn is now prioritising further work around closing the loop. “Whilst we have made good progress, this academic year we are introducing a second element for staff into the closing-the-loop survey,” she explained. “There will be one question, ‘tell your students your response’, and we will publish that to the students. The second question, ‘please reflect on your experiences’, lets us capture that information so staff have a record of what was said, which allows them to keep information in one place. Everything we want to do is supported by using Blue’s data sync, which is incredibly useful for maintaining accuracy.”
She concluded: “The whole point of this entire process is to support the evaluation and enhancement aspect of what we do, so if we are able to develop something a little creative along the way that still meets those goals, then Blue gives us the opportunity to do that, and that is really important. We are launching our evaluation dashboards this year, so we will be able to compare year-on-year and semester-by-semester and give people different and better information than we have ever had previously. From where we started to where we are now, it has been a really significant change; it has definitely been a big project, and a huge evolution in how we use and manage our surveys.”