Monday 20-10-2014 - 10:39
A crucial aspect of student engagement is how well students engage with their own learning. It’s argued that we should measure this, rather than student satisfaction, through mechanisms like the National Student Survey.
Kate Little, NUS quality and student engagement consultant, explores the pros and cons.
A key debate in the field of student engagement is around what aspects of students’ experiences we measure, and how to use the resulting data to enhance the quality of courses and maximise students’ ability to learn effectively.
Currently, we focus on measuring students’ satisfaction with different aspects of their courses, asking them to rate aspects such as teaching quality or library provision from 1 to 5 in the annual National Student Survey (NSS).
As a method of student engagement itself, the NSS is pretty poor: students are asked to sum up their entire academic experience in 23 pre-set questions and two open-text comment boxes, allowing them to detail positive and negative experiences.
The real value of the survey is its utility in starting conversations between staff and students about what the results say, sparking concrete actions to improve students’ experiences and allowing students to engage in a genuine dialogue about their course.
Moving towards engagement
Recently, the idea of replacing satisfaction surveys with engagement surveys has gained a lot of traction. This would mean a shift from asking students how satisfied they are to asking them how much they participate in various aspects of learning, how challenged they feel and what skills they have developed.
Part of the reason for this is the increased focus on NSS data under the government’s student choice agenda, and a resultant feeling that too much emphasis is placed on arbitrary satisfaction scores that give little useful information to prospective students.
Would a score of 79 per cent satisfaction with personal development at one university compared to 86 per cent at another actually help students to make informed choices?
It is argued that surveying measures of engagement offers prospective students a more nuanced and accurate picture of what it is like to be a student, helping them both to make better-informed choices and to arrive in the right mind-set.
After all, whether your course will challenge you, what skills you will develop and the quality of your interactions with staff are things most prospective students really want to know.
However, one thing that is seen as extremely useful by those researching potential courses is the opinions of students who have studied that course; this is one thing that satisfaction surveys capture but engagement surveys may not.
Informing potential students is one side of the student survey ‘coin’. The other key purpose is to provide data to universities to enable them to improve the quality of their courses based on student feedback.
This is where a full shift towards engagement-style questions may result in less valuable data than we currently receive; although we gain an understanding of how much students are putting into their learning, we lose the ability to identify what they like and don’t like.
Batting back the blame
Another worry about moving fully over to an engagement survey is the potential for students to be blamed for their own poor educational experiences: no wonder you’re not happy with your course – you only came to half the lectures, and you never made the effort to interact with staff!
Bad student! Obviously there is an onus on students to engage in their own learning, but shifting the blame wholly onto the student seems unfair. What if the library provision was inadequate or the lectures were uninspiring?
There is a danger that institutions that do not fully buy into the ethos of student engagement may find it easier to pretend these questions do not exist and blame student apathy.
At NUS, we’ve recently responded to a consultation reviewing the future of the NSS and have fed in many of the arguments above. We’re not against questions on students’ engagement in their learning, but we don’t want to replace all the satisfaction-style questions with engagement ones.
Having a balance between the two would allow students to express their opinion on their course, while allowing a more complete picture of students’ experiences to be captured.
The important thing is that the data continues to be used to start conversations between staff and students, and drives positive changes to students’ lives.
You can read NUS’ full response to the consultation, or email Kate at email@example.com.