What does it take to encourage students to use their voice?

By Eliza Compton, 10 June, 2024
The National Student Survey tells us students want their voices heard, but what if they don’t take up opportunities for feedback? Sam Perry looks at challenges of integrating student input (and wonders if chocolate is the answer)
The theme of “student voice” is a key target in curriculum design. Its inclusion helps students feel engaged with course material and more integrated with the university as a whole. However, in the UK, the National Student Survey (NSS) has shown this to be an area where many courses fall short: only 74 per cent of students believe their opinions are valued by staff, and just 61 per cent feel that their feedback is acted on.

To tackle this at the University of Southampton, we designed a feedback scheme as part of the undergraduate chemistry teaching labs. Students complete one experimental session per week and can fill out a short, anonymous online feedback form at the end. Our motivation was to give students the opportunity to provide in-the-moment feedback, so their suggestions could benefit them immediately rather than impacting only the following cohort. Students in the lab class rotated through 10 practical stations over 10 weeks, so feedback from week one could improve the following nine rotations, offering a clear benefit to the student experience.

This plan was well received when pitched to students, and a pilot scheme was run with a first-year undergraduate cohort. Unfortunately, uptake and retention of the feedback scheme were disappointing: 34 per cent of the cohort responded to the survey in the first week, dropping to 8 per cent after just five weeks.

Discussions with students revealed conflicting views towards the survey. Virtually all believed that the opportunity was useful and should be kept, but the same students acknowledged that they did not regularly submit any feedback. Even students who admitted to having never interacted with the survey still believed it should be continued into the next academic year.

So the challenge now seems to be how to give students a voice in a way that encourages uptake and retention. Discussions and anonymous surveys from students who took part in the pilot scheme presented the following opportunities to promote engagement and retention.

1. Ease of access

The feedback opportunity was presented as an online Microsoft form. Students were given a direct link to the survey as part of their lab script, and a QR code for access was displayed prominently around the lab. Despite this, the most commonly reported reason for not completing the survey was “forgetting”. One student even suggested that a link to the feedback form could be included in the lab script, not realising it was already there.

Prominence of access is clearly important, but so too are regular reminders. Students reported that they would have been happy to be reminded more often, through more prominent physical reminders in learning materials and the virtual learning environment (VLE), as well as verbal reminders. This would be particularly important in the early weeks of the scheme, to establish providing feedback as a positive habit in their lab routine.

2. Integration with the course

Students have many tasks on their minds, and at the end of a lab session they are preoccupied with the data they have to analyse and the report they must write, as well as social plans and other responsibilities. The feedback opportunity was pitched as something to do as they were leaving the lab, which presented a conflict with upcoming responsibilities and social interactions.

The end of a lab session is often seen as the end of lab work, and so feedback as an additional task can fall through the cracks. A solution is to position the survey as part of the lab day, rather than a separate entity. This allows it to be completed while students are still focused on laboratory learning.

3. Dissemination of outcomes

The feedback survey offered students the chance to suggest improvements to the course design. Many of these were minor and helpful, such as identifying where lab scripts could be clearer or suggesting extra resources to aid understanding. These suggestions were acted on, but this still raised a question about the perceived impact of feedback: students who arrived at a practical after a rotation experienced the improved resources without a point of comparison, and so would not see their own suggestions being implemented.

To address this, students suggested regular dissemination of outcomes from the survey. This could be verbal, in the form of an announcement at the start of a session, or a published list on the VLE. In this way, proof of the impact of their feedback can be shared, which should have a positive effect on uptake and retention.

4. Opportunity for discussion

Many students reported not filling in the survey because they could not think of anything specific to comment on or suggest for improvement. The sense that “everything is fine” led to a reduced motivation to complete the survey. However, when opening discussions about the feedback pilot scheme, these same students responded enthusiastically to suggestions from their cohort, building on the ideas of others and leading to a dynamic discussion.

This highlighted a missed opportunity in a survey focused on anonymous opinions. Integrating discussion into feedback seems vital to capturing a more complete view of the student voice. It encourages participation from students who may incorrectly feel they do not have an opinion worth sharing, and it builds on suggestions to produce more exciting and actionable contributions from the wider student body. Discussion could also be incorporated into the actions taken in response to feedback. For instance, if feedback reported that an instruction was unclear, two or three alternative phrasings could be presented to the cohort, and a group discussion could refine and select the one that is implemented.

5. Reward

A simple suggestion, and the most enthusiastically received in student discussion, was incentivising completion of the survey. This is linked to point 3 (dissemination of outcomes), where the benefit of feedback is the “reward” of a tangible change, although student suggestions for rewards were, perhaps unsurprisingly, more focused on grades and chocolate. In either case, a reward can be a useful motivator, if only to drive initial uptake and establish the habit of taking part in the feedback opportunity.

The recurring question is how to integrate the student voice with both the inputs and outputs of feedback. Presenting feedback as part of the course design, rather than as an after-class activity, integrates it more seamlessly as a classwork task and minimises the chance of it getting lost among other after-class responsibilities. Involving students in discussing, and implementing, recommendations from their feedback provides them with tangible rewards for their efforts.

If all else fails, there’s always chocolate!

Sam Perry is a teaching fellow in the School of Chemistry in the Faculty of Engineering and Physical Sciences at the University of Southampton.

If you would like advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the Campus newsletter.
