Why are response rates to student surveys so low?
The Postgraduate Taught Experience Survey (PTES) was introduced by the Higher Education Academy in 2009. Its initial purpose was to provide data to Higher Education Institutions (HEIs) to help them improve the quality of the education they provide. However, it is likely to be used in future to create league tables of HEIs for prospective postgraduate students, just as the National Student Survey does for undergraduates.
The PTES suffers from a low response rate, which limits our ability to generalise from its findings. Although the rate rose to 25% in 2012, it had never exceeded 18% in previous years. Even with one in four postgraduate students responding, we know nothing of the views of the other three-quarters. If the survey is to inform national league tables of provision for postgraduate students, then the data need to represent the views of the majority of students, not a minority.
Concerned about this, in 2011 I undertook a study seeking the views of students at King’s College London who did not respond to the PTES. The findings have just been published in the journal Educational Research. A limited number of full-text copies of the paper are available on the publisher’s website; if you are unable to download one, please contact me and I’ll email it to you.
Funded by a small grant from the Health Sciences and Practice Subject Centre of the Higher Education Academy, I led a study of postgraduate students from four health faculties at King’s College London. In total, 355 students completed a short online survey exploring the reasons why they did or did not respond to the PTES in 2011. Themes arising from the survey were then discussed further in focus groups drawn from our respondents.
Of the respondents to our survey, 40% did not complete the PTES, and only 30% of these stated that they knew what the PTES was for. Not many more – just over half (54%) – of those who did complete the PTES stated that they knew its purpose.
We analysed these data further to see whether other factors could explain this pattern. We found that those who did not complete the PTES were less familiar with its purpose than those who did, a difference unlikely to be explained by chance or by other variables also related to non-completion.
By itself, this is an unremarkable, common-sense finding. However, we also asked students about the impact of the first two rounds of the PTES on educational provision for postgraduates. We expected those who did not complete the PTES to rate its impact lower than those who did, but the difference we found was not statistically significant. This suggests that if students were given more information about the survey’s purpose (to improve the quality of educational provision for postgraduate students), more might complete it in 2013 and in future years.
The focus groups confirmed these findings but also generated other reasons why students did not respond. Perhaps the most important of these was a general feeling of isolation amongst participants and a perceived lack of community amongst postgraduate students. This could be unique to these students, or it may be a more general problem. Participants also confirmed that offering incentives to complete the PTES would be likely to increase the response rate.
As the PTES becomes better known, response rates are expected to rise. However, we should not rely on this; instead, we should ensure that all postgraduate taught students understand why the survey is conducted each year and what each HEI will do with the feedback it receives.
The survey is open now, so now is the time for all of us working in HEIs participating in the PTES to tell our students what it is and why they should complete it. A higher response rate will provide more accurate feedback, enabling us to make improvements where they are needed most.