Office for Students (OfS): Consultation on changes to the National Student Survey: Analysis of responses and decisions

The National Student Survey (NSS), introduced in 2005 and completed by about 300,000 students annually, is the only census-style survey of final-year undergraduates that covers the whole of the UK. The survey has been reviewed and tweaked several times since its inception. In 2020 the government ordered a “root and branch” review of the NSS after raising concerns about the “unintended consequences” of the exercise. Led by the OfS on behalf of the four funding and regulatory bodies across the UK, a consultation outlining a series of proposed changes was launched in July 2022. It received 250 responses over its five-week duration. The Analysis of responses and decisions document covers those responses and the rationale behind the final decisions.

The full report can be found here.

At-a-glance:

  • The National Student Survey will change to direct questions, using a four-point answer scale rather than the current five-point scale. This is to “ensure the questions are well understood and interpreted consistently by students” (p10)
  • A new question on mental wellbeing will be added. It will ask students how well institutions have communicated information about mental wellbeing support services (p29)
  • The current standalone question on ‘student satisfaction’ – Q27 – will be removed in England to “provide a stronger focus on questions relating to aspects of quality”. However, this question will be retained, in its current wording, in Scotland, Wales and Northern Ireland where it is used for regulatory purposes (p17)
  • Most of the responses to the consultation disagreed with the proposal to remove Q27 for England. Many thought this was a useful question that enabled students to feed back holistically on their experience. Many providers use it as a benchmark and for quality assurance, and would like to retain it (p17)
  • OfS argues that questions providing “granular detail” allow it and providers to explore where students are positive or negative about different aspects of course quality. As Q27 does not relate to conditions of registration, it does not tell the OfS about the aspects of quality that it regulates (p20)
  • A new question on freedom of expression will be added for England. It will ask how free students felt during their studies to express their ideas, opinions, and beliefs  (p23)
  • The NSS will be reviewed every four years with scope for additional reviews as appropriate, to help ensure the survey remains fit for purpose and that it reflects current practice while avoiding any disruption that unplanned reviews could cause  (p34)
  • From 2024-25 the period for students to respond to the survey will be shortened so that it runs from mid-February to April, rather than the current period of early January to April (p36)

Implications for governance:

For 17 years the NSS has provided robust information that has informed student choice, driven improvements in teaching and learning, and allowed comparisons across the whole of the UK higher education sector.

Despite several reviews in that time, the questions and survey have remained largely consistent. The current review ushers in a comprehensive rewrite that is likely to affect a number of areas of concern to governors, such as measuring student satisfaction and the wider student experience, quality assurance systems, and marketing and reputation.

Arguably the most substantive change is the removal in England of Q27, which provides a measure of overall student satisfaction. The question has become an important feedback tool for institutions to gauge how, overall, each cohort feels about its academic experience, allowing comparisons with previous years and with other institutions across the UK. Champions of the NSS argue that the metric has been used by some to inform changes to the curriculum, teaching and learning quality, learning resources and academic support, and has been integrated into institutions' marketing strategies, although the review questions the extent to which such changes are directly linked to Q27. It is certainly the NSS metric most widely used by third parties in performance measures and league tables.

Respondents to the consultation were concerned about the impact of omitting the question. Some responses highlighted a possible negative impact on the international reputation of UK higher education. Responses from GuildHE and the Quality Assurance Agency (QAA) also questioned the value, and possible unintended consequences, of creating difference between the nations.

Around half of respondents to the consultation suggested that prospective students – in particular those outside the UK – would have less information with which to make comparisons across all institutions in the UK, and some felt that students might draw negative conclusions from the absence of data for that question in England. Another concern expressed by some in the consultation is that less robust datasets could emerge to replace the summative question. According to the OfS, however, these concerns are outweighed in England by the need to ensure clear links between the information provided by the NSS and the aspects of quality that are subject to regulation in English providers.

Governors may want to consider the extent to which their institutions employ the results of Q27 in plotting student satisfaction over time and how this data is used in quality assurance. It may be that, for institutions in England, other survey questions and responses can be used as substitutes, such as those covering quality of teaching and academic support. Other nationwide student surveys can also provide important and insightful metrics to help drive the enhancement of provision quality, such as Advance HE’s UK Engagement Survey (UKES). At postgraduate level, Advance HE’s Postgraduate Research Experience Survey (PRES) and Postgraduate Taught Experience Survey (PTES) will continue to include a summative question covering “overall satisfaction” within their broad scope of question areas.

Questions added to the NSS have the potential both to pose reputational risk and to provide useful feedback to drive improvements in the student experience. The new mental health question, which asks students how well institutions communicate information about mental wellbeing support services, could, for instance, provide an important baseline against which institutions can measure the success of their signposting, messaging and support.

The addition of a currently topical question on freedom of expression could also pose a reputational risk, particularly given the Higher Education (Freedom of Speech) Bill, which is currently at committee stage in the House of Lords. It will ask students in England about the extent to which, during their studies, they felt free to express their ideas, opinions and beliefs, on a four-point scale from “very free” to “not at all free” (with an additional “this does not apply to me” response). Governors may wish to refer to the recent sector statement on promoting academic freedom and free speech.

Concerns have been raised in consultation responses about the move to a more direct form of questioning, rather than providing statements that respondents can agree or disagree with. Part of the OfS’s rationale for the change is the tendency of agree/disagree scales to produce acquiescence bias, where respondents are disproportionately likely to agree with the statement put to them.

For institutions, the move means a loss of trend data that allowed year-on-year comparisons across versions of the survey, which many used as evidence for the Teaching Excellence Framework and in quality assurance and enhancement mechanisms. Some respondents to the consultation took the view that employing a series of different response scales, as the new questions do, rather than one set of agree/disagree scales, would make analysis more challenging and harder to interpret, and could undermine efforts to draw conclusions across questions.

Governors will also want to note the shorter timeframe within which student responses will be collected, and may wish to refer to the new list of NSS questions (see Annex D: Final questionnaire). Institutions might have to consider measures to ensure the publication threshold of a minimum 50 per cent response rate is reached in this shorter window.
