
What falling staff survey response rates in HE are really telling us

07 May 2026      Emma Walton-Pond, Communications Officer

Falling staff survey response rates are becoming harder to ignore in higher education.

Our global benchmark data at People Insight shows that response rates across the sector have been edging downward. This is not just a story of one or two institutions struggling to get people to take part. The pattern suggests something broader: participation is becoming harder to earn.

That should make HR and OD teams pause.

When staff do not respond to a survey, they are still telling us something. They may be stretched. They may be sceptical. They may not see the survey as relevant to their role. Or they may have given feedback before and not seen enough evidence that it led to change, so they’ve given up on speaking up.

It’s worth remembering that participation is not just a survey metric. It is a trust signal.

Trust in surveys is shaped not only by the process itself, but by what staff see from senior leaders in the time around it: whose voices are amplified, how openly results are discussed, and whether leaders are seen to engage with uncomfortable messages rather than just headline scores.

If people believe their feedback will be read, understood and acted on, they are more likely to take part. If they have taken part before and heard little afterwards, participation becomes much harder to secure. Let’s take a look at response rate patterns today and what we can do about them.


The pattern we are seeing in HE response rates

Our benchmark data suggests that staff survey participation is shifting.

The clearest change is not just happening at the lowest end of the data. It is happening in the middle too. In recent years, the typical response rate across HE appeared relatively stable and even rose slightly between 2022 and 2023. But since then, we have seen the middle of the distribution move downward.

That suggests this is not simply about a few institutions experiencing unusually low participation. The “typical” HE response rate appears to be changing.

Early 2026 data is still emerging, so it should be treated with care. But so far, it appears to follow the same direction of travel: lower typical participation and more spread towards the lower end.

The data does not prove a single cause, but it does point towards the conditions that make participation harder to secure: pressure on staff time, lower confidence that action will follow, uneven communication between surveys and a weaker sense that the survey is relevant to every part of the workforce.


What staff may be telling us when they do not respond

A lower response rate should prompt curiosity, not assumptions.

One possible message is: “I do not believe anything will change.”

Only 41% of HE staff believe that action will be taken as a result of the survey, compared with 51% across all sectors. For many HE staff, the issue may not be whether they have views to share. It may be whether they believe sharing those views will make a difference.

For many staff, belief in action is closely tied to senior leadership behaviour after the survey. When leaders acknowledge findings openly, explain trade‑offs and stay visible in discussions about progress, confidence tends to increase. When results feel delegated or disappear into internal processes, belief erodes quickly.

Another message may be: “I have said this before.”

Survey fatigue is not always caused by asking too many questions. It is often caused by asking similar questions without showing what has been done with the answers. If staff repeatedly raise concerns about workload, leadership visibility, communication or change without visible progress, the survey can begin to feel performative.

A third message may be: “I am not sure this survey reflects people like me.”

Higher education workforces are not homogenous. A single institution may include academic staff, professional services teams, researchers, technicians, estates teams, catering staff, hourly paid lecturers, student-facing services and colleagues working across different campuses, contracts and working patterns.

If participation is lower in particular groups, the issue is not just the overall response rate. It is representation. We need to understand who is responding, who is not and what that means for interpretation.


Rebuilding participation starts before the next survey

So how can HEIs begin to turn this around? The answer is not simply to persuade harder. It is to make participation feel worthwhile.

That starts with the previous survey, not the next one. Before asking staff to complete another survey, institutions should be able to answer three questions clearly:

  1. What did staff tell us last time?
  2. What changed as a result?
  3. What are we still working on?

The important part is to connect action back to feedback. Staff should not have to infer that their voice influenced change.

This does not mean pretending every issue has been fixed. HE staff understand complexity. They know that universities are navigating financial pressure, regulation, workload challenges and changing workforce needs, so communication should not overpromise. If a theme is difficult, say so. If an action will take time, explain why. If there are constraints, be transparent about them. Let your employees behind the curtain so they feel involved.

Sometimes the most trust-building message is not “we fixed it”. It is: “We heard this. Here is what we are doing. Here is what will take longer. Here is what we cannot change immediately. Here is how we will keep you updated.”


Managers play an incredibly important role in the feedback loop

Institution-wide communication is important, but local conversations often carry more weight.

For many staff, trust is built through their immediate team, school, faculty, department or service area. That means managers need support to interpret results, identify what is genuinely within their control, prioritise a small number of actions and keep people updated on progress.

Without that support, survey results can become another burden placed on already stretched managers. With the right support, they become a basis for better conversations.

Managers play a critical role in local sense‑making, but they cannot carry the weight alone. When senior leaders set clear expectations about action, provide air cover for difficult conversations and model openness about constraints, managers are far better placed to engage teams meaningfully.


Participation planning needs to reflect the whole workforce

A single all-staff message will not reach every group equally.

Participation planning should consider the realities of different roles and working patterns. Academic staff, professional services teams, hourly paid colleagues, estates teams and frontline staff may all need different routes into the survey.


Keep the conversation alive between surveys

The survey should not be the only moment when staff hear about employee voice.

If staff only hear about feedback when leaders want them to complete a survey, participation will always be harder. The listening process needs rhythm between survey cycles.

That could include short progress updates, local action plan check-ins, pulse surveys on targeted issues, staff forums, leadership Q&As, manager-led team conversations or simple “you said, we did, we’re still working on” updates.

Liverpool John Moores University offers a useful example. Before it refreshed its survey approach, previous surveys had been affected by slow results sharing and limited visible follow-through. Its later approach placed more emphasis on giving stakeholders faster access to results, identifying clear areas for action and rebuilding confidence in the process.

Edinburgh Napier University provides another helpful example. Its listening approach includes local Survey Leads, People Partners, university-wide updates, “you said, we did” communication and action planning at both local and strategic levels. Over time, the proportion of colleagues who believed their feedback would lead to change increased from 34% in 2022 to 42% in 2024.

Both examples point to the same conclusion: the post-survey period shapes the next survey’s response rate.


Participation is earned through trust

Response rates are not just a technical measure. They also reflect belief.

Do staff believe leaders want to hear the truth? Do they believe results will be shared? Do they believe managers will be supported to act? Do they believe previous feedback has made any difference?

When the answer to those questions is uncertain, participation becomes fragile.

Participation is earned through trust – and that trust is built through what happens between surveys: visible leadership engagement, supported managers, honest communication and a listening system that treats feedback as something to work with, not just something to measure.

For HR and OD leaders, falling response rates should not be seen as a reason to step back from staff surveys. They should be seen as a reason to strengthen the whole listening cycle: asking relevant questions, making participation accessible, sharing results quickly, supporting managers to discuss insight locally and communicating progress between survey cycles.


People Insight

Jane Tidswell, HE Director

Jane.tidswell@peopleinsight.co.uk






