
Putting children’s voices at the heart of online safety regulation

A study of user representation mechanisms in regulated sectors

Publication date May 2024

The Online Safety Act 2023 placed new legal duties and responsibilities on online service providers to protect users and keep children and young people safe online. Ofcom, the independent regulator, is responsible for enforcing this regulatory framework.

Insights from children and young people are crucial in understanding how technological developments impact children, the risks different design features pose and what works to keep children safe.

The NSPCC worked with Baringa, a consulting firm with expertise across multiple sectors, to explore how children’s voices can be represented and play a meaningful role in online safety regulation.

The report assesses the strengths and weaknesses of different ways in which the voices of users are sought, heard and acted upon in regulatory processes, collectively referred to as ‘user representation mechanisms’.

The report provides Ofcom with a set of recommendations on how children’s voices can best be heard and incorporated into online safety regulation.

These recommendations aim to protect and promote the interests of children, making sure children and young people have a meaningful say in decision-making about online safety.

Authors: NSPCC and Baringa


Key finding

There is a wide range of user representation mechanisms used in other regulated settings to make sure users are heard

Other regulatory settings use a range of established mechanisms to listen to users, improve policy outcomes and ensure the biggest challenges to users are addressed. Ofcom should learn from these mechanisms in implementing the Online Safety Act, ensuring children’s voices are heard and acted upon.


Ofcom should make sure there is a designated entity that advocates for children’s online safety

There isn’t currently a statutory body focusing solely on children’s online safety. This means there is a risk that children’s voices will not be represented systematically throughout Ofcom’s work. Ofcom can address this by ensuring there is a designated entity advocating for children’s safety online.

Ofcom should employ a range of mechanisms to listen to children’s voices and factor these into decision making

Different user representation mechanisms come with different strengths and limitations. By drawing on multiple mechanisms, Ofcom can gather a range of insights from children and ensure that the weaknesses associated with one model do not distort the decision-making process.

User representation mechanisms must be tailored to children specifically

For children to contribute meaningfully to complex policy issues, the ways in which their voices are sought and heard must be tailored to them and their needs. This is often achieved by moving away from traditional means of user engagement and adopting more innovative approaches.

Ofcom’s research programme should continue to harness a range of techniques and aim to reflect the broad range of children who use online services

Ofcom is using a range of research methods to understand children’s online behaviour. It should continue to use innovative techniques to reflect the full range of children’s experiences, especially the experiences of children who are underrepresented.

“Young people should be at the forefront of decision-making. We are the experts in our own lives. It must be Ofcom and tech companies’ duty to listen to young people and our experiences.”

NSPCC Young People’s Board for Change


Please cite as: NSPCC and Baringa (2024) Putting children’s voices at the heart of online safety regulation: a study of user representation mechanisms in regulated sectors. London: NSPCC.