
Tools to combat online harms: protecting children in private messaging spaces

Publication date November 2025

Police data shows that online grooming offences have increased significantly since 2018.1 Part of the risk to children’s safety is perpetrators’ ability to move conversations from public forums, such as social media or gaming platforms, to private messaging spaces or end-to-end encrypted (E2EE) channels, where messages are only accessible to the sender and recipient.

Regulatory gaps in the Online Safety Act also mean that private messaging services offer children fewer protections than public spaces, fuelling fears that harms will migrate from public spaces into private ones.

The NSPCC commissioned research to highlight and analyse the existing and emerging technological solutions that tech platforms can use to prevent, detect and disrupt grooming and online abuse.

The research was conducted by PUBLIC, a digital transformation partner for the public sector. It included an in-depth literature review and a series of interviews and workshops with 25 experts working across tech, academia, research and civil society.

The research report aims to equip policymakers, regulators and the tech industry with an evidence-based framework that supports strategic, collaborative action to protect children online.

The NSPCC’s response to the report includes a set of policy solutions and suggested next steps for tech companies, Ofcom and the UK Government.

Authors: NSPCC and PUBLIC

References

NSPCC (2024) Online grooming crimes against children increase by 89% in six years. [Accessed 05/11/2025].

Key findings

Tools and interventions should tackle every stage of the online grooming process

The report identifies four stages of online grooming:

  • targeting and approach
  • gaining trust
  • trust development and isolation
  • maintaining control. 

Tools and interventions designed to prevent and disrupt online grooming should be used together to target every stage of the online grooming lifecycle.

Online platforms need to share information with each other to keep children safe

Perpetrators of online grooming often move between platforms to evade detection or re-offend. Cross-platform signal sharing would allow platforms to share information about flagged accounts and suspicious behaviour with one another, helping them detect grooming at every stage of the process.
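The report does not prescribe a particular sharing mechanism. One commonly discussed approach is for platforms to exchange hashed account identifiers rather than raw user data, so that a platform can check whether an account has been flagged elsewhere without any platform exposing its users' identities. The sketch below illustrates that idea only; the salt and identifiers are hypothetical.

```python
import hashlib

# Illustrative sketch only: platforms exchange salted hashes of flagged
# account identifiers, never the raw identifiers themselves. The salt
# and identifiers here are hypothetical.
SHARED_SALT = b"industry-agreed-salt"

def signal_hash(account_id: str) -> str:
    """Hash an account identifier for sharing between platforms."""
    return hashlib.sha256(SHARED_SALT + account_id.encode()).hexdigest()

# Platform A flags an account and shares only its hash.
shared_signals = {signal_hash("user@platform-a")}

# Platform B checks one of its local accounts against the shared set.
def is_flagged(account_id: str) -> bool:
    return signal_hash(account_id) in shared_signals

print(is_flagged("user@platform-a"))  # True
print(is_flagged("someone-else"))     # False
```

A real deployment would also need agreed thresholds for what counts as a shareable signal, and safeguards against false positives following an account across platforms.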

A repository of grooming behavioural indicators could help to better detect signs of grooming

Machine learning models could be trained to detect subtle grooming cues, such as common language used by perpetrators and grooming behavioural patterns. Users would then be notified if a message they received was flagged as unsafe.
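A production system would rely on a trained classifier, not hand-written rules. Purely to show the flag-and-notify flow described above, the toy scorer below checks a message against a small, entirely hypothetical repository of behavioural indicators.

```python
# Illustrative sketch only: a real system would use a trained machine
# learning model. These indicator phrases and scores are hypothetical
# stand-ins for a shared repository of grooming behavioural indicators.
INDICATORS = {
    "let's keep this our secret": 0.9,
    "don't tell your parents": 0.9,
    "move to another app": 0.6,
    "how old are you": 0.4,
}

FLAG_THRESHOLD = 0.5

def risk_score(message: str) -> float:
    """Return the highest indicator score matched in the message."""
    text = message.lower()
    return max((score for phrase, score in INDICATORS.items() if phrase in text),
               default=0.0)

def check_message(message: str) -> bool:
    """True if the message should be flagged as potentially unsafe."""
    return risk_score(message) >= FLAG_THRESHOLD

if check_message("Let's keep this our secret, ok?"):
    print("This message was flagged as potentially unsafe.")
```

The value of a shared repository is that each platform's model can be trained and evaluated against the same evidence base, rather than each platform rediscovering the indicators independently.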

On-device safety features can be particularly effective in end-to-end encrypted (E2EE) spaces

On-device solutions, such as safety features that automatically blur nude images or prompt users before they share personal information, can effectively protect children against a range of online harms. They are particularly valuable in private messaging and E2EE environments, where platform-level safety features may be limited.
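The key property of these features is that the check runs locally, before a message is encrypted and sent, so it works even where the platform cannot scan content. A minimal sketch of that flow, with a deliberately crude placeholder standing in for an on-device model:

```python
import re

# Illustrative sketch only: the detector below is a hypothetical
# placeholder for an on-device model. It crudely matches a UK-style
# mobile number to stand in for "personal information".
def contains_personal_info(text: str) -> bool:
    return re.search(r"\b(?:\+44|0)7\d{9}\b", text) is not None

def before_send(text: str) -> str:
    """Runs locally, before the message is encrypted and sent."""
    if contains_personal_info(text):
        return "Prompt: this may contain personal information. Send anyway?"
    return "Sent"

print(before_send("call me on 07123456789"))
print(before_send("see you at school"))
```

Because the check happens on the device, the message content never leaves the user's control unscanned, which is why this pattern is compatible with end-to-end encryption.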

A systemic, collaborative approach is required to build a more secure online world

Tackling online grooming cannot be the responsibility of a single group or organisation. It requires a systemic, collaborative approach involving government, online platforms, safety tech developers, regulators, device manufacturers, end users and civil society organisations such as the NSPCC and the Internet Watch Foundation (IWF).

Citation

Please cite as:

NSPCC and PUBLIC (2025) Tools to combat online harms: research findings report. London: NSPCC.

NSPCC (2025) Protecting children in private messaging spaces: NSPCC response to PUBLIC’s ‘Tools to combat online harms’ report. London: NSPCC.