Only 9% of girls report feeling safe in online spaces.[1] This report examines how perpetrators can use online services and platforms to identify young female users and target them with abuse. Abusive communication can take the form of sexual harassment, technology-assisted child sexual abuse, bullying, threats and hate speech.
The research set out to identify the design features of online platforms that can facilitate or promote abusive communications with young female users. It explored the design of ten platforms, using fake accounts created for a fictitious 14-year-old girl. The platforms included popular video-sharing, social media, gaming and messaging services.
A combination of methods was used to understand the interaction between service design, perpetrator tactics and the targeting of girls.
- ‘Typical’ user journeys were mapped, documenting the design features that a girl user encounters when registering an account, editing her user information and engaging with other users.
- Relevant literature was scanned to establish what is known about the online victimisation of girls and the behaviours and motivations of those who engage in abusive or harmful communication online.
- Virtual interviews were conducted with eight experts in platform design, cyber safety and security, and children’s online experiences and protection.
- Fake adult accounts were used to explore how discoverable the fictitious 14-year-old girl was on the different platforms, whether adults could contact her and whether platforms currently do anything to reduce the risk of inappropriate or abusive communication.
The report urges the Government and technology companies to put measures in place to mitigate unsafe design features and protect girls on digital platforms.
Authors: NSPCC and PA Consulting
References
1. Plan International (2024) State of Girls’ Rights in the UK (PDF). London: Plan International UK.