
Podcast: An introduction to Report Remove, an online self-reporting tool for young people

Last updated: 21 Mar 2023
Topics: Podcast

Learn more about how to incorporate the tool into your safeguarding response to incidents of sharing nudes

Report Remove is an online tool that under-18s can use to report sexual images or videos of themselves that have been shared online, to see if they can be removed from the internet. Developed by experts from Childline and the Internet Watch Foundation, the tool provides a child-centred and non-judgemental approach to image removal.

Listen to this podcast episode to learn more about why a tool like Report Remove is needed, how the tool works, and how you can signpost young people to the tool as part of their response to incidents of sharing nudes.

The discussion also covers:

  • why a young person might share nude images online
  • important things to remember when responding to an incident of nude image sharing, including not viewing the image yourself
  • how reports are handled when they are made, including the security measures that are in place to ensure confidentiality
  • how Childline supports children who need to use Report Remove, both before and after a report is made
  • what happens if a reported image doesn’t meet the legal threshold for removal.

Listen on YouTube


About the speakers

Samantha Firth has been involved with Childline since 2006. With a background in child development and training, she was initially a volunteer Childline counsellor, becoming a Childline counselling supervisor in 2013 and then moving into the Childline Online Service in 2020.

Zara is a Senior Content Analyst who has worked at the Internet Watch Foundation for over nine years. She spends each working day assessing online images and videos of children suffering sexual abuse and removing them from the Internet.

NSPCC Learning Podcast

Our podcast explores a variety of different child protection issues and invites contributors from the NSPCC and external organisations to talk about what they are doing to keep children and young people safe. Use our episode directory to browse through all our episodes to date.

Don’t forget to subscribe to our podcast through Audioboom, Apple Podcasts, Spotify and YouTube or sign up to our newsletter to hear about new episodes.

Resources mentioned in this episode

> Find out more about the Report Remove tool and download accompanying print-outs

> Watch an introductory video for professionals on how Report Remove works

> Access the Report Remove tool on the Childline website

> Take the Managing incidents of sharing nudes elearning course to learn more about the topic


Welcome to the NSPCC Learning Podcast, where we share learning and expertise in child protection from inside and outside of the organisation. We aim to create debate, encourage reflection and share good practice on how we can all work together to keep babies, children and young people safe.

George Linfield (Producer):
Welcome to the NSPCC Learning Podcast. This episode focuses on Report Remove, an online tool from the NSPCC and the Internet Watch Foundation that under-18s can use to report nude images or videos of themselves that have been shared online to see if they can be removed from the internet.

In just a moment, you'll hear from Chloe O'Connor, Projects Manager within the NSPCC's Child Safety Online Solutions Lab, in conversation with online safety practitioners from Childline and the Internet Watch Foundation. They will explain more about what Report Remove is and how it works; the confidentiality and security around the tool; how professionals who work with children should approach an incident of nude image sharing; and how professionals can use Report Remove to support young people involved in these incidents.

But first, let's hear from two members of the NSPCC's Young People's Board for Change about why the Report Remove tool is important for children and young people.

Young People’s Board for Change Member 1:
I think it's incredibly important that young people have a tool like Report Remove because not every young person has a support system, like their family or even trusted teachers at school, to confide in about these sorts of things.

I think there's a lot of fear in general about these situations being forced to go to the police and a massive deal being made out of a situation that a young person probably already regrets and, like, isn't very happy about in the first place.

So I think the fact that Report Remove is confidential and is easily accessible for so many young people is absolutely amazing and incredibly important.

Young People’s Board for Change Member 2:
The fear of judgement around sexual images for young people is a massive problem that can cause lots of mental health problems among young people. And I think that this is why this service is so important.

And I also think it's really important that it's linked to Childline so that young people then also know that if they need further support or if they want to talk about it, they can then easily access that service afterwards as well.

Chloe O’Connor:
There are lots of reasons why children and young people might share nudes, but it can be incredibly distressing for young people when they lose control of a nude image that's been shared online. It can leave them at risk of further abuse or exploitation, including financially or for further images. Young people can feel revictimised every time an image or video of them is shared, and the IWF are continuously seeing an increase in self-generated child sexual abuse material.

To address this issue, the NSPCC partnered with the Internet Watch Foundation, or IWF, to find a way to support those young people to report nude and sexual images and videos of themselves to see if they could be removed from the internet. With support from age verification platform Yoti, the first version of Report Remove was developed in 2017, and since then the organisations have been working with each other and with children and young people to improve the tool and to make sure that young people know about it.

So to make a report, children need to follow three steps. The first is to follow the instructions to confirm their age. If they're 13 to 17, they'll be asked if they'd like to prove their age using Yoti and a form of ID.

And then step two, they'll log in or create a Childline account so that they can receive updates on their report.

And then step three: report and remove. They share the image or video or a link to it securely with the IWF, who will then view and work to have it removed if it breaks the law. Childline will let the young person know the outcome of their report and provide further support where needed.

In this episode we'll be joined by Zara at the IWF and Sam from Childline's online service team to talk about how Report Remove works and how professionals can support young people to use it. I'll now pass over to them to introduce themselves.

Hi, I'm Zara and I'm a senior analyst at the Internet Watch Foundation. The Internet Watch Foundation is an independent charity based in the UK. We work to make the Internet a safer place. We identify and remove online images and videos of child sexual abuse worldwide.

We also run a hotline which offers a safe place for the public to report to us anonymously. The hotline consists of a team of 16 expert analysts who spend their working week assessing and disrupting the sharing of child sexual abuse material online.

Sam Firth:
My name's Sam and I'm a website supervisor with the Childline online service. Childline is a service for children and young people under the age of 19 in the UK.

I think it's most well-known for its Childline counsellors and the support it can offer over the telephone. But also young people can make an account on the website. The website itself has a wealth of information and tips and advice on a whole range of subjects and topics. It also hosts the Report Remove tool.

Brilliant. Thank you very much. So it would be really good to hear more about Report Remove and why it's needed. But taking a step back first, because Report Remove is there to support young people who have had a nude image shared online: what should professionals do and how can professionals who work with children approach an incident of nude image sharing?

I think the primary things to remember are not to look at the image yourself and to follow any safeguarding policies that your organisation may have. In terms of interacting with that child or young person who has come to you with this problem, it's really important to stay calm, remain non-judgemental, and accept the situation for what it is. It may well be that this young person is experiencing a sense of regret and is aware of the risks they might now be facing. So there's no need for telling off or attempts to educate them on the risks, because that's already happening for them.

I'd just like to add that it's important to remember that creating and sharing nudes of under-18s is illegal. However, the law is there to protect children. So it's important to reassure that young person that they're not in any trouble, to use supportive language, and to let them know that there is a tool out there to help them remove the content online and that the law is there to support them.

One option is to tell that child or young person about Report Remove. And if you've got access to the internet there and then, it'd be really handy if you could show them the tool on the Childline website.

So why is the Report Remove tool needed and why might a young person have shared nude images online?

It's a normal, expected and healthy part of child development to want to understand the changing body. And when we couple that with access to the internet (many children and young people have a phone with a camera and internet access), it's almost inevitable, to an extent, that those two things are going to come together at times.

There are many reasons why a young person may share a nude. They may be sharing their images in a trusting friendship or a trusting relationship. It might be an expression of body confidence. It might feel empowering. They might be doing it for fun. There's a darker side too: they might be sharing nudes because they've been pressured or they've been coerced or they've been bullied.

But they might change their mind about what they've done. They may experience that sense of regret. It may be that trust has been broken, or they're concerned that trust might get broken. Or they may have initially shared a nude to try and prevent further requests or further pressure, and it hasn't worked; that's when they feel they need something to happen, something to change, so they can take back some control of the image.

That's where Report Remove can offer a solution and can offer safeguarding.

Thank you, Sam. So, Zara, you spoke at the beginning about how the IWF can identify this kind of content and take it down. Why is there a need specifically for a service like Report Remove?

Over the years we've seen a growth in the sharing of nude images online of under-18s, so we thought it'd be a good idea to provide a service that not only helps young people access support when things go wrong, but also to be able to use our expertise and our contacts within the internet industry to block these images and stop them from being shared online.

Who is Report Remove for?

Report Remove is for children and young people in the UK who are under the age of 18.

And if a young person wanted to use the tool — they've looked for it online and they've found the tool — what would they then need to do?

So once they get onto the Report Remove page on the Childline website, from there they create or sign into a Childline account. Then they have the option of proving their age with I.D. such as a passport.

Once they've gone through that stage of the reporting process, they need to attach the image, the video or the direct URL for the image or video to the report, and that will go to IWF who will begin their work with it.

Childline never ever see the image or video, but we will make contact with that child or young person to keep them updated with how the report is going and offer them other forms of support as well.

Why are young people asked if they'd like to prove their age?

So children under 13 are not asked to prove their age. That's because our analysts have the visual expertise to assess that the person in the image is underage. However, young people over the age of 13 are asked to choose to prove their age using I.D., because that means the IWF can be certain that the person in the image is under 18, and therefore we can get the image taken down from a lot more places.

But if a child does not have I.D., they should still be encouraged to use Report Remove. This is because the IWF will still make an assessment of the age of that person in the image and in many cases will be able to use this assessment to be certain that it is a child. If IWF can't be certain that the content is of a child, then we can still ask tech companies to take it down. This means that the content can still be removed from lots of places and the young person can still choose to access emotional support from Childline.

From the IWF's perspective, what happens at the point that a young person's made a report?

So as Sam said, the young person will create an account. Once they've created their account, they'll be directed to the secure IWF portal to upload their content. After the report has been made, the young person will receive a case number in their locker, and they can refer back to this if they have any more communications to make about their report. So apart from selecting their age range, the only information the portal gives to IWF is their images and videos or URLs, and these are, as I said, uploaded via a secure portal, so only the analysts here will see their content.

Unfortunately, IWF cannot view content on end-to-end encrypted apps or websites such as WhatsApp and Snapchat, or content that's saved on another person's device. But if the young person still has the image, they should be encouraged to make a report and upload it to us; we will then make sure that it's assessed against UK law, and if it's found to be criminal, we will get it removed as quickly as possible.

That's really helpful. Thank you. And what happens if the image has been changed slightly? For example, if somebody has got a cropped part of the image?

With the technology that we use, it doesn't matter if the image has been doctored in any way or cropped, as you say; we'll still be able to identify that image using the hash that we give it.

That sounds really important. Do we know anything about how young people have responded to Report Remove and if it has made a difference to them?

So I can give you an example of a 15-year-old male who used Report Remove. This person reported a mixture of images and videos. IWF assessed them all to be child sexual abuse material, therefore meeting the threshold to be removed from the internet. Then, as Childline, we were able to feed that back to the young person and offer some emotional support via the counselling team, which they took up, and they were supported by our Childline counsellors for a period of about three months.

They explained to our counsellors that the images and videos had been sent in trust to a female young person. But then, having been in school and reflecting on those general messages about online safety and nudes, they recognised their vulnerability and started to become concerned about how that material might affect them in the future. And they told the Childline counsellor they spoke to about the sense of relief they felt when those images and videos were removed, and about the future safeguarding that offered them as well.

And actually after making their Report Remove report, that same young person got in touch with the platforms they shared those images and videos on and asked that platform provider to remove all other data as well, which the platform agreed to do.

Yeah, so the sense of regret and anxiety, and the feeling that 'life was over', that that young person was experiencing gave way to feeling more reassured and more relieved. So Report Remove was really powerful for that young person.

Thank you for that example of the positive impact that the tool can have, Sam. Zara, you mentioned earlier that you'd be assessing the content against UK law to see if it could be taken down. What happens if you aren't able to take the content down?

We will tell Childline the outcome of our assessment and if it doesn't meet our criteria of breaking the law, then we will let Childline know why it doesn't quite meet our threshold and then they can convey that to the child and offer support in other ways.

Great, thank you. And then Sam, what does that look like from a Childline perspective when you're sending that message to that young person to let them know it couldn't be taken down?

So I try to give them as much information as I can and explain why it didn't quite meet the threshold, because it may be that the image felt nude to them, or there may be nudity in it, but it still doesn't quite meet that threshold for removal. So, I explain that situation to them.

I also give them the options that might be available to take other actions, which may involve going directly to platforms where they know the image might be and requesting that it be taken down under the platform's community rules. But we also offer them emotional support. That can be through our counsellors, or it can be through self-help tools like the Coping Kit. We also have a Calm Zone on the Childline website, which is really useful too, and it's very popular.

That all sounds like really helpful wider support for young people. Is that something that young people need to engage with to use Report Remove? For example, would a young person need to speak to a counsellor if they wanted to report content?

No, there's no need to engage with any other part of Childline. They can use Report Remove and that can be all they use from Childline. But if they do want that bit extra, there are lots of options available to them, and they can pick and choose what feels right for them, what suits them as an individual, what fits in with how they like to communicate and be supported.

Brilliant. So, if a young person is choosing to engage in that extra support or they have just used Report Remove — either way — is there a chance that anybody else would find out that that young person has spoken to Childline or has used Report Remove?

Childline has a really high confidentiality threshold; you can have a look at the confidentiality promise on the website. It's beyond that of most services that work with the same age group, and that same confidentiality promise applies to Report Remove as well. It means we can keep so much more information private.

It's only in very, very rare circumstances that someone who has used Report Remove would have any information passed on to someone else, should any further safeguarding be required. And we would always try to make sure the young person or the child is aware that we're having to do that and why. But again, it's in very, very rare circumstances.

Thank you. And then Zara, are you able to talk a little more about confidentiality on the IWF side? I know you said before that IWF staff would never know who the young person is; they'd only see the confirmation that they're under 18 and receive the images. But in terms of making sure that the content is taken down, how do the IWF ensure that it stays confidential?

So IWF analysts will view the content and assess whether it breaks the law in the UK. Only hashes are sent out to industry and our members to block the content. No one else will see the images. So we use our bespoke hashing tool called Intelligrade, which was built in-house at Internet Watch Foundation, and we will tag these Report Remove images with a special tag to show that it's been self-reported.

This will go out to industry as a 'self-reported image', so when these hashes are shared with law enforcement, they know that it's been self-reported, and that person should not fear a knock on their door for having shared child sexual abuse imagery.

And it's the hashes that are shared with industry members; the image itself is never shared?

That's right. It's just the hash.
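As an aside for technically minded readers: the 'hash' described here behaves like a perceptual hash rather than a cryptographic one, since mild edits to an image still produce a matching or near-matching fingerprint. The IWF's Intelligrade tooling is not public, so the snippet below is only an illustrative toy sketch of the general idea, using a simple 'average hash' over an 8x8 grayscale image and showing that a small brightness change barely moves the hash, while the hash itself reveals nothing about the pixel content.

```python
# Toy sketch of perceptual ("average") hashing. Real systems used by
# hotlines and industry are far more sophisticated; this only shows the
# principle that similar images yield similar compact fingerprints.

def average_hash(pixels):
    """Hash an 8x8 grayscale image given as a list of 64 ints (0-255).
    Each bit records whether a pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return bin(h1 ^ h2).count("1")

# A synthetic "image" and a mildly brightened copy of it.
original = [(i * 4) % 256 for i in range(64)]
brightened = [min(p + 10, 255) for p in original]

distance = hamming_distance(average_hash(original), average_hash(brightened))
# The mild edit leaves the relative brightness pattern intact, so the
# distance stays small; only the 64-bit hash would ever need to be shared.
```

Matching on hashes like this is what lets an image be recognised and blocked without the image itself ever being redistributed.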

Brilliant. Thank you. So does that also mean the risk of the police getting involved is restricted to what you were saying before: only if it's needed as part of a safeguarding response, under Childline's confidentiality promise?

Yes. If we spot any potential safeguarding issues, we can relay that to Childline, who will do their own risk assessment.

And how can young people find Report Remove?

So Report Remove is available on the Childline website. You can also search 'Report Remove Childline' on Google and it should appear at the top of the search results.

Professionals and parents can also visit the NSPCC website to learn more about the tool from an adult's perspective. We're encouraging professionals, such as teachers for example, to become aware of the tool so that they can support young people who might approach them.

Is there anything out there that can help professionals tell young people about Report Remove and how to use it?

The NSPCC has an online elearning course that professionals can access to find out more about the subject of managing incidents of sharing nudes and how to support young people. There's also some printouts that are aimed at professionals that work with young people, again, accessible from NSPCC Learning.

We've also got some videos which can be accessed from the NSPCC website or the Childline website, just to find out a bit more. And again, professionals can look at the Report Remove tool on the website just to become familiar with it, find out where it is, what it looks like, and just become more confident that there is a tool out there to support the young people they work with.

Thank you. That's really helpful because I can imagine it can be quite challenging to have that kind of conversation with young people, especially if you're talking to a large group of young people about something that's potentially sensitive. So that's useful that the video is there to, sort of, do the talking for them and then give all the extra information around it to help answer any questions.

Is there any particular time that's important to share information about Report Remove, for example when an incident has occurred or in general?

I think any time is a good time. We know that the most common age range for using Report Remove is the later teens, but we do get young people under the age of 13 using the tool as well. It's useful to share with young people before an incident occurs, so they are aware of Report Remove and know it's there should they need it; but should they find themselves in a situation where they are worried about an image, finding out about it then is also very helpful. It's a safeguarding tool: it can help them either before a situation occurs or after.

Okay, so if a young person has disclosed that a nude image of them has been shared online, either to a professional, be it a teacher or a coach or anyone like that, or even their parent — that a nude image has been shared and they would like support with that. How can an adult support them with Report Remove?

So the adult can show them the Report Remove tool on the website. They can maybe show them the videos about the tool to help that young person build their confidence. But in terms of making a report, the young person can do that entirely independently. They don't need an adult to give any kind of permission or anything like that.

Report Remove is completely free to use, so there should never be any exchange of money for nudes to be removed from the internet. A young person doesn't have to pay to use Report Remove at all. We've tested the tool in consultation with children and young people, to make sure that it's clear, that it's easy to follow and that it's user friendly, so that it can be used completely independently by young people.

Many young people are using Report Remove, so we know we are providing a valuable service. We would like you to encourage young people who are worried that their nude images have been shared online to report to us, and we will do our best to get those images removed and prevent them from being uploaded again in the future.

Thank you very much. It sounds really powerful that young people have a tool where they can take action themselves to report something that's been happening to them and have that content removed. It's also especially important that it can help stop the content being shared again in the future. As in that example you shared, Sam: even if there wasn't an immediate worry, knowing that it couldn't be shared online again in the future sounds like a really reassuring thing for young people.

So thank you both so much today for joining us to talk about Report Remove and how it works and how it can support young people. If anybody listening to this episode would like to find out more information, all of the resources that we've mentioned today, so those videos and the printouts and the wider information, can be seen in the show notes from today's episode.

The Report Remove tool itself is available on the Childline website. NSPCC Learning also offers an elearning course on managing incidents of sharing nudes. So thank you very much for your time today.

Thank you.

Thank you very much.

Thanks for listening to this NSPCC Learning podcast. At the time of recording, this episode's content was up to date, but the world of safeguarding and child protection is ever changing. So, if you're looking for the most current safeguarding and child protection training, information or resources, please visit NSPCC Learning, our website for professionals.