Intro:
Welcome to the NSPCC Learning Podcast, where we share learning and expertise in child protection from inside and outside of the organisation. We aim to create debate, encourage reflection and share good practice on how we can all work together to keep babies, children and young people safe.
Producer:
Welcome to the NSPCC Learning Podcast. The online world is constantly changing and young people are often at the cutting edge of these changes. They're often more informed about online trends than adults, and are equipped with unique knowledge and understanding of what they need to know and do to stay safe online. It is important to listen to them and try and incorporate their voice into your online safety work.
In this podcast episode, recorded in January 2025, we'll be doing just that. You'll hear from two members of the Voice of Online Youth, a group of young people aged 13 to 17 who help advise the NSPCC and the wider online safety sector on how to help children have safe and happy experiences online.
Will and Zara will provide their own insights and experiences on what online life is like from their perspective, including what they get up to online; what steps they take to stay safe; what worries them about being online; and what they'd like to learn more about when it comes to online safety. I'll hand over now to Will and Zara to introduce themselves.
Will:
Hi, my name is Will and I guess I joined the Voice of Online Youth because I really think it's great to give young people a voice when often in this space they're overlooked, even though online safety is centred around young people because they're the ones often online. I really enjoy public speaking, so I thought this would be a great opportunity to give a voice to young people when it comes to online safety.
Zara:
Hi, my name is Zara and I joined the Voice of Online Youth because I want to make the internet a safer and more positive place for people my age. A lot of the decisions about online platforms are made by adults who don't fully understand what we go through or what matters to us, and I want to make sure our voices are heard.
Producer:
Thank you both so much for joining us on the podcast. In this podcast, we want to try and provide an understanding of what online life is like for young people.
So, my first question: what sorts of activities do you get up to online?
Will:
So online nowadays is almost everything that I do because it's not just playing video games and talking with friends and things online. It's also that school is involved online. I look at all my homework online and I do quizzes online.
There's quite a lot online that I feel like people wouldn't really notice is there. It's just sort of passively online and it's there.
Zara:
I spend a lot of time online doing a mix of things. Social media is probably the main one. I keep in touch with friends or just see what's trending. I also stream music or watch shows, but I feel like when adults think of online... Well, uses for the internet, they don't have a nuanced view. They think it's just for entertainment.
But, similar to what Will said, we have a lot of our school work on there or anything educational. It's not just one singular thing.
Will:
It's not like being online is a necessity. It's just really, really useful. And it's a tool that a lot of people can use and get a lot of use out of. Sure, I could just dial up my friends on the home phone, but it's a lot easier to just FaceTime them and then be able to see them as well. And it's more convenient.
Producer:
What's the one thing that you wouldn't be able to do without when it comes to being online?
Zara:
I feel like for me, schoolwork is one of the big ones because, frankly, I don't see anyone going to the library and getting textbooks; most of them are online for me. And if I couldn't do that, I'd have to stay after school and then it would make my journey longer and it would affect other things in my life.
And also, just talking to friends. Some of us live really far away, so we can't really meet person to person, and it's just, all in all, more convenient.
Will:
I totally agree with Zara that schoolwork is really useful to have online, but I'd say if there's one thing that would have to, you know, take the top, it would probably just be as simple as talking to your friends online. I think it's really overlooked that you can just send people messages and they'll get back to you really quickly. Whereas, you know, before the internet was really a thing, that just couldn't happen.
Producer:
So it's clear that being able to go online is important to both of you. My next question is more to do with online safety. What makes you feel safe and comfortable online?
Zara:
I feel comfortable online when I know the platform I'm using really well. Because if I don't, I feel if I see something bad, how am I going to report it? Or how am I going to tell someone? Familiarity. It makes it easier to navigate and I know how to handle settings like privacy controls or reporting tools.
It's also reassuring when I'm part of communities that are really well moderated and there's no toxic behaviour or, well, it's not known for toxic behaviour. I generally feel safest when I'm in control of what I see or what I share or what people say to me.
Will:
Zara's pretty much hit the nail on the pin there— on the pin? On the head. But yeah, I feel like there's not too many times where I'm feeling uncomfortable online now that I'm just familiar with everything that I see. Yeah, it's just so normalised that, because I'm in control of pretty much everything that I see, there's nothing terrible that makes me feel really uncomfortable.
Producer:
I wanted to pick up on Zara's point about moderation and privacy settings. Do you both take control of your own privacy settings, or is this something that maybe your parents were involved in? Or maybe a bit of both?
Zara:
I feel like when I was younger, my mum would always make an account for me and she just put me on child's settings and then she'd think I'd be safe. Now I just... More than that I just alter my 'For You' page more on things I don't want to see, rather than just outright block everything that's not appropriate, because then I can just feel safe and I can still find stuff that I like and be entertained by.
Producer:
And just quickly, when you say the 'For You' page, what app are you talking about here? Is it Instagram? TikTok?
Zara:
I still use TikTok and it's got a button where you can say, "I'm not interested in this". And whenever I get a new app like Instagram — I got Instagram a few days ago and the whole day I just spent sorting it out, going through a bunch of videos, saying what I like and don't like. So then the next day, it's kind of used to content that I enjoy, and I think that's really important because otherwise, you know, algorithms won't work or anything.
Producer:
I'm really struck by what you say, Zara, about manipulating the algorithm so that you're getting the content that you want to see.
When it comes to things like moderation, are you approaching it from a "I know what I want to see" kind of way? Or is it more of a "I know I don't want to see" kind of way? And are there settings within the apps that you will set up to block any content that you don't really want to see?
Zara:
I feel like it's a mix of both. For example, TikTok: if you just say "I'm not interested" in stuff, the stuff that you are interested in will just be there. But other apps work a bit differently. For example, Instagram: I like the videos that I like and enjoy, so then those mainly come up. But for TikTok it's the other way around. I just say "I'm not interested in this". So it just varies across apps.
Will:
Well, I mean, it was really interesting seeing what Zara had to say about manipulating your algorithm, because I kind of do that too. You know, you have to like the videos that you like, straight away just scroll past videos that you wouldn't find interesting, even if— Say you stumble upon a video that has, you know, completely opposing views to you, and you want to view that video just out of curiosity because what have I got to say? I don't like this kind of content, but still, I'm interested in the opposing views to mine. And then you just suddenly get loads of videos like that, hundreds of videos that are just not your views. And then you're like, "oh no, what's happened here?"
So you have to be really careful with what you watch and what you like and what you dislike.
Zara:
I think yesterday, I was watching something out of curiosity, and then the next day my whole 'For You' page filled with that. And then it didn't let me go back to my original 'For You' page for a couple of days. I think that's definitely a really good point.
Producer:
And do you worry about algorithms and how they're affecting what you see?
Zara:
Maybe not for me, but I feel like for other people, because if they see that repeatedly... I don't know about you, but if you see a story and you hear it from one person, I feel like you're more likely to believe it from the first person you heard for some reason. And I feel like if other people saw algorithms just, you know, only conveying one view, they'll be more likely to believe that. And then it just creates a whole cycle.
Will:
Yeah. I was just going to say it is kind of crazy how fickle algorithms are. The other day, I was just trying to make a point to my TikTok that I want to watch lots of stand up comedians. So I was following all these great stand up comedians, I was like, "oh, this is great".
And then the next day, because of Donald Trump's inauguration, my whole algorithm was filled with content about that. And then suddenly, now that it isn't as relevant anymore, I'm now getting stand up comedians again. So I just wish it was more permanent. But it still needs to be easy to change.
Producer:
Just really fascinating to hear you both talk about the algorithms and how you can manipulate them or what changes you'd like to see.
My next question — we sort of touched on this because it's a flip of the question I've just asked — what makes you feel less comfortable online?
Will:
Yeah, I guess fairly similarly to what we've already said, just videos coming up that we don't feel comfortable with that are hard to push away. Especially when if you watch the entire video, even if it's just by accident, like you leave it sitting on the counter and it runs twice, then you're going to get lots of videos like that one. So yeah, I guess better control would be very helpful.
Zara:
Something that makes me feel less comfortable is when there's misinformation, but when it's a large amount. For example, for my 'For You' page, it's a mixture of what I like and what's trending. And sometimes people like stuff that isn't true or AI deepfakes or something.
For example, my dad sent me a really popular video and I was like, "oh, this is going to be funny." It was an AI image. And my dad's like, "oh my God, that's so cool. Have you seen this before?" And I was like, "no, dad, it's AI".
Producer:
Is it something that you'd like to learn more about in school — the impacts of misinformation or how these things work, how these technologies work?
Zara:
Yeah, definitely. Today I had PD day — which is personal development, but I think some schools have PSHE — and the theme was online safety. I don't think we did anything on AI or misinformation. It was just cyberbullying, don't post certain stuff, digital footprint; just like the normal things that we learn in school.
And if you do want to learn about it, I feel like you have to actually research it. And then sometimes the research that you do is biased. So, I feel like it'd be really good if you had a reliable source of information on how to tell what's AI or not.
Will:
Like Zara, I also think it's interesting that schools are still, in today's day, repeating the same information that lots of us have heard again and again about cyberbullying and stuff that, you know, we all know is important, but we've heard it a million times. Whereas I've never heard a teacher mention AI to me in a way that wasn't just conversational.
So, it would be really interesting to see the curriculum updated to mention current topics and affairs. You know, artificial intelligence is sort of a sci-fi movie concept, but it's real today. It's reality, and we need to be taught about it.
Zara:
I 100% agree with Will, because the internet and the online world is evolving, but our curriculum isn't. For example, my teacher — she was giving an example of an app and she said 'MySpace'. And then she said 'Vine'. And mind you, most of us are Gen Alpha, Gen Z, so none of us knew what it was unless we'd seen skits. And we were so confused, because she was giving us an example of how to report something, and we were so confused the whole time.
Producer:
Yeah. And with young people using so many different apps, it can be tricky to have knowledge of all of them.
When you make those decisions to say, "oh, I'm going to use this new app", do you talk to anyone about it? Do you talk to your teachers?
Will:
I mean, I guess when I first got Instagram, not a lot of people that I knew were on it. So there's barely anyone to talk to about it, to be fair. But, you know, you'll mention it to your friends like, "oh, I've got Instagram and there's this feature that I didn't know is there, and there's this and that."
Zara:
I don't think anyone thinks about talking to a teacher. If I was getting Instagram, I didn't go to my form tutor and ask "Miss, should I get Insta?" I just thought of getting it. So I just went on my app and downloaded it. But I feel like even if I did, I feel like they'd tell me not to or say something, you know, negative. Or if I said I'm going to get something, they're just going to be like, "make sure to set, you know, blocks and stuff on there."
I feel like we only hear teachers talk about how to stay safe online, like it's so bad that we need to be taught how to stay safe, not to enjoy it. It's normalised that it's bad and we should know how to deal with it, rather than what things it could be useful for.
Will:
Yeah, definitely there needs to be more positivity about being online. But again, you can't just completely ignore the negatives, because there are negatives and they are important. It's sort of important to have a non-biased view of things and a balanced view of things. Otherwise it's just not going to work.
Producer:
Yeah, my next question was going to be more about those worries and concerns and possibly negatives rather than positives. What are the things about being online that concern you the most right now?
Will:
I guess the thing that is most concerning is people's lack of awareness. If you look online and you see something, I can usually tell if something's AI generated, but it's getting better and better. But there's so many people in the comments section, they're like, "wow, is this real? I never knew that, that's crazy!" Because why wouldn't you believe it? Because it's on your phone, and BBC News is on your phone. It's only one click away from your TikTok feed.
Zara:
That's a really good point, because if you think about concerns, people are just going to say generated images or deepfakes. But the actual aspect of people not being aware of how bad it is, is much scarier to think about.
This morning we had an assembly on how AI was, you know, manipulating people, which is actually quite good for my school; quite progressive to talk about AI. They gave an example that someone cloned David Attenborough's voice and the person next to me said, "oh, that's so funny. Imagine people are listening to that and getting pranked." I think she gave an example of just David Attenborough saying, "oh no, the house cat has gone extinct." And then people, you know, believing that. And you feel like it's normal to make jokes but I don't think a lot of people understand that it could be used for much more malicious purposes.
Producer:
And speaking of malicious purposes, are either of you worried about the potential use of generative AI in relation to cyberbullying?
Will:
I mean, it's easy to worry about it because it's already happening. You know, people are getting images made of them that are just— obviously they aren't them. But to a lot of people, they see that image straight away and think, "that's my friend. Why are they doing that?" And it's concerning the effects that AI can have and is already having.
But, as well, you have to look to the future. Because if AI is like this now, and you compare that to AI from last year, there's already been so much progression that it's only going to get more and more realistic. So change needs to happen now to stop more concern in the future.
Producer:
How would you advise schools approach the problem of generative AI?
Will:
I mean, it would benefit a lot of people if they just had one lesson in their entire life that said, you know, how to spot AI images, look for extra fingers and that kind of thing. Because a lot of people... Even though a lot of people would assume everyone from the younger generation is super-prepared and knows everything about being online and they're constantly online, even people who are constantly online don't have to be 100% informed about the risks or about AI.
Zara:
Everything that Will said I agree with, especially about how it has to be done now. For example, how you have to write that something is AI generated, or if something, you know, is made by AI. Even companies now are getting away with it. For example, ChatGPT. It says ChatGPT can make mistakes and blah blah blah. But it's in really tiny font and it's in a separate tab thing, you have to click an arrow for it to appear, and you have to actively search for it to say that it's AI generated. Especially in images where you can make AI generated images, it's not even a watermark anymore, it's a tiny, like, writing bit on the edge of it.
Will:
Yeah. On Zara's point, what she was saying about how ChatGPT has a very small marker saying "this could not be true." AIs like ChatGPT and Google Gemini, they'll present pretty much everything as solid fact. I know it went viral a short while ago: if you ask ChatGPT how many R's there are in the word ‘strawberry’, it literally doesn't know. But it'll say to you, "oh, there are five R's in ‘strawberry’", and it's not going to say "sorry, I just don't know how to answer that" because it can't. It's not programmed to say "no" to anything that isn't illegal, basically.
Zara:
There was someone saying "one plus one equals three" to ChatGPT so many times that it just said, "yeah, it is three". So many other people did that. And now if you ask, there's a chance that it might say "three", because ChatGPT is not based just on AI, it's kind of general knowledge and it just picks the most looked at.
For example, if you just Google on Google Gemini, it just picks the top answers and then puts it there. So it's not definite fact, but people take it as definite fact. And if you're googling something, you don't want to go into an article, you want it in a tiny little square at the top of your page, because I don't think people really take the initiative to check if the information is true or not. They just, you know, go with it.
Producer:
Moving on from generative AI a little bit, I wonder if there are any other things about being online that concern you at the moment?
Will:
I guess if there was one other thing other than generative AI that's concerning, it's the amount of control that you don't have over what you see.
I know we were mentioning earlier about trying to manipulate your algorithm into giving you the videos that you want. But nowadays people can buy views. Like literally, if you upload a TikTok video, it'll say, "oh, this video is doing great. Want to make it do greater? Pay us £5 and then we'll give it 100,000 views." That's scary that you can just manipulate what other people see. And that means that you're not in control of what you see either.
Zara:
I know how we were talking about, you know, manipulating your 'For You' page, but I don't think anyone really understands how algorithms actually work or why it's tailored to me. I know it's just because, you know, if you like something, it will come up to you, but also the impact of following people.
For example, when I got Instagram, I followed my friends and the videos that they watched came to me even before they sent it to me. So I'd really love to learn how that works as well.
Producer:
How does that make you feel when you're getting served the same content that your friend is before they even send it to you? Does that feel strange?
Zara:
I think it does feel strange, because if you think about the 'For You' page, it literally has the word "you", not "for your friend" page. But, sometimes... I used to think it was kind of funny because, before my friend would send me a meme, I could say "I watched that already". But now it's kind of... I don't really like how it impacts stuff, because even though they are my friends, some stuff I don't agree with or I don't find as funny, but it'll just be on my 'For You' page. So I think it's something that needs to be looked at.
Will:
Yeah, I totally agree with you Zara on that point because I've been having the same problem recently. My friend sends me a video and 80% of the time I've already seen it. Is this really a 'for me' page? Because it seems like everyone in my friendship group's getting the exact same posts, and that's just strange to me.
Producer:
And do either of you worry about the data that these apps and these online companies are gathering to power their algorithms? Is that a concern for either of you?
Will:
I feel like you're kind of aware that they're taking your data. And because it, you know, says it in the name — it's 'for you'. It feels like I don't mind too much that they're taking it because at least they're giving me videos. But it is concerning when you don't know where that data might end up outside of the platform that you're on.
Zara:
I feel like you obviously consent to stuff. To be on the app, you have to click 'consent', otherwise you can't use the app itself. And I think people just complain for the point of complaining, but you can actively look for it.
But I remember we were working on our manifesto last year, and we were talking about how a lot of young people, or just people in general, don't really look at it; they just click 'accept' because, you know, I want to see the videos. And I think— I think it was Will's group, they had the idea of having a little pop-up tab of the vendors and stuff, so then people can see where the data gets sent to. But it's only a summary of it, so then people are more likely to read it.
Producer:
I'm conscious we're nearly at the end of our time, but I have one concluding question for you both. What are the top three things you think adults should know about what online life is like for young people?
Will:
I've got three. They're not really in any kind of particular order. So, my first one being that generative AI isn't just homework answers. It is a lot more than that and it can be genuinely quite dangerous.
My second one is that algorithms are a big part of young people's lives and they aren't super easy to control or get out of your life. It's not like you can just put the phone down a lot of the time. That doesn't mean it's super hard to either. It's like... It's a controllable yet uncontrollable part of your life.
So, I guess my last point is it's not as easy as just hitting block or turning off your phone to stop an issue that's happening because you can't just switch off your phone for the rest of your life. I feel like even adults would know that. You know, I know so many adults that are scrolling on Instagram Reels all day and then belittling young people for scrolling on TikTok all day — it's the same thing. But you can't just hit block either, because, you know, people constantly make new accounts, they find new ways, and they adapt around changes.
Producer:
Thanks, Will. Zara, what are your thoughts on this question?
Zara:
I have one key point that goes into three. Online safety isn't just cyberbullying, because I feel like that's the main focus of every single PD lesson. And also, the complexity of cyberbullying. It's not just someone blatantly saying, "I don't like you." It kind of includes microaggressions and even stuff that teachers don't realise, like if someone didn't tag you in a post, that would hurt, or something like that. And it's not just someone commenting on your post or DMing you privately, but it's other stuff, more detailed stuff. And I feel like schools should be more aware of that.
And also, addictive behaviour: I feel like schools think that we're the ones at fault for it, like we're actively thinking, "oh I'm going to go on my phone. I'm not going to do my homework. I'm going to spend six hours on TikTok in my bed." But I don't think they understand the role of algorithms. It's designed to keep us scrolling for hours. That just means they're doing their job well. So it's not just us at fault.
And also the emotional impact of misinformation. I feel like a lot of schools just teach us not to spread information because, you know, it could be used maliciously. But even hearing it, a lot of people feel emotionally connected to something. And if they hear something against that that isn't real; or they know it isn't real and then loads of people are agreeing with that; or your friends have seen it and they're like, "yeah, that's definitely true, I saw it ten times", it could be overwhelming and it's really confusing.
Producer:
So that brings us to the end of our discussion today. Zara, Will, thank you so much for joining us on the podcast and providing a really interesting insight into your thoughts and opinions on the online world and online safety.
Will:
Thank you so much for having us.
Zara:
Thank you.
Producer:
If you've enjoyed this episode and would like to hear more from the NSPCC's Voice of Online Youth, you can find a link to their manifesto for change in the podcast shownotes. You'll also find links to lots of other resources and training to help you keep children safe online. Thanks for listening.
Outro:
Thanks for listening to this NSPCC Learning podcast. At the time of recording, this episode’s content was up to date but the world of safeguarding and child protection is ever changing – so, if you're looking for the most current safeguarding and child protection training, information or resources, please visit our website for professionals at nspcc.org.uk/learning.