The Dark Side of Digital Dependence for Young People
- Abianda

On 13 November 2025, the NSPCC released an article detailing findings from their research into the online grooming of children through End-to-End Encrypted (E2EE) messaging platforms, with some extremely concerning statistics. In the same month, OnSide Youth Zones published Generation Isolation, a report on the social lives of young people and the ways digital spaces shape their connections and experiences. It found that young people spend 48% of their free time in their bedroom, and that 38% felt they could develop a genuine connection with someone they had never met in real life.
Reports such as these motivate me to research further and understand better, so that I can share views, concerns, examples of best practice, tools and more on behalf of young people with the professionals who work with them. I struggle to understand how we have arrived at a point where, as youth professionals and trusted adults with a vested interest in the wellbeing of the children and young people in our communities, we see statistics like these and still have not found ways to regularly and widely deliver safe online spaces for young people, or to be more available to them online when they need help.
This is why my contribution to the 16 Days of Activism campaign through Abianda is focused on shining a light on the digital world: how young people use it and the risks it can pose.
Young People, Gender, and Digital Risk — Why Girls and Young Women Are Particularly Vulnerable
The theme of this year's 16 Days of Activism campaign is Digital Violence, and a campaign highlighting how this impacts women and girls globally is long overdue. Digital violence is one of the fastest-growing forms of violence against women and girls, affecting approximately 1 in 3 women worldwide. The NSPCC found in their Targeting Girls Online report (May 2025) that only 9% of girls report feeling safe in online settings.
Digital violence encompasses a wide range of behaviours, including online harassment, trolling, doxxing, revenge porn, catfishing, misogynistic networks and the misuse of AI - particularly for creating "deepfakes" and cloning voices. While this is not an exhaustive list, it highlights the diversity of ways young people - especially girls - can experience harm online. Some recent and worrying statistics include:
80% of children targeted for grooming on E2EE platforms are girls
99% of deepfake intimate image abuses depict women
Nearly 70% of boys aged 11-14 have been exposed to misogynistic and other harmful views online
In the last four weeks*, 20% of teenagers had seen or experienced content that objectifies or demeans women
*The four-week period is not specified in the source: Ofcom Guidance, A safer life for women and girls, published 25 November 2025
The digital world - to me - remains uncharted and unmonitored territory for the majority of adults, which young people are left to navigate with minimal supervision and guidance from trusted adults. It is very rare to find children as young as four navigating the physical spaces and places around them independently, and yet, worryingly, four was the age of the youngest victim in the NSPCC's article on online grooming via E2EE platforms.
Understanding the history of digital platforms contextualises the risks:
1995 - Internet widely available to the public
2006 - Facebook opens to the public
2009 - WhatsApp released
2010 - Instagram released
2011 - Snapchat (as Picaboo) released
2023 - Online Safety Act enacted
25 Jul 2025 - Platforms legally required to protect children online via the Online Safety Act
It was 12 years after Snapchat launched to the public before a parliamentary act focused on online safety for children was passed, and a further 2 years before platforms were legally mandated to protect children online in the UK.
I highlight this timeline because I want to make it clear that we cannot rely solely on legislation and systems built by adults who don't take into account the nuance of situations such as that of Roblox - which I will detail later in this piece - or who don't understand the difficulties police face investigating crimes that occur on E2EE messaging platforms, which are in effect digital black boxes. Police cannot read messages without physical access to devices or accounts; this is what E2EE does. Messages exist only on the devices of the individuals using those accounts and are not stored by platforms such as Snapchat.
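For any technically minded readers, the "digital black box" point can be made concrete with a toy sketch. In end-to-end encryption, only the two devices ever derive the shared key; the platform relays ciphertext it cannot decrypt. The sketch below is purely illustrative - a simplified Diffie-Hellman key agreement with a throwaway XOR cipher, nothing like the vetted protocols real messaging apps use, and all names are hypothetical:

```python
import hashlib
import secrets

# Toy end-to-end encryption sketch: only the two devices ever hold the
# shared key; the platform relays ciphertext it cannot decrypt.
# NOT production cryptography - a teaching toy only.

P = 2 ** 127 - 1  # a known Mersenne prime, standing in for real DH parameters
G = 3

# Each device keeps its private key; only the public halves cross the network.
alice_private = secrets.randbelow(P - 2) + 1
bob_private = secrets.randbelow(P - 2) + 1
alice_public = pow(G, alice_private, P)  # visible to the platform
bob_public = pow(G, bob_private, P)      # visible to the platform

# Both devices derive the same secret; the platform, which saw only the
# public values, cannot.
alice_secret = pow(bob_public, alice_private, P)
bob_secret = pow(alice_public, bob_private, P)

def xor_cipher(data: bytes, shared_secret: int) -> bytes:
    """Encrypt or decrypt with a keystream derived from the shared secret."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(
            shared_secret.to_bytes(16, "big") + counter.to_bytes(4, "big")
        ).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

message = b"meet at the youth club at 6"
ciphertext = xor_cipher(message, alice_secret)  # all the platform ever stores
recovered = xor_cipher(ciphertext, bob_secret)  # only Bob's device can do this
print(recovered.decode())  # -> meet at the youth club at 6
```

The point of the sketch is the asymmetry of knowledge: a court order served on the "platform" here would yield only the ciphertext, which is exactly why investigators need access to a device or account.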
This is one area of risk to children and young people that the adults in their lives need to be paying much better attention to.
Why Telling Young People “Go Offline” Doesn’t Work
Children and young people now navigate a world in which schoolwork, homework, travel, employment, training, and social support networks are all online. The OnSide Youth Zones report details how young people use these spaces for connection and advice, highlighting that 39% of young people turn to AI for advice, support or companionship, and of that 39%, 55% use AI for advice on life skills, mental health and relationships. Additionally, nearly 40% of young people feel they can build a meaningful relationship with someone they have never met offline.
This shows that being online is not a choice for them, but a way of life they have grown up with and have no alternative to.
In sociology we are taught about the social learning phases of human development, with primary socialisation predominantly occurring within the immediate family network and secondary socialisation occurring through individuals outside that family network. I like to think of the digital world as a new form of tertiary socialisation, separate from the informal tertiary socialisation we discuss in sociology. For children and young people this tertiary socialisation isn't something that might occur as a result of major life changes - it is something that will occur, and will continue to occur for future generations: learning important social norms from the internet. This also means the other, informal, tertiary stage will need a new name, but I digress.
From what I have explained, it should be clear that telling a young person to “block” someone or “come offline” is increasingly ineffective when it comes to harassment, bullying, grooming, or exploitation. The avenues for managing these issues are limited because:
Blocking someone doesn't stop them or others from making fake profiles to continue the behaviour
A young person can only remain offline for so long when even their schoolwork requires them to be online
Many youth services maintain professional boundaries that restrict youth workers or trusted adults from interacting with young people via social media or online platforms
The internet is a global asset, and tech companies can exploit jurisdictional loopholes to limit accountability
Support for victims of digital violence is inconsistent, even in the UK - many young people wouldn't know where to go for support if they experienced digital violence on Instagram or Snapchat, particularly if it relates to a deepfake or other compromising image
So if we can't give them that advice, what can we offer young people, and especially young women and girls, instead? For this, I would ask you to bear with me a little longer while I share my experiences on Roblox, which has given me hope, insight and practical suggestions about what we, as adults, can actively do to show we are there for young people when they need us in an online space.
Alternatively, you can skip ahead to my call to action and come back to the Roblox Case Study later.
A Case Study: Roblox
Firstly, Roblox is not a children's game!
This may be what you have heard, or assumed from glancing over someone's shoulder while they play. Roblox is actually a complex platform hosting thousands, even millions, of user-generated games, some public and some private, which a user can invite their connections (friends) to join.
Game developers get access to servers and analytics, as well as an invaluable library and community of fellow developers, for free or at low cost - making games development a more accessible career than many of us realise. Users can purchase in-game and user-generated content items using Robux, which can only be obtained with monetary payment to Roblox. Many developers are independent and new, testing, developing and expanding their skills while building a portfolio of work. They sustain themselves through the income they receive when users spend Robux in their games or on their content.
Sometimes though, the developer can be a child or young person, or me trying out a new hobby given how accessible Roblox makes games development for everyone.
So, of course, Roblox is not without its risks. I have seen fights break out between siblings and groups of young people in my work (outside and before Abianda!) because of Roblox, and they bear striking similarities to the conflicts that arise on other platforms and can affect young people in real life. The digital world is also a playground of creativity and innovation, and some of that creativity - in Roblox - plays out as players bypassing moderation filters to bully, harass, swear, discuss self-harm or abuse they are experiencing, and exchange personal information including contact details.
However, Roblox has empowered the users who create games to moderate content through in-game moderation. I assume this is in recognition that their "safety-by-design" tools might not be enough on their own. This is a voluntary position and depends on whether the game's developer chooses to implement the feature - which highlights one of the loopholes I mentioned earlier in protecting young people from digital violence. It is quite clear that, in this instance, to see real change on Roblox you would need to apply the Online Safety Act to the users developing the games on its platform, not just the company itself.
That said, Roblox's "safety-by-design" tools are highly effective compared to other platforms - adults just need to be aware of where they are, what they are and how to use them.
Call to Action: Supporting Young People Online
Abianda works with young women and girls affected by criminal exploitation and violence and specialises in co-creating safe spaces for young women and girls to share their views, explore themes affecting them and create systems change in the places and spaces around them. We do this through:
Delivery of (currently free) Group work Programmes - across London
Developing training based on the experiences and opinions of young women to upskill professionals - across the UK
Providing 1:1 and other group spaces for young women and girls to feel safe in to deliver systems change work - in Islington
Through this work it has been a pleasure to support our Young Women’s Advisory Group to create a blog post on a digital violence issue that matters to them - do look out for it on Friday 12th December!
However, between this work and the privilege of playing alongside parents who have developed effective strategies for protecting their children, I have developed what I think is a gold standard set of actions that trusted adults and youth professionals can take to safeguard the children and young people they care for online.
I would like to highlight, before sharing the suggestions, that the children of these parents are happy, confident and bold when I see them online, because they have a network of age-appropriate friends vetted by their parents, as well as a network of adults they can call on when they need to in that online space. The parents, in turn, have a network of parents and adults they trust to look after their child, to share a hobby with, and to advise them when their child is online and shouldn't be - much like the neighbourhood mums who caught us in the cornershop as children! To me, the benefits of what they do outweigh the potential negative impacts, and that is why I am sharing these actions.
I acknowledge that some of my suggestions for supporting young people in online spaces might come across as invasions of privacy, but the online spaces available to our children and young people for socialisation and recreation, especially our young women and girls, are in essence public spaces where they interact with other members of the public.
If you have restrictions, agreements and guidance to support a child or young person to be safe and well in physical public spaces - including supervising them where necessary - then the same should be applied to digital public spaces.
So read on for my suggestions - do feel free to reach out to us at Abianda to let us know if you want to take on one or more of them and start actively playing a role in online safety for children and young people, especially young women and girls today!
For trusted adults, particularly parents and caregivers:
To help protect children in digital spaces, it is essential to actively understand and engage with the platforms they use:
Understand the platform and adjust settings: For example, on Roblox, consider turning off in-experience chat and the messenger function if you do not want your child messaging others. Adjust game ratings to control accessible content, and verify the account belongs to your child.
Maintain account access: Keep login details to review settings periodically and support moderation claims against your child’s account. If messaging is a concern, access to the account allows you to monitor communications safely.
Play alongside your child: Participate in games to bond and observe who your child is interacting with online and what games they play. If this is a social media challenge, participate in the challenge with your child - you won’t be the only parent doing it!
Adjust privacy settings on your child’s account: Ensure you can see when and where (which game, if Roblox) your child is online, and do not be afraid to add the friends on their connections list and check who they are, especially if you are unsure whether an account genuinely belongs to a child.
For professionals and youth workers:
To support young women and girls effectively, combine understanding of digital risk with trauma-informed practice and youth participation:
Engage with digital youth work strategies: Read the National Youth Agency’s Digital Youth Work Strategy and commit to piloting at least one form of digital youth work this year.
Build expertise in criminal exploitation: Book a course or training through Abianda to understand how criminal exploitation affects young women and girls specifically, and learn effective tools and methods to provide support.
Use group work to amplify young voices: Invite Abianda to facilitate a group work programme at your setting that allows girls and young women to tell your organisation what services and support they need, exploring topics related to their direct or indirect experiences of criminal exploitation.
By taking these steps, trusted adults and professionals can help ensure young people — particularly girls and young women — are safer online, feel empowered to seek help, and have their voices central to the design of the support they receive.
By Rachel, Abianda Participation Practitioner



