
Children Want to Be Safe Online. Here's How We Make That Real.
Mar 31, 2026
Ask a young person what they're worried about on their phone, and the answers might surprise you.
Not their parents seeing their messages. Not their carers knowing what apps they use. What they're actually worried about is groomers. Spam callers. Strangers who won't stop messaging. People who made them feel unsafe.
We know this not because it sounds plausible, but because we tested it. In 2025, we partnered with Kids Industries to run co-creation workshops in three care settings, spanning residential homes and foster families. We sat in rooms with children aged 8 to 16, from across the care sector, and we listened.
What we found reframes the entire conversation about privacy, liberty, and digital safeguarding for children in care.
The Harm Is Already Here
Let's start with the facts, because the scale of what we're dealing with is too often softened in polite conversation.
Government data published in December 2025 shows that looked-after children accounted for over three quarters of all child sexual exploitation notifications in England between 2024 and 2025. Three quarters. These are children already in the care of the state, already known to agencies, already meant to be protected. And they are being exploited at a rate that should make anyone who works in or commissions children's services stop and reckon with it.
The link to digital is direct. The NCA estimates that between 710,000 and 840,000 adults in the UK pose varying degrees of sexual risk to children. Missing children overwhelmingly communicate through apps like Snapchat and WhatsApp, which present serious challenges for safeguarding teams trying to locate them quickly; over 2,400 children in UK local authority care went missing in 2024 alone. The College of Policing estimates that 7 in 10 young people who have been sexually exploited have also been reported missing.
These are not abstract risks. They are the daily reality for a significant proportion of children in care, and the smartphone in a child's pocket is frequently the channel through which harm reaches them.
What Children Actually Said
When we ran our workshops, we didn't go in with a fixed narrative. We asked children what they were afraid of on their phones. We asked what features they'd actually want. And we listened without filtering.
What came back was striking.
The ability to block any number is universally popular, across every age group from 10 to 16. Several children had direct experience of being persistently contacted by people they feared or didn't trust. One described being made to feel genuinely threatened by someone who kept calling. For these children, the ability to block that person (not just at app level, but at the network) wasn't a restriction on their freedom. It was freedom.
Content filtering? Broadly welcomed, especially by younger children. The analogy that landed best was school WiFi. You can still use the internet. You can still do what you need to do. Some things just don't get through. That framing (safety as infrastructure, not as surveillance) was the one that made sense to them.
Emergency location sharing? Mostly welcomed, particularly by girls. The idea of a phone being able to locate you if “something goes wrong” (not tracking your every move, just being there when it matters) felt like a safety net, not a leash.
The resistance we did encounter was specific and important: older teenagers didn't want to feel constantly watched. They didn't want adults who could abuse the power that comes with oversight. The phrase "Big Brother" came up more than once.
This is not a problem with safeguarding. It's a problem with how safeguarding has historically been implemented, as something done to children, not worked out with them.
The False Choice Between Safety and Liberty
There's a persistent myth in this space: that safety and freedom are in opposition. That protecting a child online necessarily means monitoring them. That the only way to keep them from harm is to lock their world down.
Voop was built to reject that entirely.
Network-level safeguarding does not mean reading a child's messages; Voop never does. It does not mean tracking their location at all times; location data is accessed only if a child is reported missing. It does not mean deciding who they can and can't speak to. It means that when a groomer tries to make contact, the network can detect and flag the risk before harm occurs. It means harmful content doesn't load. It means known bad actors can be blocked at the SIM level: not just within an app that a tech-savvy teenager can uninstall, but at the connection itself.
This is the distinction that matters. Traditional parental control tools are, as one of our co-founders put it, like a screen door: they look okay, but they're easy to kick through. A child in residential care who's been in the system for years knows exactly how to find the workaround. A groomer who targets looked-after children for precisely their vulnerability knows this too.
Voop doesn't give them a workaround to find.
Every feature is designed to pass the legal triple test that applies to any intervention on a looked-after child: Legality, Necessity, and Proportionality. Settings are opt-in by default. Children who are Gillick competent can meaningfully consent to or withdraw from specific protections. The intent, and the design, is a negotiated framework, not a surveillance regime.
Trust Is the Whole Game
One of the clearest signals from our research: the children most receptive to Voop weren't those who'd analysed the feature set. They were the ones who trusted their care providers.
"I trust my care provider, so I would definitely take the phone."
That quote, from an eleven-year-old, carries more strategic weight than any product spec. Trust in the adults around them transferred directly to trust in the tool those adults were offering. And that's as it should be: technology is the enabler, but the relationship is the mechanism.
This shapes how Voop is deployed. Adults introduce the platform transparently (what it does, what it doesn't do, what the settings mean) and negotiate those settings with the child. The research was unambiguous: this approach drives adoption, including among older teenagers who initially pushed back. It's also what the law requires.
When this is done well, something shifts. Carers feel less guilt about the apps they previously had no choice but to allow. Children don’t feel surveilled, because they understand and have agreed to the protections in place. The settings become a shared agreement rather than an imposition. And the actual safety outcomes (reduced exposure to groomers, harmful content, and unwanted contact) are real.
This is what therapeutic safeguarding looks like in practice. Not monitoring for monitoring's sake. A framework for building digital resilience, collaboratively, in a way that respects each child's autonomy while closing the gaps that leave them exposed.
The Bigger Picture
We are at a turning point. Over three quarters of child sexual exploitation notifications in England involve looked-after children. More than 2,400 children in care went missing in 2024. The NCA estimates that the better part of a million individuals in the UK pose sexual risks to children, and they are reaching those children through the same social media platforms that children are on every day.
The smartphone isn't going to disappear. For many children in care it's their primary connection to the outside world, to friends, to siblings, sometimes to the birth family they still love. Taking it away isn't a safeguarding strategy. It's an abdication of one.
What is a safeguarding strategy is building the infrastructure that lets a child send a message, watch what they want, and stay connected, without a groomer or exploiter being one unsolicited DM away.
That's not a restriction on liberty. That's what liberty actually looks like for a child who deserves to be safe.
If you work in residential or foster care, children's social care commissioning, or youth services and want to understand what Voop could mean for the children you support, get in touch.
