The messaging app Snapchat is the most widely used platform for online grooming, according to police figures supplied to the children’s charity the National Society for the Prevention of Cruelty to Children (NSPCC).
New figures from 45 UK police forces show 7,062 sexual communication with a child offences were recorded in 2023-24 – an increase of 89 per cent since 2017-18, when the offence first came into force.
The data, obtained by the NSPCC, shows the social media app Snapchat was the platform most commonly used by perpetrators to prey on children online.
Snapchat was involved in almost half of the grooming cases where the means of communication was recorded – 1,824 offences.
Meta-owned platforms were also found to be popular with offenders, with WhatsApp named in 12 per cent of those cases, Facebook and Messenger in 12 per cent and Instagram in 6 per cent.
Ella*, a 13-year-old girl from Glasgow, said: “Snapchat has disappearing messages, and that makes it easier for people to hide things they shouldn’t be doing.
“Another problem is that Snapchat has this feature where you can show your location to everyone. If you’re not careful, you might end up showing where you are to people you don’t know, which is super risky. And honestly, not all the rules in Snapchat are strict, so some people take advantage of that to do bad things.”
The youngest online grooming victim recorded in 2023-24 was a boy who was just five years old.
Researchers found that girls made up the bulk of online grooming victims, accounting for around eight in ten cases where the gender was known in 2023-24.
Thomas*, who was only 14 when he was groomed online, said: “Our first conversation was quite simple. I was just chatting. The only way I can describe it is like having the most supportive person that you could ever meet.
“After about a month, the pressure started to build of him trying to prove that I was gay. That’s when he started sending explicit pictures and pressuring me to send images to him.
“I did send him pictures, but I didn’t like it and I didn’t want to do it anymore. He said he had saved the images and would send them to everyone if I stopped sending more pictures.”
He said he was overwhelmed by a “constant fear in the back” of his mind, adding that, although it was not easy, he blocked the perpetrator on all sites.
Sir Peter Wanless, chief executive of the NSPCC, said: “One year since the Online Safety Act became law and we are still waiting for tech companies to make their platforms safe for children.
“We need ambitious regulation by Ofcom who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.”
He urged the government to bolster the Online Safety Act so Ofcom has greater “legal certainty to tackle child sexual abuse” on platforms like WhatsApp and Snapchat.
Grooming takes place across a range of services, including video games and messaging apps on consoles, dating sites and chatrooms, as well as social media chat applications. Perpetrators then manipulate children into talking on private, encrypted messaging platforms where their abuse goes undetected.
Jess Phillips, Minister for Safeguarding and Violence Against Women and Girls, said: “Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims and the law is clear – the creation, possession and distribution of child sexual abuse images, and grooming a child is illegal.”
An Ofcom spokesperson said that from December, tech firms would be legally required to begin “taking action under the Online Safety Act, and they’ll have to do far more to protect children”.
“Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children,” the spokesperson added.
“We’re prepared to use the full extent of our enforcement powers against any companies that come up short when the time comes.”
A spokesperson for Snapchat said: “Any sexual exploitation of young people is horrific and illegal and we have zero tolerance for it on Snapchat.
“If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities.
“We have extra protections including in-app warnings to make it difficult for teens to be contacted by strangers, and our in-app Family Centre lets parents see who their teens are talking to, and who their friends are.”
*Ella’s and Thomas’s names have been changed to protect their identities
Source: independent.co.uk