More children are being groomed on Instagram than on other social media platforms, new figures suggest, leading to calls for tech companies to face stronger child welfare regulations.
Overall, police in England and Wales have recorded more than 5,000 cases of online grooming since sexual communication with a child became a crime in April 2017, according to child protection charity the NSPCC.
Instagram was used in a third of cases where a method was disclosed, while Facebook was used in 23% of cases and Snapchat in 14%. The number of Instagram cases recorded by police rose by 200% in the space of a year.
Girls aged 12 to 15 were most likely to be targeted by groomers, and victims included children as young as five years old, according to the group, which based its figures on freedom of information requests to 39 of the 43 police forces in England and Wales.
A fifth of victims were 11 years old or younger.
“These figures are overwhelming evidence that keeping children safe cannot be left to social networks,” Peter Wanless, the NSPCC’s chief executive, said in a statement.
The charity attacked “10 years of failed self-regulation by social networks” and urged UK Home Secretary Sajid Javid to create an independent regulator for the sites to enact mandatory child welfare rules.
“We cannot wait for the next tragedy before tech companies are made to act. It is hugely concerning to see the sharp spike in grooming offences on Instagram, and it is vital that the platform designs basic protection more carefully into the service it offers young people.”
Overall, the group found a 50% increase in offences in the most recent six-month period available, compared with the same period the previous year.
But the true number of cases may be higher, the charity noted, because instances can go unreported and police-recorded offences will therefore “not fully reflect the scale of the issue.”
The findings nonetheless add to the widespread criticism social media companies have faced in recent months.
Numerous groups have called for Facebook, Instagram and other platforms to face independent regulation in the past year, citing failings over child safety, data protection and the spread of misinformation.
Last month, UK Health Secretary Matt Hancock urged the sites to take action over content that could encourage children to self-harm or commit suicide.
Responding to the NSPCC report, a spokesperson for Facebook and Instagram said: “Keeping young people safe on our platforms is our top priority and child exploitation of any kind is not allowed. We use advanced technology and work closely with the police and CEOP (Child Exploitation and Online Protection Command) to aggressively fight this type of content and protect young people.”
A spokesperson for Snapchat said: “While we understand that any service that facilitates private communication has the potential to be abused, we go to great lengths — within the bounds of existing law — to try to prevent and respond quickly to this type of illegal activity on our platform.”