‘Loophole’ in law on messaging apps leaves children vulnerable to sexual abuse, says NSPCC

A report released by the NSPCC documents nearly 39,000 child sex abuse image crimes in the past year, highlighting what the charity calls a significant gap in the current legal framework that leaves children exposed on messaging platforms. Snapchat was the most frequently identified app in these cases, underscoring the particular risks of one-to-one messaging services.

Data Insights and Concerns

According to Home Office statistics, 38,685 such crimes were recorded in England and Wales during the 2023/24 period, an average of more than 100 incidents per day. Police identified the messaging service used in just over 7,300 of these cases. Of those, 50% occurred on Snapchat, 11% on Instagram, 7% on Facebook, and 6% on WhatsApp.

The NSPCC, alongside other organizations such as Barnardo’s, has reached out to the home secretary and technology secretary, advocating for stronger enforcement of the Online Safety Act. While Ofcom is tasked with overseeing the new legislation, charities have raised concerns regarding a loophole in the recent code of practice. This loophole permits direct messaging services to remove harmful content only if it is deemed “technically feasible.”

Challenges with Encryption

The NSPCC says platforms must take proactive measures to avoid becoming “safe havens” for abusers. It noted that services using end-to-end encryption can shield child sexual abuse material from detection, because the companies themselves cannot view the messages exchanged.

The risks are illustrated by the account of a 13-year-old victim, who described her experience with a stranger on Snapchat: “I sent nude pics and videos to a stranger I met on Snapchat. I think he’s in his thirties. I don’t know what to do next. I told him I didn’t want to send him any more pictures and he started threatening me, telling me that he’ll post the pictures online.”

Calls for Urgent Action

NSPCC chief executive Chris Sherwood described the situation as “deeply alarming” and urged the government to act immediately. He argued that having separate rules for private messaging services lets tech companies avoid responsibility for implementing robust child protections, allowing these crimes to persist on their platforms despite the introduction of the Online Safety Act.

Although the act, passed in 2023, mandates social media companies to mitigate illegal and harmful content, its protective measures are only beginning to take effect through Ofcom’s codes of practice. Last month, the Internet Watch Foundation (IWF) echoed similar sentiments, stating that the codes provide platforms with a “blatant get-out clause.”

In response, an Ofcom spokesperson expressed confidence that most services would be capable of removing harmful content. “The law states that measures in our codes of practice must be technically feasible,” they noted, while also asserting that platforms would be held accountable for non-compliance. Measures to protect children will include the obligation to review and report child sexual abuse material to law enforcement when identified.

A government spokesperson reiterated the commitment to combat child sexual exploitation and abuse, emphasizing that UK law unequivocally prohibits such activities, including on social media platforms. They affirmed the government’s dedication to the effective implementation of the Online Safety Act, aiming to make the UK the safest online environment for children. The spokesperson concluded by stating that tech companies cannot use their design choices as a justification for failing to eliminate these heinous crimes and that further actions will be taken to safeguard children from online predators.
