Nearly 39,000 child sex abuse image crimes were recorded in England and Wales in 2023/24, with 38,685 offenses logged, an average of more than 100 a day. Snapchat was the most frequently mentioned app in these cases, named in 50% of incidents, followed by Instagram (11%), Facebook (7%), and WhatsApp (6%).

The NSPCC and other organizations are calling for stronger enforcement of the Online Safety Act, citing concerns about a loophole that requires direct messaging services to remove harmful content only where it is deemed "technically feasible." The NSPCC said platforms must take proactive measures rather than become "safe havens" for abusers, particularly highlighting the risks associated with end-to-end encryption.

A 13-year-old victim shared her distressing experience on Snapchat, where she was threatened after sending nude pictures to a stranger.

NSPCC chief executive Chris Sherwood called for immediate government action, criticizing the separate regulations for private messaging services, which he argued allow tech companies to evade responsibility.

The Online Safety Act, passed in 2023, requires social media companies to mitigate illegal and harmful content, but its protective measures are still being implemented. Ofcom said most services should be capable of removing harmful content, while a government spokesperson reiterated the commitment to combating child sexual exploitation and abuse and to implementing the Online Safety Act effectively.