Child safety

AppWizard
April 23, 2025
New Jersey has filed a lawsuit against Discord, claiming the company misled parents about its safety controls and failed to protect children from sexual and violent content on its platform. New Jersey Attorney General Matthew Platkin stated that Discord violated consumer protection laws and exposed children to risks from online predators. The lawsuit alleges that Discord's safety features were inadequate and that the app's claims of protecting young users were misleading. In particular, it criticizes the "Safe Direct Messaging" feature for failing to effectively detect or delete explicit content. The lawsuit seeks remedies including an injunction against further violations, civil penalties, and forfeiture of profits generated in New Jersey.
April 23, 2025
Florida Attorney General James Uthmeier has filed a lawsuit against Snapchat due to concerns that the platform is being misused by predators to target minors. The lawsuit claims Snapchat violates Florida's House Bill 3, which prohibits children under 13 from creating accounts and requires parental approval for users aged 14 and 15.
April 8, 2025
Meta is expanding its Teen Accounts initiative to Facebook and Messenger, following the initial rollout on Instagram. This expansion aims to create a safer online environment for users under 18 by automatically setting accounts to private and limiting access to certain features. The rollout begins in the US, UK, Australia, and Canada, with similar protections anticipated for other regions. Current protections for Teen Accounts include restrictions on messaging with strangers, tighter controls over sensitive content, reminders to limit screen time after 60 minutes, and a sleep mode that mutes notifications overnight. Older teens can disable some protections, but users under 16 need parental permission to make changes. Meta also plans to introduce additional protections on Instagram, including restrictions on live broadcasts for minors and a feature that blurs suspected nudity in direct messages; users under 16 will need parental permission to go live or to disable the blurring.
November 14, 2024
Xbox's latest AI transparency report for the first half of 2024 highlights significant advancements in moderation techniques that combine artificial intelligence with human oversight. During this period, Xbox blocked over 19 million pieces of content violating its Community Standards. The moderation system employs a dual AI approach for rapid identification and classification of inappropriate messages. Xbox is also implementing advanced moderation tools in games like Minecraft and Call of Duty to reduce toxic behavior. A new "strike system" has been introduced, where players accumulate strikes for rule violations, with escalating consequences for repeated offenses. Severe violations can lead to immediate permanent suspensions or device bans. The report emphasizes the positive role of AI in enhancing player safety and reducing the burden on human moderators.
October 25, 2024
Roblox is implementing changes to its child safety features in November, requiring parental permission for children aged 13 and under to access chat functionality and for players aged nine and under to play games featuring "moderate violence or crude humour." The platform will also introduce parental accounts so guardians can monitor their children's activities. These updates respond to a report from Hindenburg Research alleging insufficient measures against child predators, which Roblox has denied, asserting the platform's security.
August 22, 2024
Javed Richards, a twelve-year veteran of the Indianapolis Metro Police Department, has been accused of uploading more than forty files containing child pornography via the messaging app Kik, allegedly using a secured email domain to conceal his activities. Kik reported his suspicious behavior to the National Center for Missing and Exploited Children in July. Richards has been arrested and charged with twelve counts of child exploitation, and the police chief has recommended his termination. He is scheduled to appear in court on Thursday.
August 1, 2024
The Malaysian Communications and Multimedia Commission (MCMC) has established a regulatory framework requiring social media and messaging applications with at least eight million Malaysian users to obtain an annual license, effective January 1, 2025. The requirement does not apply to individual users. Major platforms affected include Facebook, Instagram, TikTok, and WhatsApp. The MCMC will use official surveys and publicly available data to determine which platforms meet the threshold. Licensed platforms must protect user data, ensure child safety, address online harms, enhance advertisement transparency, and manage harmful content. The license is valid for one year, and failure to register by the deadline may lead to penalties, including fines of up to RM500,000 or imprisonment for up to five years. Service providers have a five-month grace period to apply. The MCMC can act on license breaches with responses ranging from warnings to prosecution.