New Jersey has initiated legal action against the popular messaging platform Discord, claiming that the company’s measures to protect young users are insufficient and ineffective. Attorney General Matt Platkin announced the lawsuit at a press conference in Newark, where he and Cari Fais, director of the New Jersey Division of Consumer Affairs, outlined serious concerns about the app’s safety features.
The lawsuit alleges that lapses in Discord’s safety protocols have exposed children to violent content, harassment, and even sexual abuse. “Discord built this massive user base by touting its application as a ‘safe space for teens,’ claiming that it ‘makes [its] products safe spaces by design and default,’” the lawsuit states. “However, Discord has misled children and parents about the effectiveness of the application’s safety features, leaving children vulnerable to harassment, abuse, and sexual exploitation by predators who lurk on the platform.”
With over 200 million monthly active users, Discord has become a favored platform among gamers, offering text messaging, voice, and video chat functionalities. In response to the lawsuit, a spokesperson for Discord expressed surprise at the legal action and affirmed the company’s commitment to enhancing safety on the platform. “Discord is proud of our continuous efforts and investments in features and tools that help make Discord safer,” the spokesperson stated. “We dispute the claims in the lawsuit and look forward to defending the action in court.”
Concerns Over Safety Features
The lawsuit specifically targets several features that New Jersey argues are prone to misuse. One major point of contention is Discord’s age-verification process, which the state claims is ineffective. Although the platform advertises a minimum age requirement of 13, many children below that age reportedly misrepresent their age and gain access.
“Simple verification measures could have prevented predators from creating false accounts and kept children under 13 off the app more effectively,” the attorney general’s press release noted. “Nevertheless, Discord actively chose not to bolster its age verification process for years and has allowed children under the age of 13 to operate freely on the app, despite their vulnerability to sexual predators.”
Additionally, the lawsuit contends that the app’s “Safe Direct Messaging” feature did not function as promised. While Discord claimed that this tool would “scan” messages for explicit content, the state argues that it was ineffective, allowing a substantial amount of explicit material to go undetected and unremoved.
New Jersey’s legal actions against tech companies are not limited to Discord; the state has previously filed lawsuits against Meta and TikTok. “Simply put, Discord has promised parents safety while simultaneously making deliberate choices about its app’s design and default settings, including Safe Direct Messaging and age verification systems, that broke those promises,” the release states. “As a result of Discord’s decisions, thousands of users were misled into signing up, believing they or their children would be safe, when they were really anything but.”