New Jersey officials have filed a lawsuit against the popular messaging platform Discord, alleging that the company misled parents about the effectiveness of its safety controls and obscured the risks children face while using the application.
New Jersey Attorney General Matthew Platkin stated that the makers of Discord violated the state’s consumer protection laws, exposing New Jersey children to sexual and violent content and leaving them vulnerable to online predators. “Discord markets itself as a safe space for children, despite being fully aware that the application’s misleading safety settings and lax oversight have made it a prime hunting ground for online predators seeking easy access to children,” Platkin remarked. “These deceptive claims regarding its safety settings have allowed Discord to attract a growing number of children to use its application, where they are at risk. We intend to put a stop to this unlawful conduct and hold Discord accountable for the harm it has caused our children.”
The complaint, filed on Thursday, alleges multiple violations of the New Jersey Consumer Fraud Act by Discord.
Discord ‘could not and did not’ protect young users
The lawsuit claims that Discord was aware its safety features and policies were inadequate in protecting its young user base yet chose not to improve them. Platkin’s office specifically pointed out that Discord misled both parents and children about the safety settings for direct messages.
According to court documents, a year-long investigation found that the app failed to deliver on Discord’s safety promises, including its Safe Direct Messaging feature, which purportedly scanned and deleted explicit media content automatically. “News accounts and reports from prosecutors’ offices illustrate that despite the app’s promises of child safety, predators use the app to stalk, contact, and victimize children,” Platkin’s office stated. The office highlighted cases in which adults, often posing as children, were charged with using Discord to solicit explicit images and engage in sextortion. Many of the victims were under the age of 13, despite Discord’s stated policy barring children in that age group from the platform.
Lawsuit claims child users ‘inundated with explicit content’
Among the various concerns raised, Platkin’s office noted that Discord offers features such as custom emojis, stickers, and soundboard effects designed to enhance engagement for younger users. However, these same features can be exploited by predators to form online “friendships” with underage users. Consequently, Platkin’s office asserted that “child users can be—and are—inundated with explicit content.”
Furthermore, the lawsuit criticized the app’s “Safe Direct Messaging” feature, claiming that it does not effectively detect or delete all explicit content as promised. “Simply put, Discord has promised parents safety while simultaneously making deliberate choices about its app’s design and default settings, including Safe Direct Messaging and age verification systems, that broke those promises,” Platkin’s office stated. “As a result of Discord’s decisions, thousands of users were misled into signing up, believing they or their children would be safe, when they were really anything but.”
The lawsuit seeks various remedies, including an injunction barring Discord from violating New Jersey’s Consumer Fraud Act, civil penalties, and the forfeiture of any profits generated within the state. Discord representatives did not respond to requests for comment following the announcement of the lawsuit.