The Australian Government’s eSafety office has taken a proactive stance in addressing the pressing concerns surrounding online safety for children. In a recent move, the agency has formally requested that major gaming platforms, including Roblox, Microsoft, Epic, and Valve, provide detailed accounts of their measures to prevent child grooming and the dissemination of extremist content. Established in 2015, the eSafety office has evolved from its initial focus on combating cyberbullying to a broader mandate aimed at safeguarding all Australians from various online threats.
Transparency Notices Issued
In light of ongoing concerns regarding platforms such as Roblox, Minecraft, Fortnite, and Steam, the eSafety office has issued legally enforceable transparency notices. These notices respond to alarming reports that the platforms may be exploited by sexual predators and extremist groups to groom children and spread radical ideologies. eSafety Commissioner Julie Inman Grant emphasized the gravity of the situation, noting that online gaming environments are frequently used by offenders to initiate contact with minors, who are often then moved onto private messaging services.
Inman Grant highlighted the significance of gaming platforms in the lives of Australian children, noting that approximately 90% of children aged 8 to 17 engage with online games. This statistic underscores the urgent need for robust protective measures, as predatory adults increasingly target these spaces to groom young users or embed extremist narratives within gameplay.
Disturbing Trends in Gameplay
Inman Grant referenced numerous media reports detailing incidents of grooming and the presence of extremist themes across the four platforms in question. Disturbing examples include games inspired by the Islamic State on Roblox, recreations of mass shootings, and far-right groups using Minecraft to propagate fascist imagery. Additionally, Fortnite has hosted gameplay centered on World War II concentration camps and events like the January 6 Capitol riot. The eSafety Commissioner also pointed out that Steam has become a hub for various extreme-right communities, raising concerns about the platform's content moderation practices.
“These online game and gaming-adjacent platforms are used by millions of children, making it imperative for them to implement every possible measure to ensure safety and enhance existing safeguards,” Inman Grant stated. The eSafety office has made it clear that compliance with the transparency reporting notice is mandatory, with potential penalties reaching AU$5,000 per day for non-compliance.
Roblox’s Commitment to Safety
In response to the eSafety office's concerns, Roblox has outlined several initiatives aimed at ensuring user safety. A spokesperson for the company expressed its willingness to engage with eSafety on this critical issue, stating, “Roblox has policies that strictly prohibit content or behavior that incites, condones, supports, glorifies, or promotes any terrorist or extremist organization or individual, which we work tirelessly to enforce.” The company has deployed AI technology to review images, text, and avatar items prior to publication, aiming to prevent extremist iconography from appearing on the platform.
Roblox also encourages users to report any concerning content, emphasizing their collaboration with law enforcement and civil society groups to counteract violent extremism. Furthermore, the company announced plans to introduce new age-based accounts for users under 16, which will better align content access, communication settings, and parental controls with the user’s age. “While no system is perfect, our commitment to safety never ends, and we will continue to collaborate closely with eSafety on our shared goal of keeping Australian children safe,” the spokesperson concluded.
Luke is a Senior Editor on the IGN reviews team. You can track him down on Bluesky @mrlukereilly to ask him things about stuff.