Australia’s eSafety regulator has taken a significant step in safeguarding children in the digital realm by requesting gaming companies, including industry giants like Microsoft and Roblox, to clarify their measures against sexual exploitation and radicalisation. The eSafety Commissioner’s office has issued legally enforceable transparency notices to several prominent gaming platforms, such as Roblox, Minecraft, Epic Games’ Fortnite, and Valve’s Steam, demanding comprehensive details regarding their safety systems, staffing, and cybersecurity protocols. This initiative was announced on Wednesday, April 22.
Julie Inman Grant, the eSafety Commissioner, emphasized the pivotal role gaming platforms play in the social lives of children, revealing that a staggering nine out of ten Australians aged eight to seventeen engage in online gaming. However, she also highlighted the inherent risks associated with these platforms.
“What we often see after these offenders make contact with children in online game environments is that they then move children to private messaging services,” Inman Grant stated. This transition to encrypted messaging can serve as a gateway for offenders involved in grooming, sexual extortion, and radicalisation.
“Predatory adults are aware of this dynamic and specifically target children through grooming or by embedding terrorist and violent extremist narratives within gameplay, thereby escalating the risks of contact offending, radicalisation, and other off-platform harms,” she added.
Gaming companies are now under pressure to comply with these notices, with non-compliance potentially leading to penalties and civil action. Microsoft has acknowledged the notice and is currently reviewing it, reiterating its commitment to children’s online safety. A spokesperson conveyed via email, “We continue to evolve our approach to meet the evolving threat and regulatory landscape.” Meanwhile, Roblox has yet to respond to inquiries regarding the matter.
This regulatory action comes at a time of heightened scrutiny surrounding how gaming platforms identify and mitigate threats to minors, particularly as real-time user interactions present challenges that differ from traditional social media moderation.
Roblox Lawsuits
Roblox, one of the companies under the Australian regulator’s microscope, is currently navigating a series of legal challenges. The company recently reached settlements in the U.S. states of Alabama and West Virginia, agreeing to pay more than a million dollars over allegations that it failed to protect children, and committing to changes to its chat and gaming features.
In addition to these settlements, Roblox faces more than 140 lawsuits in U.S. federal courts, accusing it of knowingly facilitating child sexual exploitation. In light of these ongoing legal issues, Roblox announced plans to introduce tailored accounts for younger users starting in June, categorizing children aged five to eight under “Roblox Kids” and those aged nine to fifteen under “Roblox Select.”