Aussie Government Puts Gaming Giants on Notice Over Child Safety Concerns
The Australian Government’s eSafety office has officially reached out to major gaming companies, including Roblox Corporation, Microsoft, Epic Games, and Valve, demanding a detailed account of how they are tackling the issues of child grooming and the spread of extremist content. This independent agency was first set up in 2015 to combat cyberbullying and the distribution of explicit material targeting minors, but its responsibilities have since grown to encompass a wider range of online safety concerns for all Australians.
In a recent announcement, the eSafety office issued legally binding transparency notices to these companies, citing ongoing worries that platforms like Roblox, Minecraft, Fortnite, and Steam are being misused by predators to engage with children and by extremist groups to disseminate violent ideologies and radicalize young players.
eSafety Commissioner Julie Inman Grant emphasized that many offenders begin their interactions with children in these online gaming spaces before shifting the conversation to private messaging platforms. She pointed out, “Gaming platforms are among the most popular online environments for Australian kids, serving not just as places for fun but also as social hubs.” According to the agency’s research, some 90% of Australian children aged 8 to 17 have engaged with online games.
Inman Grant underscored the alarming reality that predatory adults are well aware of this popularity and are exploiting these platforms to groom children or introduce extremist ideologies through gameplay. She referenced various reports detailing grooming incidents across these platforms, as well as the presence of violent and extremist-themed games.
Examples she mentioned included games inspired by the Islamic State on Roblox, recreations of mass shootings, and extreme right-wing themes in titles on Minecraft. Even Fortnite has seen scenarios based on historical tragedies, such as concentration camps from World War II and the January 6 Capitol riots. Steam, too, has faced scrutiny for being a gathering ground for far-right communities, with the platform previously criticized for hosting numerous groups that support hate-based content.
With millions of children using these gaming platforms, Inman Grant stressed the necessity for these companies to implement stronger protective measures. She noted that compliance with the eSafety office’s transparency notices is mandatory, with penalties of up to AUD$825,000 per day for non-compliance.
In response, Roblox detailed the proactive measures they have in place to combat these issues. A spokesperson stated, “We appreciate the chance to work with eSafety on this critical matter. Our policies strictly ban any content that promotes or glorifies terrorism or extremism, and we take immediate action when such instances arise. We employ advanced AI technology to screen images, text, and avatar items before they go live to prevent any extremist content from being published.” They also mentioned the introduction of new age-based accounts for users under 16, which will provide enhanced content access controls and parental settings to better protect younger players.
The company acknowledged that no system can ever be completely foolproof, but said it remains committed to the safety of its users and will continue to collaborate with eSafety toward the shared goal of safeguarding Australian children.