# Australian Government Warns That Steam, Minecraft, Roblox, and Fortnite May Lead to Abuse, Extremist Violence, Radicalization, and Long-Term Harm

### Australian eSafety Commissioner Issues Transparency Notices to Major Gaming Companies

The Australian government's eSafety Commissioner has issued transparency notices to Valve, Epic Games, Microsoft, and Roblox Corporation, requiring them to disclose what measures they take to protect children on Steam, Fortnite, Minecraft, and Roblox. The notices respond to concerns that these platforms can become pathways to abuse, extremist violence, radicalization, or long-term harm.

#### Background and Context

In a press release, Julie Inman Grant, the eSafety commissioner and former global director of privacy and internet safety at Microsoft, highlighted alarming trends in online gaming. She noted that predatory adults target children through methods such as grooming and embedding extremist narratives within gameplay. The commissioner pointed out that there have been multiple media reports of grooming activities and violent portrayals across these platforms, including:

– **Roblox**: Instances of Islamic State-inspired games and recreations of mass shootings.
– **Minecraft**: Far-right groups using the platform to recreate fascist imagery.
– **Fortnite**: User-made games that gamify atrocities such as the Jasenovac concentration camp and the January 6th riot at the Capitol Building.
– **Steam**: Noted as a hub for extreme-right communities.

Such content raises questions about how much responsibility these companies bear for shielding young users from harmful influences.

#### Government Response

Through the transparency notices, the Australian government is pressing these platforms to show they have effective measures against grooming, extremism, and other forms of online abuse. The eSafety Commissioner stressed the need for meaningful action that protects children and keeps the platforms from being used as vehicles for harm.

#### Actions Taken by Companies

In light of previous criticisms regarding child safety, several of these gaming companies have already initiated measures to address these issues:

– **Roblox Corporation** has restricted access to social hangouts and unrated games for users under 13, and has introduced selfie-based “facial age estimation technology” to estimate users' ages and better protect younger players.
– The other recipients maintain safety measures of varying scope, which the transparency notices are intended to put on record.

#### Future Considerations

While the existing actions are a step in the right direction, the Australian government is seeking further detail: under the transparency notices, the companies are expected to provide a comprehensive overview of their strategies for combating grooming and extremism.

The emphasis is on transparency, with the companies expected to account for both past and planned safety initiatives. That reporting is central to addressing the risks children face in online gaming and to fostering a safer experience for all users.

The Australian government's intervention underscores the need for stronger oversight and safety protocols in the rapidly evolving landscape of online gaming, so that these platforms do not inadvertently facilitate harmful behavior.