Steam, Fortnite, Minecraft and Roblox are now the focus of a transparency push from Australia’s eSafety commissioner, who has sent notices to Valve, Epic Games, Microsoft and the Roblox Corporation. The request asks what each company is doing to keep kids safe on those platforms, and the Australian government says the stakes are high: without action, they risk “becoming onramps to abuse, extremist violence, radicalisation or lifelong harm.” For players and parents, this is more than a policy slap on the wrist. It could shape how these services moderate chats, age-gate access, and police the kinds of user-made content that keep drawing headlines.

The notices are requests for information rather than new rules, but they put the companies under pressure to explain their current safeguards. Julie Inman Grant, Australia’s eSafety commissioner and a former global director of privacy and internet safety at Microsoft, said predatory adults “target children through grooming or embedding terrorist and violent extremist narratives in gameplay.” Because the concerns span four of the biggest social gaming platforms at once, this isn’t about one bad corner of one game. It’s about whether the systems around these games can keep up with the way people actually use them.

About Steam, Fortnite, Minecraft and Roblox

The Australian government’s notices went to Valve, Epic Games, Microsoft and the Roblox Corporation, with the eSafety commissioner asking for information about Steam, Fortnite, Minecraft and Roblox respectively. That breadth is why the story matters: these are not niche services but some of the biggest social spaces in gaming, where voice chat, user-made content and public lobbies can make moderation feel like a constant firefight.

Julie Inman Grant said the government’s concern came after “numerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay.” She pointed to Islamic State-inspired games and recreations of mass shootings on Roblox, far-right groups recreating fascist imagery in Minecraft, and reports that Fortnite hosted games gamifying the WWII Jasenovac concentration camp and the January 6 US Capitol riot. Inman Grant also said Steam is reportedly a hub for a number of extreme-right communities. That combination of claims explains why the government wants transparency rather than vague promises.

What Roblox Says It’s Already Doing

Roblox was quick to say it welcomes the regulator’s attention. A Roblox spokesperson said, “We welcome engagement with eSafety on this important topic,” and added that Roblox “strictly prohibit[s] content or behaviour that incites, condones, supports, glorifies, or promotes any terrorist or extremist organisation or individual.” The company said it removes such content quickly and takes immediate account-level action when it finds it. In practical terms, that means Roblox is leaning hard on moderation rather than waiting for reports to pile up after the fact.