Steam, Fortnite, Minecraft and Roblox are now the focus of a transparency push from Australia’s eSafety commissioner, who has sent notices to Valve, Epic Games, Microsoft and the Roblox Corporation. The request asks what each company is doing to keep kids safe on those platforms, and the Australian government says the stakes are high: without action, they risk “becoming onramps to abuse, extremist violence, radicalisation or lifelong harm.” For players and parents, this is more than a policy slap on the wrist. It could shape how these services moderate chats, age-gate access, and police the kinds of user-made content that keep drawing headlines.

The notices aren't tied to any announced new rules or deadline, but the companies are already under pressure to explain their current safeguards. Julie Inman Grant, Australia's eSafety commissioner and a former global director of privacy and internet safety at Microsoft, said predatory adults "target children through grooming or embedding terrorist and violent extremist narratives in gameplay." That matters because the allegations cover four of the biggest social gaming platforms at once, which means this isn't about one bad corner of one game. It's about whether the systems around these games can keep up with the way people actually use them.

About Steam, Fortnite, Minecraft and Roblox

The Australian government's notices went to Valve, Epic Games, Microsoft and the Roblox Corporation, with the eSafety commissioner asking for information about Steam, Fortnite, Minecraft and Roblox. All four are available on PC, and Fortnite, Minecraft and Roblox also run on consoles and mobile devices. That broad reach is why the story matters so much: these are not niche services, but big social spaces where voice chat, user content and public lobbies can make moderation feel like a constant firefight.

Julie Inman Grant said the government's concern came after "numerous media reports about grooming taking place on all four of these platforms as well as terrorist and violent extremist-themed gameplay." She pointed to Islamic State-inspired games and recreations of mass shootings on Roblox, far-right groups recreating fascist imagery in Minecraft, and reports that Fortnite hosted games gamifying the WWII Jasenovac concentration camp and the January 6 riot at the US Capitol. Inman Grant also said Steam is reportedly a hub for a number of extreme-right communities. That combination of claims explains why the government wants transparency instead of vague promises.

What Roblox and Epic Say They’re Already Doing

Roblox was quick to say it welcomes the regulator's attention. A Roblox spokesperson said, "We welcome engagement with eSafety on this important topic," and added that Roblox "strictly prohibit[s] content or behaviour that incites, condones, supports, glorifies, or promotes any terrorist or extremist organisation or individual." The company said it removes such content quickly and takes immediate account-level action when it finds it. In practical terms, that means Roblox is leaning on proactive moderation rather than waiting for user reports to pile up after the fact.

The company also said it uses “advanced AI technology” to review all images, text and avatar items before publishing, which should help stop obvious extremist symbols from going live. Roblox said it will soon introduce new age-based accounts for children under 16, and those accounts will more closely align content access, communication settings and parental controls with a user’s age. It also said it has moved to limit access to social hangouts and unrated games for children under 13, and that it has brought in selfie-based facial age estimation technology in recent years. That’s a sensible direction, because age gating only works when the platform can tell who’s actually using the account.

Epic’s Fortnite Safety Tools Under the Microscope

Epic Games, through senior communications manager Cat McCormack, said its rules "prohibit extremism, child endangerment, dangerous or illegal activities and threats of real world violence." McCormack added that the specific Fortnite islands named in the Australian government's press release had action taken against them in 2024. That's a decent sign that Epic can move when it spots a problem, but it also raises the obvious question: why did those islands need naming in the first place? Moderation of user-made content lives and dies on speed, and safety claims only carry weight when takedowns arrive before the damage spreads.

McCormack also said, “Epic’s text chat filters remove mature language including hate speech, and our systems automatically report potentially high-harm interactions in text chat with players under 18 so we can take action.” In plain terms, that means Fortnite’s systems try to catch toxic or risky conversations before they spiral. She added that “Fortnite has built-in protections for younger players including high-privacy default settings for players under 18 and voice and text chat are off for players under 16 until a parent consents.” Parents can also use Epic’s Parental Controls to customise who their child can communicate with, which gives families more control than a lot of online games bother to offer.

Key Takeaways

  • Australia’s eSafety commissioner issued transparency notices to Valve, Epic Games, Microsoft and the Roblox Corporation.
  • The notices ask what steps are being taken to keep kids safe on Steam, Fortnite, Minecraft and Roblox.
  • Julie Inman Grant said the platforms risk “becoming onramps to abuse, extremist violence, radicalisation or lifelong harm.”
  • Roblox said it uses advanced AI to review images, text and avatar items before publishing and will soon introduce new age-based accounts for children under 16.
  • Epic said Fortnite has high-privacy default settings for players under 18, with voice and text chat off for players under 16 until a parent consents.

What This Means for Players

This feels like a smart and overdue pressure test. Steam, Fortnite, Minecraft and Roblox all rely on player creativity and social interaction, which also gives bad actors room to hide in plain sight if moderation slips. The Australian government is asking the right question here: not whether these platforms have policies, but whether those policies actually protect children when grooming, extremist material and abuse start circulating inside real communities. That’s the part companies usually avoid talking about unless a regulator drags them into the light.

Roblox and Epic both sound prepared to make the case that they already have systems in place, and some of those systems are clearly more specific than the usual corporate safety boilerplate. Still, the government’s framing suggests those measures haven’t convinced everyone yet. If the companies respond with better age checks, tighter chat controls and faster takedowns, players could see safer spaces without losing the parts of these platforms that make them popular. If they don’t, expect this issue to keep coming back every time another ugly user-made island or community surfaces.

For now, the next thing to watch is what Valve, Epic Games, Microsoft and the Roblox Corporation tell eSafety in response to the notices. The real test will be whether those replies read like genuine fixes or just polished paperwork. Either way, this is no longer a background moderation issue. Australia has put Steam, Fortnite, Minecraft and Roblox on the record, and the companies now have to answer with more than good intentions.