
BBC News

Roblox Age Verification Errors Raise New Safety Concerns

10 min listen · BBC News

Roblox’s new age-checking system aims to boost child safety, but technical errors are sparking concerns about the effectiveness of these controls.

Transcript
AI-generated. Lightly edited for clarity.


HOST

From DailyListen, I'm Alex. Today: Roblox is under fire for its new, mandatory age-check system. While the company calls it a "safety gold standard," many parents and developers are reporting serious errors that could actually leave kids less protected. To help us understand, we're joined by Catherine, our legal analyst.

CATHERINE

Thanks for having me, Alex. It’s a complex situation. Essentially, Roblox is trying to move away from self-reported ages—which are notoriously unreliable—to a more automated, AI-driven model. They’re using facial age estimation technology to scan users and place them into age bands. The goal is to limit interactions between adults and children, which is a major regulatory concern globally. As of late January, about 45% of their 150 million daily active users had gone through some form of verification. They’re essentially forcing this because, under their new rules, you can't access chat features without passing an age check. They frame this as a necessary step to secure the platform, especially with the introduction of new account types like "Roblox Kids" and "Roblox Select." However, the transition has been anything but smooth, and the friction between the company’s stated safety goals and the user experience on the ground is where the real conflict lies right now.

HOST

Wow, that’s a massive shift for a platform that’s been around so long. So basically, if you want to talk to friends, you have to let their AI guess your age by scanning your face? But I’m curious, how does this technology actually work, and why are parents reporting so many errors?

CATHERINE

That’s the core of the frustration. The technology relies on AI models that analyze facial features to estimate a user’s age. It’s not just a simple photo upload; it’s a sophisticated process intended to prevent people from lying about being older than they are. But AI, as we know, isn't perfect. When the system misclassifies a user, the consequences are immediate. If a child is incorrectly categorized as an adult, they might lose the protective guardrails meant for younger accounts. Conversely, if an adult is tagged as a child, they’re locked out of features they should have access to. Roblox has stated that they have continuous background systems running to catch these inconsistencies, and they do have an appeals process. Parents can reset checks or submit corrections. But for a parent whose child is suddenly put into an "adult" bucket, the damage is already done, and the burden of fixing that error falls squarely on the family, not the company.

HOST

That sounds incredibly frustrating for families just trying to play a game safely. It seems like the tech might be creating more problems than it solves. If this is supposed to be the "gold standard," why are so many developers and parents calling for a rollback instead of praising it?

CATHERINE

The pushback is intense because, for many, this feels like a blunt instrument being used to solve a nuanced problem. Thousands of developers have voiced concerns on Roblox’s own forums, arguing that the system is easily bypassed or simply doesn’t work as advertised. Beyond the technical errors, there’s a deeper, more philosophical argument happening. Critics, including legal experts like Alexandra Walsh from the firm Anapol Weiss, argue that these incremental, AI-based measures are just a distraction. They contend that Roblox has spent years prioritizing growth and engagement over meaningful safety design. From their perspective, no amount of facial scanning can fix fundamental issues if the platform’s core design allows for easy grooming or interaction with predators. They see this move as reactionary—a way to satisfy regulators while failing to address the structural risks that have been baked into the platform’s architecture for years. It’s a clash between a company wanting to use tech to scale safety and critics demanding a total design overhaul.


HOST

That really changes the picture. It’s not just about a buggy facial scan; it’s about whether the company is actually fixing the roots of the problem. If they’re under this much pressure, what’s their defense? Are they just ignoring the complaints, or are they trying to fix the actual system?

CATHERINE

Roblox is definitely in defensive mode. Company representatives, like chief safety officer Matt Kaufman, have acknowledged that with 150 million daily users, it’s a difficult process and they are focused on improving the verification system. They point to the fact that over half of their daily users in countries like Australia, New Zealand, and the Netherlands have already completed the check, suggesting that the system is functional for the majority. They’re also emphasizing that this data is crucial for creating a safer, age-appropriate environment. However, there’s another layer here. During recent earnings calls, CEO David Baszucki noted that the data collected from these checks will also be used to grow revenue. This is where the skepticism intensifies. When a company claims a feature is for "safety," but also admits it helps with business growth, it’s bound to raise eyebrows. Parents aren't just worried about AI errors; they’re worried about how their children’s biometric and age-related data is being used by a massive, profit-driven corporation.

HOST

I see. So it’s a mix of safety, data collection, and profit goals, which makes it hard to trust the company’s intentions. And you mentioned that this is part of a larger, global trend. Are other tech firms facing this same kind of pressure, or is Roblox the main target right now?

CATHERINE

Roblox is certainly a focal point, but they’re part of a much wider trend. Tech firms globally are facing mounting pressure from regulators to implement stricter age verification. The goal is to keep children away from adult content and, more importantly, to prevent adults from grooming minors. The problem is that there’s no universally accepted, perfect way to do this online. Some platforms rely on ID uploads, others on credit card verification, and now Roblox is pushing facial estimation. Each method has its own set of privacy and accuracy trade-offs. The BBC has highlighted that this is really about the growing demand for tech platforms to take responsibility for their environments. But because Roblox is a social gaming company—where interaction is the product—the stakes feel higher. If you mess up age verification on a social media feed, it’s bad. If you mess it up in a virtual space where kids are actively chatting and playing, the potential for harm is immediate and direct.

HOST

It sounds like we’re in this weird middle ground where the tech is too young to be reliable, but the pressure to be safe is too high to wait. So, what’s the bottom line for a parent today? Should they just trust the system, or is there a way to opt out of this?

CATHERINE

That’s the difficult reality. Right now, if you want to use the chat features, you’re effectively forced into this system. There isn’t a simple "opt-out" that keeps all the functionality intact. For parents, the best advice is to stay incredibly vigilant. Don't assume the "verified" status is 100% accurate. If you notice your child’s account has been misclassified—perhaps they’re seeing content they shouldn't, or they’re being treated like an adult—you have to use the appeal processes. It’s a manual, tedious task, but it’s currently the only way to correct these errors. The "safety gold standard" is a marketing term, not a guarantee of perfection. You have to treat the platform’s own safety tools as one layer of protection, not the entire solution. Until the AI matures and the company addresses these structural concerns, the primary responsibility for navigating these risks still falls on the parents.


HOST

That’s a sobering takeaway. It really feels like we’re being asked to trade our privacy and handle these technical glitches just to keep our kids safe on a platform they love. What does the future look like? Is this just the new normal for online gaming?

CATHERINE

We’re definitely moving toward a more gated internet. The days of anonymous, open-access gaming are rapidly disappearing, especially for younger users. In the near term, expect Roblox to continue refining its AI models. They’ll likely collect more data, not less, to improve the accuracy of their facial estimation. You’ll also see more integration between these age-check systems and parental control dashboards. The company is betting that if they can make the system "good enough" for the vast majority of users, the noise from the errors will eventually die down. But the legal pressure isn't going away. Law firms like Anapol Weiss are continuing to push for deeper, structural changes, and if the lawsuits continue to gain traction, they might force Roblox to move beyond just age-checking. We could see mandates for better moderation, changes to game design, or even stricter limits on how users interact, regardless of their age. It’s a high-stakes experiment in balancing safety, growth, and user privacy.

HOST

That was Catherine, our legal analyst. The big takeaway here is that while Roblox is pushing its new AI-based age verification as a "safety gold standard," the reality is much messier. The system is prone to errors, which can leave children in less-protected, adult-level accounts. And because the company has linked these checks to revenue growth, parents are rightly skeptical about whether this is truly for the kids or just for the business. It’s a reminder that tech solutions are rarely a silver bullet. We’re still in a world where parental oversight is the only real line of defense. I'm Alex. Thanks for listening to DailyListen.

Sources

  1. Here's how Roblox's age checks work | TechCrunch
  2. [February 6, 2026] An Update on Our Age Check to Chat Fast-Follow Roadmap - Page 11 - Announcements - Developer Forum | Roblox
  3. Roblox defends expanded safety checks but parents are concerned
  4. Roblox Requires Users Worldwide to Age-Check to Access Chat
  5. Roblox’s AI Age Verification Fails to Protect Children, Says Anapol Weiss
  6. Roblox Age-Verification Faces Widespread Criticism 01/16/2026
  7. Roblox Introduces New Age-Based Accounts and Expanded Parental Controls for Users Under 16 | Roblox
  8. Roblox defends expanded age‑checks after parents raise concerns over errors
  9. Roblox CEO faces backlash after comments on child safety and online predators - GamesBeat

Original Article

Roblox defends expanded age‑checks after parents raise concerns over errors

BBC News · April 13, 2026
