Roblox announces new age-related safety measures
SAN MATEO, Calif. - Roblox on Thursday announced new safety measures that will require users to verify their age before accessing text or voice chat on the platform, a move the company says is aimed at better protecting its youngest players.
The San Mateo–based gaming platform, which has more than 150 million users worldwide and a large audience of children, said age checks will rely in part on artificial intelligence technology capable of estimating a user’s age from a photo or video of their face. Young users will then be placed into “age bins” that limit whom they can communicate with.
"When we think about age checking and sort of who can talk to who, we think of it really in sort of elementary school buckets," said Eliza Jacobs, Roblox’s senior director of product policy. "Middle schoolers can talk to middle schoolers, high schoolers can talk to high schoolers, but not outside of those buckets."
Jacobs said keeping children safe online requires a broader, industrywide effort.
"Safety is an industry-wide problem," she said. "Kids are not just on one app, they’re on all of the apps. And so we can make Roblox as safe as we can, but to really keep kids safe, we need the whole industry to get behind this."
Roblox is facing lawsuits in several states from families who say the company has failed to protect children from predators who use the platform’s chat features.
The company said the new verification requirements for chat will take effect next month.