Roblox Upgrades Parental Controls with Enhanced Safety and Monitoring Features
Roblox is introducing new features designed to give parents greater oversight of their children's activity on the platform. As part of an ongoing safety initiative, the company has expanded its parental controls to offer more granular blocking and monitoring capabilities.
Previously, parents could only restrict access to entire content categories based on age ratings. Now, they can block or report specific friends and individual experiences on their child's account. Children under 13 can request to unblock a friend or experience, but any changes require parental approval.
In addition, Roblox will send parents weekly updates on their child's activity, including a summary of the 20 most-visited experiences. Parents can explore additional safety tools and educational materials through Roblox's redesigned Safety Center.
Roblox has also released an updated version of its open-source voice safety classifier, a large-scale model that detects and flags harmful voice interactions. The tool moderates millions of minutes of voice communication daily and is part of the broader ROOST initiative, a collaborative child safety program backed by Roblox, Google, OpenAI, Discord, and other tech industry leaders.
“Empowering parents to block specific friends and experiences, alongside providing detailed activity reports, allows them to enhance their children’s safe engagement on Roblox,” commented Larry Magid, CEO of ConnectSafely, a nonprofit dedicated to online safety that collaborates with Roblox. “Safety, fun, and adventure can coexist.”
These updates follow a set of safety improvements announced in November, which included ID-verified parent accounts, screen-time limits, and messaging restrictions for users under 13. Roblox has also barred younger users from directly messaging others or chatting during gameplay.
The platform has faced growing scrutiny from parents and child safety advocates over inappropriate content and online dangers. In 2023, a group of parents filed a lawsuit against Roblox, alleging negligent misrepresentation and false advertising about the platform's safety for children.
In response, Roblox has pledged to strengthen its safety framework. According to Chief Safety Officer Matt Kaufman, the company has rolled out more than 40 safety updates in 2024 alone.
If you or someone you know encounters or witnesses child exploitation online, you can report it to the CyberTipline, run by the National Center for Missing & Exploited Children. Visit report.cybertip.org for more information.