A coalition of parents and attorneys is preparing to file multiple lawsuits against Roblox, the gaming platform popular with children, in the wake of a recent case accusing the site of failing to adequately protect children from sexual exploitation. This is not the company's first legal battle, but it could be its most consequential.
Recently, Louisiana Attorney General Liz Murrill filed a suit against Roblox, claiming that the company "knowingly and intentionally" failed to implement sufficient safety protocols to protect young users from predatory conduct and child sexual abuse material (CSAM). In response, Roblox issued a statement disputing the allegations: "We allocate significant resources to fostering a safe environment, which includes advanced technology and continuous human oversight, to identify and prevent inappropriate content and actions — not only because it matters to us but also due to its importance as a critical issue for our community."
Dolman Law Group is bringing the first wave of suits on behalf of parents and their minor children; five complaints have already been filed. One, filed in the Northern District of California, alleges that the company's lax moderation, including its offering of potentially suggestive avatar customizations and its failure to catch usernames containing coded pedophilic language, allowed sexually exploitative games and predatory behavior to flourish on the platform.
Recent scrutiny of the site's safety practices has focused on the effectiveness of its new open-source AI moderation system, "Sentinel," which proactively monitors chats for possible indicators of child endangerment, such as grooming. According to Roblox, Sentinel detected roughly 1,200 potential attempts at child exploitation in the first half of 2025, all of which were reported to the National Center for Missing and Exploited Children (NCMEC).
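Roblox has not detailed Sentinel's internals here, so the sketch below is purely illustrative rather than a description of the actual system. It shows the general idea behind proactive chat screening, scoring a rolling window of messages instead of single messages in isolation, with an invented pattern list, scores, and threshold standing in for the trained models a production system would use.

```python
# Illustrative toy example of proactive chat screening. Everything here
# (pattern list, scores, threshold) is hypothetical and NOT Roblox's
# implementation; a real system would use trained models, not keywords.
from collections import deque
from dataclasses import dataclass

RISK_PATTERNS = {
    "how old are you": 0.4,
    "keep this a secret": 0.7,
    "what school do you go to": 0.5,
}

@dataclass
class Flag:
    user_id: str
    score: float
    evidence: list[str]

def score_message(text: str) -> tuple[float, list[str]]:
    """Return a naive risk score for one message plus the matched patterns."""
    lowered = text.lower()
    hits = [p for p in RISK_PATTERNS if p in lowered]
    return sum(RISK_PATTERNS[p] for p in hits), hits

def screen_chat(user_id: str, messages: list[str],
                window: int = 20, threshold: float = 1.0) -> Flag | None:
    """Scan a rolling window of a user's messages; flag the conversation
    for human review once cumulative risk in the window crosses the threshold."""
    recent: deque[tuple[float, list[str]]] = deque(maxlen=window)
    for msg in messages:
        recent.append(score_message(msg))
        total = sum(score for score, _ in recent)
        if total >= threshold:
            evidence = [hit for _, hits in recent for hit in hits]
            return Flag(user_id, round(total, 2), evidence)
    return None

# Demo: two weak signals combine into one flagged conversation.
flag = screen_chat("user123", ["hey!", "how old are you?", "keep this a secret ok"])
if flag:
    print(f"escalate for human review: {flag}")
```

The windowing is the point of the sketch: individually innocuous messages accumulate into a reviewable signal, which is why a system like this flags conversations for human escalation rather than auto-banning on a single match.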
A Dolman Law Group representative told Wired that the firm is examining roughly 300 additional claims of sexual exploitation brought to its attention, and a collaborating group of seven law firms is reportedly vetting hundreds more. Most of the complaints under review by Dolman Law Group involve children under 16, many of them young girls. Other law firms are also said to be investigating the messaging platform Discord.
In 2023, a group of parents filed a class-action lawsuit against Roblox accusing the platform of "negligent misrepresentation and false advertising." The complaint targeted Roblox's claim that the child-oriented platform was safe for young users, with plaintiffs alleging inadequate filtering and moderation policies. Other lawsuits have taken aim at Roblox's virtual currency, Robux, likening its use to "illegal child gambling."
"Safety is of utmost importance to us, and any suggestion that Roblox would intentionally jeopardize our users to exploitation is entirely false," the company said in an updated statement to Mashable. "We can confirm that many of the games referenced in the lawsuit breached our policies and were previously flagged and removed. We have instituted over 50 stringent safeguards, including facial age estimation, Trusted Connections, and enhanced technology and 24/7 human moderation. Our work is ongoing, and we will continue our efforts to thwart bad actors who attempt to bypass our systems. We share Attorney General Murrill's commitment to taking measures that help ensure children are safe online and collaborating with families in Louisiana and across the globe."
Following roughly a dozen other lawsuits, Roblox has begun rolling out a series of strengthened safety measures, including parental controls, restrictions on in-game chat, and age verification for teenage users.
UPDATE: Aug. 18, 2025, 2:49 p.m. This report was updated with a statement from Roblox.