Lawsuit Claims Roblox Enables Widespread Sexual Exploitation of Minors


Roblox is once again drawing fire from online child safety advocates as it confronts a lawsuit alleging the platform prioritizes profits over protecting children. The suit, filed by Louisiana Attorney General Liz Murrill, asserts that Roblox has “knowingly and intentionally” failed to enforce “basic safety controls,” exposing young users to predators and child sexual abuse material. Murrill also contends that the company does not adequately inform parents of the risks their children face on Roblox.

In a series of tweets, Murrill claimed that the platform was “perpetuating violence against children and sexual exploitation for profit,” calling many of the site’s user-created gaming environments “obscene garbage.” She posted images of what she said were publicly accessible experiences on the platform, including “Escape to Epstein Island” and “Public Showers.” Similar lawsuits have been filed against other major social media platforms, including Meta, TikTok, and Snapchat, amid growing concern over youth safety and mental health online.

Roblox has been working to repair its reputation after reports that a network of predatory adult users made the site dangerous for young children. In 2023, parents filed a class action lawsuit against the platform, accusing the company of falsely marketing the site as safe for minors.

Since then, Roblox has rolled out a range of new safety features, including expanded blocking tools, parental controls, and messaging restrictions. The platform also recently introduced selfie-based age verification for teenage users. In her lawsuit, Murrill argues that the absence of age verification policies enables predators to engage with children on the site. Earlier this year, Roblox joined other social media companies in endorsing the recently enacted Take It Down Act, which criminalizes the distribution of non-consensual intimate images, including deepfakes, and requires platforms to remove them.