
Discord has rolled out new safety features designed to help parents monitor their teens' online interactions, the chat platform announced Wednesday. The Family Center now gives parents or guardians insight into the top five users a teen has messaged or called, the servers they engage with most, total voice and video call time, and all purchases made. The data covers only the preceding week and resets if not viewed; to review older activity, parents must consult earlier email summaries.
Other safety features include an option for teens to notify their parents when they report inappropriate content, as well as settings guardians can enable, such as sensitive-content filters and direct-message controls. Parents can choose whether their teen is allowed to receive direct messages from friends or from other server members.
Savannah Badalich, Discord’s global head of product policy, said the features reflect feedback from parents and from organizations such as the National Parent Teacher Association and the Digital Wellness Lab at Boston Children’s Hospital. “They sought greater visibility and increased control,” Badalich said.
The new features will roll out over the coming week, with additional safety updates expected early next year. Discord is currently rolling out age verification in the UK and testing it in Australia. The platform is limited to users 13 and older, though many younger teens bypass the restriction by lying about their age.
Discord is popular with gamers who want to chat while they play. But the platform has drawn scrutiny as a place where predators can target children. A lawsuit filed this year against Discord and Roblox claims the companies created a “breeding ground for predators.” The complaint involves an unnamed 11-year-old girl who was allegedly groomed and exploited by a perpetrator using Roblox and Discord.
Dolman Law Group, the firm that filed the lawsuit, has named Discord in several complaints, including one submitted on Oct. 30 on behalf of a parent whose child died by suicide after being harassed by a predator through both Roblox and Discord. The lawsuit alleges that the companies misrepresented and concealed critical information about predatory behavior on their platforms.
Roblox recently introduced age verification for teens along with additional safety measures to protect children. While Badalich declined to comment on the lawsuit, she said Discord takes a “holistic view” of teen safety, actively identifying and flagging potentially harmful content and accounts. Discord’s policy prohibits any drawn or synthetic material depicting child sexual abuse.
Badalich acknowledged the tension between giving teens privacy and giving parents oversight tools for safety. “Ultimately, our goal is to encourage discussions between teens and parents,” she said. Resources in the Family Center are designed to help facilitate conversations about online safety.