Instagram Teen Accounts: Merely Public Relations?


On the day Meta revealed its newest AI devices, Brandy Roberts stood outside its headquarters grieving for her daughter Englyn, who was only 14 when she died after viewing a “how-to” suicide video on Instagram. Brandy was not there as an activist but as a mourning mother seeking answers. Inside, Mark Zuckerberg stumbled through live demonstrations of glitchy smart glasses and AI features. Outside, grieving families called for accountability. Meta’s silence spoke volumes: growth over grief, products over protection, appearances over safety.

Meta’s shortcomings are not new. In 2019, around 440,000 minors received follower requests from accounts previously flagged for predatory conduct. Since then, the company has tried to persuade users and lawmakers that it can self-regulate, launching marketing initiatives like Instagram Teen Accounts despite mounting evidence to the contrary.

Instagram Teen Accounts were promoted as a leap forward in youth safety, featuring AI age verification, nudity filters, and location alerts. Yet independent evaluations found that only 8 of 47 safety tools were effective. Teens continued to encounter sexualized material, self-harm imagery, and predatory behavior. Meta’s updates appear more focused on perception than on actual protection.

Recent reporting from Heat Initiative, ParentsTogether Action, and Design It For Us exposes the troubling reality of the teen experience on Instagram Teen Accounts. A survey of 800 users aged 13–15 found that nearly half had encountered unsafe content or unwanted messages in just the past month. Half reported that Instagram’s algorithm suggested suspicious adult-run accounts, and 65 percent had never received a single “take a break” notification, a feature Meta advertises as a safeguard for screen time. These findings point to a consistent pattern: Meta’s promotional messages promise security, but the real experiences of young users reveal continual exposure to danger. The gap between marketing and reality is not just deceptive; it is dangerous.

Now, Heat Initiative and ParentsTogether Action have published a video illustrating the type of content teens receive on Instagram Teen accounts — content so inappropriate that even using it for advocacy raises ethical concerns. Watching these clips, I experienced the same unease that Meta should feel each time its algorithm exposes millions of young users to similar materials. If it feels wrong to present these videos to adults for advocacy, why does Meta deem it acceptable to distribute them to children on a large scale?

Companies like Meta will keep exploiting their users, particularly children, until we, the users, reclaim our power and demand a better online world: one that prioritizes connection and societal benefit over profit.

I left Instagram after watching parents like Brandy protest in NYC. It wasn’t an easy choice; most of my friends chose to stay. But every month, I’m reminded that I can choose platforms that respect me rather than exploit me. I do this for my younger self, for future generations, for the survivors I cherish, and for the children who are no longer here.

So the next time you see organizations or influencers partnering with Meta, ask yourself: Is this about safety or about appearances? Just yesterday, Meta asserted that its Instagram Teen experience would now be “guided by PG‑13 movie ratings.” The Motion Picture Association promptly clarified that it was never consulted and called Meta’s statement “inaccurate.” Once again, Meta borrowed credibility it has not earned, using trusted labels to disguise ongoing harm. When public relations become the product and partnerships serve as shields, we owe it to ourselves, and to our children, to look closely.

This article reflects the opinion of the writer.

Lennon Torres is a Public Voices Fellow on Prevention of Child Sexual Abuse with The OpEd Project. She is an LGBTQ+ advocate who grew up in the spotlight, gaining national attention as a young dancer on television shows. With a strong passion for storytelling, advocacy, and politics, Lennon currently focuses on centering the lived experiences of herself and others as she shapes her professional trajectory in online child safety at Heat Initiative. The views expressed in this piece are those of Lennon Torres as an individual and do not represent the entities she is affiliated with. Lennon’s Substack: https://substack.com/@lennontorres1