Apple — along with app developers — holds a duty to shield children from inappropriate content.

**The Risks Faced by LGBTQ+ Youth When Navigating Online Sexual Exploration and the Accountability of Major Tech Companies**

At 15, I covertly downloaded Grindr, set up a profile under a pseudonym, and posted a picture of my bare chest. At first I was apprehensive, but my curiosity grew as I scrolled past profiles with names like “looking for young” and “discreet hookups only.” Seeing LGBTQ+ sexuality displayed so openly was both comforting and eye-opening—*Oh, there are gay people everywhere.*

For several months, I deleted and redownloaded the app, unsure of what I wanted. One evening, after a long day of school, dance, and homework, I heard that unmistakable *Grindr notification sound*. A man I had chatted with the night before was online again, urging me to meet him. Despite my doubts, his “now or never” messages convinced me to slip out through my bedroom window.

My parents were understanding and open-minded, raising my younger sister and me with a healthy mix of autonomy and accountability. Many parenting experts endorse this approach, arguing that overly controlling “helicopter parenting” can push children to rebel. With that in mind, I believe the desire to explore one’s sexuality during adolescence is entirely natural.

### What Drives LGBTQ+ Teens to Seek Out Online Exploration?

Unlike their heterosexual peers, whose sexual exploration is often encouraged and even celebrated in movies, TV shows, and everyday life, LGBTQ+ teens rarely have safe spaces to do the same. As of 2023, one in four U.S. high school students identified as LGBTQ+, according to the CDC. Yet many of those students face hostile school environments, which makes open exploration even harder.

Feeling alone and outnumbered, LGBTQ+ teens frequently seek connection online—just like I did when I ventured out that night. Regrettably, this makes them three times more likely to experience unwanted or unsafe interactions in the digital realm.

That evening, as the man’s truck turned onto my street, anxiety set in. I crouched behind a bush, deleted the app, and stayed off Grindr until college.

### Who Holds the Responsibility for Safeguarding LGBTQ+ Youth Online?

When stories like mine surface, people often grapple with whom to hold responsible. Is Grindr at fault for letting a minor onto an adult app? Were my parents to blame for not knowing I had downloaded it? Or could Apple have done more to keep a 15-year-old off an adult-only platform?

To think this through, I picture my puppy, Stormy. If she fell ill because of a harmful chemical in a chew toy, my first question would be why the toy wasn’t adequately tested for safety. My next would be why it was allowed on the shelf in the first place.

That same reasoning is often missing when technology companies shuffle blame between app developers and marketplaces. Both Apple’s App Store and Grindr had a duty to keep me safe.

### The Demand for Enhanced Protections

Grindr has since introduced new safety and privacy measures, but many users report a diminished experience. Apple, meanwhile, has put child safety protocols in place in its App Store, though they fall short. Restricting kids from downloading inappropriate apps is a step forward, but it does nothing about developers who mislabel their products.

Apple promotes its App Store as a *“safe environment for children,”* yet a report I co-authored with *Heat Initiative* and *ParentsTogether Action* identified over 200 potentially dangerous or inappropriate apps categorized as suitable for children as young as four.

### A Call for Responsibility

I believe Apple’s App Store will continue to put young people at risk unless it requires independent review of app age ratings—much as films, TV shows, and video games are rated by outside organizations. Those reviewers should assess the risks to minors and assign age ratings that put their welfare first, not the financial interests of Apple or the app developers.

Big Tech must be held accountable for building safe digital environments. App marketplaces need thorough reform, and developers must prioritize safety, including robust age verification. And because many of these services can be reached through a web browser without downloading an app at all, regulation is all the more urgent.

Lives are on the line.

Whether it is a mother like Kristin Bride, who lost her son to cyberbullying on anonymous platforms, or queer teenagers assaulted after sneaking out to meet people from hookup apps, Big Tech’s negligence is causing real harm. This must change.

**About the Author:**
Lennon Torres is an LGBTQ+ advocate who first gained national attention as a young dancer on television. Driven by a passion for storytelling, advocacy, and politics, she now works on online child safety at *Heat Initiative*, connecting digital safety with LGBTQ+ representation.

[Connect with Lennon on LinkedIn](https://www.linkedin.com/in/lennon-torres-325b791b4/)