As pressure mounts from worried parents and lawmakers over the safety of minors online, a major legal fight is taking shape in Washington, D.C., centered on a deceptively simple question: Who is responsible for verifying user ages on social media and other online platforms?
As Emily Birnbaum reports for Bloomberg, the fight is intensifying now that a new lobbying group, backed by major technology firms including Meta, Spotify, and Match Group (the parent company of Tinder and Hinge), has entered the fray. The group, known as the Coalition for a Competitive Mobile Experience, wants to shift accountability: it argues that Apple and Google, which run the app stores, should handle age verification before apps are downloaded.
Unsurprisingly, Apple and Google disagree. They argue that because app developers collect and manage user data, the responsibility for age verification should fall on the developers themselves.
For parents, how this fight is resolved could shape who bears responsibility for keeping kids safe online, beyond the role parents already play.
A New Front in the Age Verification Debate
The Coalition for a Competitive Mobile Experience is led by antitrust lawyer Brandon Kressin, who has previously worked with Match Group. The coalition reflects a broader push by app developers against Apple and Google's dominance of the mobile ecosystem. Its goal is to lobby for state and federal laws that would require app stores, rather than individual apps, to handle age verification.
For now, U.S. rules on age verification are still taking shape. So far, 18 states have passed laws requiring adult websites to verify users' ages. In response, some platforms, including Pornhub, have chosen to block access in those states rather than navigate the complicated and sensitive process of verifying user identities.
In addition to pushing for new legislation, the coalition has pledged to back antitrust lawsuits against Apple and Google. Many developers have long argued that the companies' app stores impose anticompetitive restrictions and burdensome requirements on software makers.
Momentum Builds for Age Verification Laws
If the coalition's efforts succeed, the burden of verifying user ages could shift from individual apps to the app stores themselves. That would simplify things for developers, but it also raises new questions about data privacy, free speech, and how such rules could realistically be enforced at scale.
The implications extend beyond social media. OpenAI, for example, relies on an honor system to ensure that minors get parental permission before using tools like ChatGPT, a system that is far from foolproof. As parents and lawmakers demand stricter controls, pressure is mounting on tech companies to come up with something more effective.
Some states are already taking action. In March, Utah became the first state to pass a law requiring app stores to verify user ages. Under the law, users over 18 must present government-issued identification to access certain apps or be denied access. North Carolina is considering similar legislation aimed at users under 16, though it leaves open the question of who would actually carry out the verification.
Meanwhile, federal lawmakers are preparing to introduce legislation modeled on Utah's law, a sign that the age verification debate is only getting started.
As the legal and political battles play out, one thing is clear: the question of who should verify user ages online is no longer just a technical one. It has become a central test for the future of digital safety and accountability.