Social Media Health Warnings
In a trial covered by Platformer, KGM’s lawyers likened social media platforms to a “digital casino,” offering visitors irregular dopamine hits via infinite scroll, autoplay videos, beauty filters, and algorithmic recommendations. Slot-machine-like features, they called them. If anything, the comparison is unfair to the casinos; at least a casino has an exit.
I’ve been writing about the deliberate design of these platforms for years now, and the language has always been the sticking point. We talk about social media like it’s weather, something that just happened to us, and the platforms love that framing. They present problems as external challenges they’re working hard to address, when the reality is that engagement-maximising features aren’t bugs. The infinite scroll isn’t broken; it’s working exactly as intended.
The trial coverage mirrors the tobacco playbook. The comparison to cigarettes has been made before, and it’s apt, but there’s one area where we’ve been slower to act. Tobacco companies were eventually forced to slap health warnings on their products: graphic images of diseased lungs on the packet, stark text telling you this thing will kill you. It took decades of legal battles and mountains of research, but it happened. Why aren’t we talking about the same thing for social media?
I don’t mean some buried disclaimer in a terms of service document that nobody reads. I mean actual, visible, unavoidable warnings. Open Instagram and get a clear message: this app is designed to keep you scrolling longer than you intend; this content is selected by an algorithm optimised for engagement, not your wellbeing; your usage data suggests you’ve spent three hours here today. None of that is radical; it’s just the truth presented without the corporate spin.
The obvious pushback is that social media isn’t the same as cigarettes, that people get genuine value from these platforms: connection with friends, community groups, small businesses reaching customers. That’s all true, but cigarettes gave people genuine value too. Stress relief, social bonding, a reason to step outside and talk to strangers. The value of the product was never the question. The question was whether the companies manufacturing it had a responsibility to be honest about what it was doing to people, and the answer turned out to be yes.
Meta’s own internal research showed Instagram made teen girls feel worse about their bodies. That isn’t speculation or moral panic; it’s their own data, which they tried to bury. The algorithms are designed to maximise engagement at the expense of everything else, and the companies behind them have zero incentive to change without external pressure. They’ll tinker around the edges, announce new “wellbeing features,” and hire more PR staff, but the core product remains a machine built to exploit your attention for advertising revenue.
A health warning wouldn’t fix everything; it wouldn’t redesign the algorithm or break the advertising model. But it would force these companies to publicly acknowledge what their products do, and it would shift the conversation from “are social media platforms harmful?” to “how harmful are they, and what are we going to do about it?” That’s a very different starting position.
Unfortunately, the lobbying power of these companies makes Big Tobacco look like amateurs, and there’s a whole generation of politicians who still don’t understand what an algorithm is, let alone how to regulate one. But hearing social media compared to casinos in a courtroom feels like something has changed, and if the infinite scroll really is going on trial, maybe the verdict will eventually include a warning label.