
The Social Media Addiction Debate: An Op/Ed by Assistant Professor Kurt Wirth

March 04, 2026

Kurt Wirth is an Assistant Professor of Communications at Lasell University. 

The Addiction Debate Is a Distraction. The Real Issue Is Who Controls Reality.

When Instagram’s chief recently said he doesn’t believe people can be “clinically addicted” to social media, the comment predictably ignited backlash. But the debate over whether social media meets DSM criteria for addiction misses the point entirely.

To determine whether something is “addictive,” we have to define addiction. Using precedents in the DSM, a case could certainly be made. Behavioral dependencies already exist in recognized forms - gambling, for example. But even if social media were formally categorized as addictive tomorrow, that alone would not resolve the deeper question.

The real issue isn’t whether social media is addictive. It’s whether the systems that shape what billions of people see, believe, and understand about the world should operate without meaningful public oversight.

Media has always been engineered to capture attention. Newspapers refined headlines. Radio perfected pacing. Television mastered cliffhangers. Websites optimized “time on site.” Engagement is not an accident - it is the business model. Algorithmic personalization is not some sinister aberration; it is the logical culmination of an entertainment economy built around attention.
And in a profit-driven system, companies are doing exactly what they are incentivized to do.

That’s just economics.

The problem is that we never updated media regulation for algorithmic systems. We treated social media platforms as if they were neutral bulletin boards rather than powerful, dynamic information filters. Section 230 of the Communications Decency Act provided sweeping protections that made sense in a world of static message boards. It makes far less sense in a world where proprietary algorithms determine visibility at global scale.

Today, a small number of private companies control the architecture of attention for billions of people. Their systems decide which news stories trend, which political movements gain momentum, which creators rise, and which ideas fade. These systems are optimized for engagement - not truth, not civic health, not democratic stability.

That centralization of power and influence is historically unprecedented.

We can argue endlessly about whether scrolling behavior constitutes addiction. Meanwhile, the far more consequential reality remains largely untouched: public discourse is mediated through opaque, privately owned algorithms that the public cannot see, study, or meaningfully audit.

In my own classroom, I routinely ask students about their social media habits. Increasingly, they tell me they believe social media is unhealthy for them. They think they should use it less. And yet they don’t. They feel tethered to it - socially, culturally, professionally. This effect is less about willpower than it is about network effects, loss of agency, and systems designed to keep them engaged.

Children are clearly part of this conversation. There is little doubt that social media influences - and in many cases harms - young users. Anxiety, sleep disruption, body image pressures, and exposure to harmful content are real concerns. We should take short-term steps to mitigate those harms. Warning systems for excessive use, clearer parental tools, and age-sensitive defaults are reasonable starting points.

But focusing exclusively on children risks missing the structural problem.

The issue isn’t that social media is uniquely “evil.” It is that we have allowed the most powerful information distribution systems in human history to operate without the transparency standards we apply to other institutions that shape public life.

We regulate broadcast media. We regulate gambling. We regulate pharmaceuticals. We regulate tobacco. We do so not because these products are inherently immoral, but because they influence public health and public stability at scale.

Algorithmic content distribution now belongs in that category.

If algorithms determine what information is amplified, then those algorithms should not remain private trade secrets. We have, in other industries, decided that certain secrets no longer serve the public interest. Transparency is not radical; it is the price of power.

The solutions?

Content-ranking algorithms should be registered with regulators and made publicly accessible. Algorithmic feeds should be opt-in rather than the default, with chronological feeds always available as the baseline. And anonymized platform data should be made freely accessible to independent researchers so the public can evaluate the societal impact of these systems.

It is long past time to regulate social media the way we have regulated so many industries before it.

Regulation of this kind does not prevent companies from optimizing for engagement. It simply ensures that the public can see and study the systems shaping public discourse.

Right now, information is power - and that power is concentrated. In a democracy, that concentration demands scrutiny.

The addiction debate will continue. Lawsuits will move forward. Companies will defend their design choices as efforts to maximize user satisfaction. And they are correct - that is what they are doing.

But satisfaction optimization at planetary scale is not a neutral act. It shapes culture. It shapes politics. It shapes reality.

If algorithms now function as the architecture of public discourse, they deserve the same scrutiny as any other institution that shapes democracy.