The Personal Responsibility Argument: How Meta Keeps Dodging Regulation—and How to Stop It
Whistleblower reports, causal academic research, and the company’s own internal documents all make clear that Meta is harming our society, and that we must do something about it. But how have we let this happen?
It turns out there is a specific argument, used by many industries before it, that Meta now relies on heavily: the personal responsibility argument. If we can reject this argument as a society, we can beat Meta.

The company is currently facing a lawsuit brought by plaintiffs who accuse it of intentionally engineering its platforms to be addictive. A few weeks ago, when asked to comment, CEO Mark Zuckerberg responded, “If you do something that’s not good for people, maybe they’ll spend more time [on Instagram] short term, but if they’re not happy with it, they’re not going to use it over time. I’m not trying to maximize the amount of time people spend every month.”
At its core, this is the personal responsibility argument: if our products are harmful, the reasoning goes, users can simply choose to stop using them.
The argument works so well because it appeals to our national tendency to embrace rhetoric promoting autonomy and control—we like to see ourselves as champions of freedom. We tend to think of regulation as the enemy of freedom when, in reality, it can protect it. Without regulation, Meta will continue to control individuals and society through the immense power it holds over our thinking and decision-making.
The danger of the personal responsibility argument is that it puts too much accountability on the individual. We’ve reached a point where simply choosing not to use Instagram—or avoiding social media addiction altogether—has become extremely difficult. This is why so many of us, myself included, have felt ashamed while struggling against these technologies, as if we are the ones to blame. Meta hires thousands of people to carefully design intentionally addictive products, with our constant attention as their only objective. This is not a fair fight.
Fortunately, the personal responsibility argument is not new, although the stakes may be higher than ever. Similar debates have long played out, especially around the fast food and tobacco industries. Social media is a new threat, but the tension between society’s responsibility to protect people from harm and its commitment to giving them the autonomy to make their own decisions has always been present. Striking the right balance between these values is what makes regulatory policies successful.
In her book Unwired, lawyer and author Gaia Bernstein shows us a path forward by turning to a historical example of our journey towards successful regulation—the tobacco industry: “The personal responsibility argument broke down when it turned out smokers were not making autonomous choices and could not be held accountable. Revelations that corporations acted intentionally and covertly to addict their customers undermined the personal responsibility argument.”
If we want to be a country that truly values happiness and freedom, we must regulate the “druggified”1 technologies that intensify certain pleasures while ultimately diminishing our capacity for self-governance. To do this, we must reject the personal responsibility argument and recognize how these technologies are intentionally engineered to undermine the very freedom and happiness they claim to respect. Mandating algorithmic transparency and audits, imposing stricter screening of advertisers before fraudulent ads run, or requiring independent and transparent fact-checking systems would be a good start.
Founded in 2025, Reconnect Stanford is a Stanford student-led movement & non-profit dedicated to helping people step away from addictive platforms and toward meaningful connection, time, and attention. We build community around social media sobriety through stories, support for students who delete, peer-to-peer mentorship with middle and high schoolers, and events where peers disconnect together.
To learn more and support our cause, visit reconnectstanford.org. To submit a guest piece, email reconnectstanford@gmail.com.
About the author: Kai Lobell (guest author) is a sophomore at Stanford University studying data science. He is interested in understanding and improving how individuals and societies make decisions.
1 A term coined by Anna Lembke, MD, a professor and Medical Director of Addiction Medicine at Stanford University School of Medicine.



