When harm is engineered

Eleanor Pinugu

There was a time when speaking up against the harmful effects of cigarettes was dismissed as alarmist. In the early 20th century, smoking was a ubiquitous habit heavily reinforced by the media. TV shows, movies, and books (even those for children) commonly featured characters lighting up. When research linking smoking to cancer began to emerge in the 1940s, the tobacco industry responded by publishing its own studies claiming that cigarettes were safe. One infamous advertisement from this period showed doctors endorsing Camel as their preferred cigarette brand.

As evidence accumulated, smoking was eventually recognized as a significant health hazard, leading to various lawsuits and shifts in behavior. Although cigarettes are still legally available, companies are now required to include warning labels and graphic health warnings on packaging to communicate their risks. The term “big tobacco moment” now refers to a turning point where a major industry faces legal, regulatory, or public liability for knowingly causing public harm.

A recent ruling by a Los Angeles jury holding Meta and YouTube liable for designing addictive platforms for children is being labeled by various advocacy groups as a potential “big tobacco moment” for the social media industry. The case centers on a young woman named Kaley, who testified that she had become addicted to YouTube at age 6 and to Instagram at age 9. By age 10, she had become depressed and started self-harming. At 13, she was diagnosed with body dysmorphic disorder and social phobia.

Her lawyers assert that the companies intentionally developed features designed to keep users engaged and make the platforms addictive, such as infinite scroll, autoplay, algorithmic recommendations, and beauty filters. By identifying these features as “design defects,” the court recognized that these platforms are not passive environments but systems meticulously engineered to shape behavior and, in some cases, cultivate dependency. We can now ask whether harm was foreseeable, whether safer alternatives existed, and whether these social media companies failed to act despite knowing the risks.

This reframing arrives at a moment when there is growing evidence linking digital environments to youth mental health concerns. Recent research on the impact of problematic smartphone use on young people’s brain function suggests that it heightens their sensitivity to social and emotional stimuli while weakening the brain systems responsible for self-control, reflection, and regulation. In effect, young users may feel more intensely and become more dependent on social feedback (likes and validation) while becoming less equipped to manage those emotions. This imbalance makes them more vulnerable to mental health challenges.

This concern was echoed in the 2026 World Happiness Report. Drawing on surveys of young people, parents, teachers, and clinicians, as well as corporate documents and existing studies, the researchers concluded that there is “overwhelming evidence of severe and widespread direct harms.” Apart from addiction, they cited cyberbullying, sexual harassment, and sextortion, alongside “compelling evidence of troubling indirect harms,” such as anxiety, depression, and sleep deprivation. The report argued that these harms may no longer be confined to individuals but may now extend across entire populations.

Critics will argue, as the companies themselves have, that rising mental health challenges among young people are too complex to be attributed to a single cause. While this is true, it does not absolve social media platforms of responsibility. When billion-dollar companies invest in behavioral engineering to maximize time spent and reward constant engagement, the asymmetry between their tactics and the self-control of users, especially young people, is clear. The world embraced social media before its potential harms to our well-being were understood. Now that more evidence is coming to light, companies must confront not only the harmful content they enable, but also the lack of accountability in how their products are designed.

Perhaps the most immediate consequence lies in the more than 2,000 similar cases waiting to be tried, as well as the growing interest among governments in regulating social media platforms and young people’s online use more effectively. The deeper shift that needs to happen, however, is cultural. Will we continue to celebrate growth, engagement, and virality as markers of success in the digital economy?

Social media will remain a central part of modern life, but it must be understood and treated as a set of choices that carry consequences. If we now know that spending excessive time online can impact our self-esteem, well-being, and quality of life, then this should prompt a more deliberate reconsideration of how we engage with these platforms and how we hold the companies behind them accountable.

If social media platforms are strategically designed to shape behavior, then their impact cannot be dismissed as incidental. And if harm can be engineered, then responsibility cannot be ignored.

eleanor@shetalksasia.com
