
Summary
Social media’s addictive design is not an accident but a business model. Just as Ralph Nader’s campaign in the US made all cars safer, we need safety rules for online platforms. We can protect vulnerable users without slowing innovation in the digital economy.
In Indian boardrooms, ‘AI-enabled’ could overtake ‘digital transformation’ in its buzz quotient. If it’s new, app-based and comes with a valuation expressible in unicorns, it must be good, right?
Well, India has always had a muscular pro-innovation bias. From UPI to ONDC, from Aadhaar to account aggregators, it has shown the world that technology can leapfrog infrastructure.
But history reminds us that progress occasionally has unintended consequences. The byproduct of a genuinely useful innovation can be harmful. Worse still, creators of successful digital technologies tend to gain an outsized influence over social behaviour. If a business’s market cap rivals a state budget, it can subtly script culture.
In 1965, in the US, a young lawyer named Ralph Nader wrote Unsafe at Any Speed, accusing automakers of prioritizing style and profits over passenger safety. The outrage that followed led to what became the National Highway Traffic Safety Administration. Seat-belts, air-bags and crash standards became industry norms and death rates dropped.
Even those allergic to regulation grudgingly admitted that not flying through a windshield was a net positive for shareholder value.
Fast-forward to 2026. Replace tailfins with touchscreens, horsepower with engagement metrics, crash fatalities with spiraling teen anxiety. We may well be at an ‘unsafe at any age’ moment for social media and tech platforms.
In Los Angeles, major addiction trials have begun, with over 1,600 plaintiffs alleging that social-media firms deliberately engineered addictive features that expose children to harm.
The legal strategy is clever. Rather than citing content, always shielded by Section 230 in the US, the plaintiffs argue defective product design. The harm, they claim, flows from infinite scroll, auto-play, algorithmic amplification and relentless notifications designed to hijack attention.
India has over 800 million internet users, most of whom access it via smartphones. The median age hovers around 28. Demographically, India is the world’s largest behavioural experiment in pocket-sized dopamine delivery systems. When infinite scroll meets a teenager preparing for board exams, or an 11-year-old in Kota juggling coaching classes, the stakes are not theoretical.
Consider the architecture of choice. Behavioural scientist Eric Johnson, in The Elements of Choice, argues that choices are architected. Defaults matter. Friction matters. Presentation shapes decision. A ‘choice architect’ determines behaviour without issuing a command.
This is why buying a subscription is effortless but cancelling one is a maze. Why default settings overwhelmingly favour the company that sets them. Google doesn’t pay Apple billions to remain the iPhone’s default search engine to buy love. It buys inertia. In the digital economy, inertia is a revenue model.
Tech giants get it. Infinite scroll eliminates stopping cues. Auto-play abolishes pause. Notifications interrupt family dinners with the urgency of a calamity. Recommendation engines learn your preferences faster than your parents ever did. For adolescent minds, this is not neutral design. It is industrial-scale temptation.
Psychologist Jonathan Haidt, author of The Anxious Generation, has linked rising mental distress among young people to the smartphone era, sparking movements to restrict access for younger teens. In India, educational and parental conversation is growing around age-appropriate screen-time limits and gizmo bans. The debate is whether the digital playground has been designed without safety nets, not whether children should ‘play outside.’
Indian regulators are not strangers to bold digital policy. The Aadhaar rollout reshaped identity verification, UPI rewired payments and now a data protection law seeks to define privacy norms. Will India apply similar strategic clarity to the design of addictive digital environments now that the viral-reel debate has moved to what makes them irresistible and the motive behind it?
Regulating choice architecture is not anti-innovation, but pro-resilience. Automakers didn’t collapse when seat belts became mandatory. Safety regulation did not place any carmaker at a competitive disadvantage. Likewise, if all platforms are asked to implement clear stopping cues, opt-in notifications for minors, algorithm design disclosures and age gates to keep out underage users, innovation would continue. Exploitative success may get difficult, though.
India Inc need not wait for litigation to conduct a digital design audit. Map revenue dependence on cognitive vulnerability. Separate adolescent engagement from adult monetization.
Try ‘healthy defaults’ over addictive ones. Create ‘digital airbags’ before Parliament mandates crash tests for algorithms. The country’s policy focus must shift from content policing to design standards. The battlefield is not the meme but the mechanism. Rules on addictive features would level the playing field, protect minors and preserve competition.
Regulation need not slow innovation. But every engine needs brakes. As India focuses more on its digital economy, it should opt for platforms that amplify human potential and minimize human vulnerability.
The social media trial in the US may be America’s Ralph Nader moment. India would be wise not to wait for its own ‘Unsafe at any scroll’ headline. Seat-belts did not kill motoring. They made it survivable. It is time we demand the same maturity of the technologies shaping the minds of our youngest citizens.
The authors are, respectively, professor at Columbia Business School and founder of Valize; and co-founder of the non-profit Medici Institute for Innovation. X: @MuneerMuh
