Should Addictive Social Media Algorithms Still Be Legal?

Almost half of UK teenagers say they feel addicted to social media, according to a long-term study from the University of Cambridge. The research tracked about 19,000 people born in the early 2000s, with 7,000 answering questions about their digital use. Of those, 48% agreed they were addicted. Girls reported higher levels, with 57% saying they felt hooked, compared with 37% of boys.

Researchers clarified that these self-reports are not the same as medical diagnoses. Georgia Turner, who led the analysis, said that feeling out of control is enough to be troubling. Many of those surveyed felt they did not have agency over their use of apps, which suggests a problematic relationship with social media.

Globally, figures from consumer insights platform GWI show that 27% of adults worry they spend too much time scrolling. One in 10 say they scroll for three to four hours daily, while 4% spend more than 10 hours on apps. Gen Z leads the pack when it comes to digital detoxing, with 34% trying to cut screen time this year.

 

What Makes It Addictive?

 

Social media platforms do not rely on chance to keep people hooked. They use algorithms that decide what appears in a person’s feed. In earlier years, the most popular posts gained visibility through likes or shares. Today, algorithms build a personalised feed from each person’s online behaviour.

These systems monitor which videos are watched, which posts are liked and which adverts draw clicks. Based on this data, they predict what content is most likely to hold attention. This has made the apps more immersive and raised the risk of people forming addictive habits tied to specific platforms such as Instagram or TikTok.
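As a rough illustration of the loop described above — and emphatically not any platform's actual system — the idea of scoring candidate posts against a user's past behaviour can be sketched in a few lines of Python. The post schema, topic labels, and scoring rule here are all hypothetical:

```python
from collections import Counter

def rank_feed(posts, interaction_history):
    """Rank candidate posts by a crude engagement prediction.

    posts: list of dicts with a 'topic' key (hypothetical schema).
    interaction_history: topics the user previously engaged with.
    """
    # Count past engagements per topic: the more a user has interacted
    # with a topic, the higher similar posts are scored.
    topic_affinity = Counter(interaction_history)
    total = sum(topic_affinity.values()) or 1

    def predicted_engagement(post):
        # Probability-like score: this topic's share of past engagement.
        return topic_affinity[post["topic"]] / total

    # Most engaging content first -- this is the personalising loop
    # that critics argue reinforces compulsive use.
    return sorted(posts, key=predicted_engagement, reverse=True)

posts = [
    {"id": 1, "topic": "news"},
    {"id": 2, "topic": "fitness"},
    {"id": 3, "topic": "gaming"},
]
history = ["gaming", "gaming", "fitness"]
print([p["id"] for p in rank_feed(posts, history)])  # → [3, 2, 1]
```

Real recommender systems use far richer signals (watch time, dwell time, click-through models), but the structural point is the same: the feed is sorted by predicted attention, not by chronology or declared preference.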

The US surgeon general has even issued a public health advisory on the risks to young people’s mental health. Algorithms are designed to keep eyes on screens for longer, which feeds directly into this debate about whether social media can be classed as addictive.

 

Who Feels The Effects Most?

 

Teenage girls are especially vulnerable, according to the Cambridge research. Their need for social validation and their impulsive behaviour were identified as the main reasons for this. Researchers also said that children’s developing brains make them more prone to harmful self-comparison when exposed to filtered images and videos.

In the UK, 62% of 13- to 15-year-olds use TikTok every day, a 12% increase from 2022. Among adults, 34% now use the app, and two-thirds of them log in daily. Such high use, paired with feelings of lost control, has made social media addiction a growing topic of study.

 

Should These Algorithms Be Legal?

 

Experts have shared their thoughts on how the law should treat these addictive algorithms, now that their impact on the brain is acknowledged…

 

Our Experts:

 

  • Sebastian Ellis, Managing Director, Ellis Digital
  • Christopher Migliaccio, Attorney & Founder, Warren & Migliaccio, L.L.P.
  • Adrienne Uthe, Founder & Strategic Advisor, Kronus Communications
  • Leah Jacobs, Founder and Director, Digital Wellness Project
  • Ankit Gupta, Senior Security Engineer, Exeter Finance
  • Ben Michael, Attorney, M & A Criminal Defense Attorney
  • Matt Hall, Online Visibility Strategist, Host, Guinness World Record-breaking podcast, Success School
  • Phil Parkinson, Creator, Know Thyself, The Game of Self-Discovery
  • Megan Dooley, Head of Organic Social, TAL Agency
  • Jules Brim, Social Media Expert and Business Consultant
  • Sarah Jeffries, Founder, Paediatric First Aid
  • Jay Yu, Estate Planning Lawyer, Yu and Yu Law
  • Laura Riley, Clinical Director, Rolling Hills Recovery Center
  • John Young, Principal Consultant, TSG Training

 

Sebastian Ellis, Managing Director, Ellis Digital

“Social media algorithms are not neutral tools. They are carefully designed to capture and hold attention for as long as possible, feeding users an endless stream of personalised and engaging content. The platforms encourage what has become known as doom scrolling: this constant drip of stimulation taps into the same reward circuits as gambling, which is why so many people struggle to put their phones down and walk around the house like zombies!

“The effect is not simply wasted time; it’s worse – what many describe as ‘brain rot’, where attention spans shorten, sleep is disrupted and people report feeling more anxious and less able to focus.

“A clear sign of how far this has gone is that content creators now insert other videos into their own videos just to keep viewers engaged, because even seconds of stillness risk losing attention – crazy, right?

“Young people in particular are very vulnerable, because their brains are still developing and the compulsive design of these systems keeps them coming back in an addictive way. At present there is very little oversight of how these algorithms are run, yet their impact on mental wellbeing and on society is increasingly visible – especially on social media platforms that are not run or managed within the USA.

“Unless we treat them with the same seriousness as other addictive systems, we will potentially see long-term harm to users.”

 

Christopher Migliaccio, Attorney & Founder, Warren & Migliaccio, L.L.P.

“When asked if addictive algorithms should be illegal, I begin with the fact that our laws inherently lag behind technology. The First Amendment protects platforms broadly, but if the coding of an algorithm is intended to exploit neurological triggers, I would argue it crosses the line into product liability and consumer protection. The algorithm is purposely keeping users scrolling despite their better judgment; in a court of law, we would probably call that foreseeability of harm, much like the warnings about cigarettes.

“The damage is not hypothetical. I have worked with clients whose teenagers lost months of school, adults who stopped working altogether, and I have seen evidence of algorithm-induced consumption exacerbating obvious mental health conditions. That is addiction by any practical definition, even if the DSM has not stamped it. The legal world has precedent in this area: for years tobacco companies claimed their product was just a consumer choice. Ultimately we recognised the way companies manipulated human vulnerability as actionable.

“I expect that we will start to see litigation that changes the way we think about these algorithms – not as neutral tools, but as designed hazards. If juries start to accept that framing, and punish based on it, corporate liability will take on a whole new meaning. Until the politicians figure it out, platforms will hide behind the idea of free speech, and the real question is whether we value businesses that build their models on compulsive design. I suspect we are losing patience.”

 

Adrienne Uthe, Founder & Strategic Advisor, Kronus Communications

“Social media algorithms, and social media in general, have fundamentally changed the world forever, and the impact is only getting worse. The effect on kids, teenagers, and even adults is astounding. These addictive algorithms are not a healthy or good thing for users.

“I see these addictive algorithms similarly to how I view sponsored content: they need to be disclosed. It is not public knowledge that the largest tech companies have entire departments dedicated to understanding how the human brain works to keep people hooked on these platforms. I believe in a free market, but this level of psychological manipulation must be disclosed so that every user is aware of what they are engaging with.”

 

Leah Jacobs, Founder and Director, Digital Wellness Project

“The way social media algorithms currently operate points to an urgent need for policy reform. Algorithms are shaping human behaviour, especially among youth, by reinforcing compulsive patterns that present as addictive behaviours. These algorithms are not neutral. They are intentionally designed to overtake and hold attention, offer instant gratification, and create feedback loops that can undermine focus.

“The addictive character of these algorithms stems from their ability to exploit the brain’s dopamine system, feeding users a continuous stream of personalised content.

“Banning algorithms may not be practical; however, allowing them to operate as is poses a public mental health concern. At the very least, digital media users deserve platforms with transparency and ethical oversight. Consent should also be a significant consideration, allowing users the ability to opt in or out of algorithmic content. The current model places profit over psychological well-being, and that has to change.”

 

 

Ankit Gupta, Senior Security Engineer, Exeter Finance

“Social media algorithms have already reshaped how people consume information and interact with the world around them. By prioritising content that keeps users engaged for longer, these systems are designed to maximise attention. Over time, this has created patterns of behaviour that mirror what we see in addictive cycles. People often find themselves scrolling far longer than they intended, checking apps reflexively, or feeling anxious when they are disconnected.

“Experts generally agree that the impact is significant. Algorithms influence not only the frequency of use but also the emotional states people experience while online, whether it is outrage, validation, or a sense of belonging. In the same way that addictive substances can rewire reward pathways in the brain, algorithm-driven platforms are shaping habits and even identities. The effects are not uniform for everyone, but the trend suggests that algorithmic influence is among the most powerful behavioural forces of our time.”

 

Ben Michael, Attorney, M & A Criminal Defense Attorney

“I think pretty much everyone would agree that addictive algorithms aren’t exactly a good thing, and most would agree that they can certainly cause harm. However, that doesn’t necessarily mean they should be illegal. Looking at it purely from a legal perspective, a variety of issues immediately come to mind. A big one: how would an ‘addictive algorithm’ actually be defined? And how would you prove it?”

 

Matt Hall, Online Visibility Strategist, Host, Guinness World Record-breaking podcast, Success School

“It’s not as simple as saying that addictive algorithms should no longer be legal, in my opinion. From the beginning of time, marketing has relied on grabbing and keeping people’s attention – whether it’s a bold font on a flyer or a TikTok designed to stop the scroll, they’re all doing the same thing. This is what gets businesses in front of their audience, and ultimately helps the customer (since the business has a problem to solve).

“It’s the wrong approach to think we need to ban something businesses rely on to reach and help people, which is what marketing is all about. However, we know that social media addiction can have a huge impact on our mental health, affecting our mood, our decision-making and how we feel about ourselves.

“Instead of banning the algorithm, we should focus on education at an individual level. By equipping everyone with information about screen time and its mental health implications, we can help people make informed decisions about their usage.”

 

Phil Parkinson, Creator, Know Thyself, The Game of Self-Discovery

“These platforms are expertly engineered to keep us hooked, but at what cost? Chronic social media use is linked to heightened anxiety, depression and poor sleep, especially among young people. Far from bringing us all closer together, social media is making us more isolated.

“But what’s brilliant is that we’re starting to see people rebel against the tyranny of algorithms, reclaiming real-world connection and turning away from digital fatigue. Nearly three-quarters of Gen Z say they feel lonely sometimes or always, despite being the most digitally connected generation – and that’s driving a shift back to tangible, shared experiences. The global board game market is growing at almost 10% annually as people rediscover the joy of face-to-face play, and crafts like knitting, pottery, and cooking clubs are booming among young people, with a resurgence in phone-free events as antidotes to screen time. It’s clear that people are hungry for presence, community, and experiences that can’t be replicated by a screen.

“I’ve even designed a board game specifically for the purposes of bringing people together in the real world – away from the addictive pull of social media – because I feel so strongly that it’s actively damaging our mental health and sense of connectedness as a society.

“Ultimately, if we wouldn’t allow a company to sell cigarettes to children because they’re addictive, why are we permitting tech companies to exploit our brains for profit? It’s time for us to prioritise human connection over digital distraction and reclaim our wellbeing.”

 

Megan Dooley, Head of Organic Social, TAL Agency

“Social media algorithms are no longer just background code. They’ve become key players in shaping how billions of scrollers consume content, think, and even behave once they step away from their screens. By relentlessly optimising people’s feeds purely for engagement – because that’s what keeps us scrolling – these systems create neverending feedback loops that reward outrage, novelty, and emotional intensity over balance, perspective, and genuine thoughts. The result isn’t simply more screen time; it’s rewiring our attention patterns, where constant stimulation by screens starts to mirror addictive behaviours.

“Behavioural experts are becoming increasingly aware of similarities between social algorithms and the psychology behind slot machines and gaming apps; think the variable rewards, the unpredictable hits of validation and dopamine, and the sense of urgency that keeps people hooked. While not everyone will be vulnerable, the scale is undeniable – social media algorithms don’t just serve content, they shape cultural narratives, influence decision-making, and, in some cases, start to negatively impact our wellbeing. The conversation has started to shift from whether algorithms impact people, to the severity of how they already do.”

 

Jules Brim, Social Media Expert and Business Consultant

“Social media algorithms have without doubt shaped the way we live, work, and connect. On the positive side, they’ve enabled small businesses, creators, and individuals to find their audiences and build communities in ways that simply weren’t possible before.

“The intention was always to surface relevant, useful content, but over time, the algorithms have shifted towards maximising time spent on platforms rather than maximising value.

“The result is something closer to an addiction model than a connection model. The constant cycle of scrolling, notifications, and “what’s next?” dopamine hits mirrors the behavioural hooks we see in other forms of addiction. For adults, this often means increased distraction, shorter attention spans, and a deterioration in meaningful engagement. For children and teenagers, the risks are even greater as they are still developing their sense of self and resilience, and algorithms that prioritise engagement at all costs can distort their self-image and mental wellbeing.

“That’s not to say algorithms should be scrapped entirely. When well-managed, they remain powerful tools that help businesses reach the right customers and allow people to find communities of support and belonging. But the balance is currently off and what’s urgently needed is stronger regulation and platform accountability to manage the addictive design features, particularly around younger users, and a shift in emphasis back towards quality connections rather than endless consumption.

“While we now have a whole world available at our fingertips, it’s just as important to prioritise genuine human connections and spend time together in real life.

“In short, algorithms themselves are not the problem, it’s how they’re being used. If we continue to allow addictive models to go unchecked, we risk eroding the very real benefits social media can bring to both personal and professional life.”

 

Sarah Jeffries, Founder, Paediatric First Aid

“Children’s brains aren’t fully developed, so they can’t self-regulate the way adults do. When your child is on social media, they’re facing algorithms engineered to hijack attention through endless scrolls, autoplay videos, and personalised notifications. It’s like leaving a plate of sweets in front of your child all day – the system is built to exploit impulse. Every click, swipe, or pause is logged and fed back into the algorithm, which then serves up more of the same. The loop reinforces behaviour, and pulls kids into cycles of compulsive use before they’ve had the chance to build the tools to step away.”

 

Jay Yu, Estate Planning Lawyer, Yu and Yu Law

“If an algorithm knowingly exploits addictive patterns, it raises serious legal questions similar to those that tobacco companies faced decades ago. The key issue is whether platforms have a duty of care to prevent harm to their users. Without clear rules, companies can design systems that maximise engagement while externalising the social and psychological costs. Legal intervention could be necessary to define that duty, set boundaries for exploitative design, and ensure that users are protected from harm caused by algorithms engineered to keep them hooked.”

 

Laura Riley, Clinical Director, Rolling Hills Recovery Center

“The behavioural patterns I see around social media use mirror classic addiction signs. People describe cravings to check feeds, losing control over time spent online, and neglecting other activities – the same symptoms seen in substance abuse and gambling disorders. Algorithms intensify this by engineering a constant cycle of reward and compulsion.

“Every scroll, like, and notification is engineered to trigger another hit of engagement, keeping users hooked longer than they intend. From my perspective, it’s hard to argue these systems are anything less than addictive mechanisms. Treating them otherwise means ignoring the evidence that they function in ways almost identical to recognised forms of addiction.”

 

John Young, Principal Consultant, TSG Training

“Algorithms aren’t inherently addictive, but they are optimised for engagement, which often means keeping people scrolling longer than they intended and sometimes affecting their wellbeing. The problem isn’t legality, it’s transparency. Without knowing how these systems are trained and tuned, people are at the mercy of black-box designs that prioritise time on platform over meaningful experience. Smarter regulation could push platforms to create algorithms that focus on meaningful interaction instead of compulsive use, making online spaces healthier without taking away value for users.”