Expert Opinions: The Impact Of Lowered Age Restrictions On Meta

Meta has recently lowered its age restriction on WhatsApp from 16 to 13, a move that has drawn criticism from social media experts, parents and organisations. The update reads, “You must be at least 13 years old (or such greater age required in your country) to register for and use WhatsApp.” Smartphone Free Childhood, a campaign that began in the UK to promote children’s safety around smartphone use, is among the groups that have commented on the change.

Daisy Greenwell, the campaign’s co-founder, commented: “WhatsApp are putting shareholder profits first and children’s safety second. Reducing their age of use from 16 to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike. This policy boosts their user figures and maximises shareholder profits at the expense of children’s safety.”

Their official Instagram post on the matter added, “Officially allowing anyone over the age of 12 to use their platform (the minimum age was 16 before today) sends a message that it’s safe for children. But teachers, parents and experts tell a very different story.

“As a community we’re fed up with the tech giants putting their shareholder profits before protecting our children. And we won’t let them get away with it for much longer.”


What Do Other Experts Have To Say?


We asked experts to comment on the potential consequences, and possible benefits, of this age restriction change. Their comments add to the ongoing discussion of how online child safety is being addressed in the UK and the EU, as well as around the world.


Our Experts:


  • Tom Stone, Co-founder, re:act
  • Hayley Jones, Director of Social, The PHA Group
  • Lauren Hendry Parsons, Privacy Advocate, ExpressVPN
  • Chelsea Hopkins, Social Media and PR Manager, Fasthosts
  • Paul Bischoff, Consumer Privacy Advocate, Comparitech


Tom Stone, Co-founder, re:act



“Lowering age restrictions on social media platforms raises concerns regarding the adequacy of safeguarding measures to protect younger users. The primary focus revolves around ensuring platforms are equipped to handle this shift in demographic effectively. For such fast-moving platforms where algorithms change on a daily basis, parents are going to want to know that the processes are properly set up to deal with this movement.

“Failure to address these concerns could lead to criticism directed not only at the platforms, but also at brands whose advertisements might bypass safety measures. In a world where consumers make up their mind 0.3 seconds after interacting with a brand online, this sort of quickly driven negativity due to incorrect safeguarding measures could be costly for all parties.

“Consequently, there’s a pressing need for platforms to prioritise the development and implementation of comprehensive safeguards to mitigate potential risks associated with younger users accessing their services.”


Hayley Jones, Director of Social, The PHA Group



“Meta’s decision to lower the age restriction on their platforms from 16 to 13 is a significant development and one that requires careful consideration. While it undoubtedly opens up access to a broader audience, it also raises important considerations regarding online safety in a space that may expose young children to trolling and other harmful experiences.

“On the flip side, lowering the age restriction has the potential to empower younger users to engage with social media in more meaningful ways, fostering creativity, connectivity, and learning opportunities. Gen Zers, known as the ‘chronically online generation’, are known for working, shopping and dating online significantly more than previous generations. This places even greater responsibility on Meta to ensure robust safety measures are in place to protect these vulnerable and frequent users from harmful content, cyberbullying, and exploitation.

“This decision must be underpinned by a wider education strategy and the importance of digital literacy both in schools and at home. Younger users need guidance on navigating the complexities of the online world, understanding privacy settings, recognising misinformation, and developing healthy digital habits.

“As professionals in the social media industry, it’s our duty to advocate for policies and practices that prioritise the well-being of all users, especially as platforms evolve and demographics shift. By striking a balance between accessibility and safety, we can help create a more inclusive and responsible digital landscape for generations to come.”



Lauren Hendry Parsons, Privacy Advocate, ExpressVPN



“With WhatsApp lowering the minimum age for the app in the UK from 16 to 13, I support the calls for Meta to reconsider this ‘tone-deaf’ decision – and show what protections it is putting in place to manage the potential negative effects of allowing younger people into the Meta ecosystem.

“Lowering the minimum age of social media and messaging platforms puts vulnerable children at risk of online harm. Research from ExpressVPN found that exposure to inappropriate content is one of the top disturbances for children online. From graphic, violent, and disturbing imagery to sexually suggestive material, access to the internet and messaging platforms can expose children to content that can be deeply unsettling and harmful.

“It is important for parents to take appropriate steps to keep their children as safe online as possible. The number one tool in your arsenal is education, together with a strong foundation of trust. Education is vital, whether it’s on how they can be a responsible digital citizen, why it’s important to be open and honest about what they see or experience on social media or messaging platforms, or how they can use the internet safely.”


Chelsea Hopkins, Social Media and PR Manager, Fasthosts



“The move from Meta to lower the age restriction on WhatsApp is definitely a bold one; however, it’s in line with other social media platforms like X and TikTok, which also require users to be aged 13. The problem is this move comes at a time when people are calling for the age restrictions on these apps to be increased, not decreased, unsurprisingly causing a bit of an outcry, especially from child protection groups.

“X in particular has become a hotbed of controversy due to perceived poor moderation and worries it’s exposing children to graphic content, with protests surrounding that issue now spilling over into Meta’s decision and further fuelling the fire.

“The decision to lower the age restrictions effectively boils down to the company wanting more users on the platform, and while Meta continuously pushes the message that child safety is its top priority, many aren’t convinced. We’re likely to see a lot of pushback from concerned parents calling for the decision to be reversed and/or stronger parental controls to be put in place, as well as the debate around social media age restrictions as a whole heating up.

“However, some parents may actually be glad of the change, as WhatsApp would be their platform of choice for contacting their children over other, similar messaging apps like Facebook Messenger that are tied to accounts they won’t want their children using.”


Paul Bischoff, Consumer Privacy Advocate, Comparitech




“Lowering the age from 16 to 13 brings restrictions in line with existing child protection laws like COPPA, which are designed to protect children aged 13 and under. I also surmise that children aged 13 and under are less likely to willingly bypass age verification measures.

“Age verification on the internet is a longstanding problem for which no good solution currently exists. Verifying someone’s age requires verifying their identity, which has serious privacy implications and forces private companies to be legal gatekeepers. It’s especially difficult with kids, who often don’t have state-issued IDs.

“Social networks would rather do the minimum amount of due diligence and allow kids to lie about their age. Meta is lobbying hard to force app stores to be legally responsible for age verification, shifting the onus of responsibility from itself to Apple and Google.”