Late last year, Australia became the first country in the world to ban children under 16 from accessing social media platforms, with the law imposing age assurance requirements on platforms. The UK government pledged in January to consult on a ban of its own, and it has now launched that consultation (a 76-page, 830KB PDF). It is asking stakeholders whether they would support a legal requirement to set a minimum age for accessing social media services, and is seeking feedback on whether any such measure should apply to children under 16 or to a younger age.
Alternative interventions are also under consideration, however. These include limiting daily usage time for individual apps, restricting nighttime access to individual apps, and imposing age-related restrictions on the use of certain features, such as personalized algorithms, features that allow children to make purchases within a service, and features that encourage extended use of a service.
The government is also seeking views, in a consultation that closes on 26 May 2026, on whether the age of digital consent under the UK General Data Protection Regulation (GDPR) should be raised from the current 13. The age of digital consent is 14 in Italy and Spain, 15 in France, and 16 in Germany and Ireland.
The government said it would act quickly “based on the evidence gathered” during the consultation, with a response expected this summer.
Lauro Fava of Pinsent Masons welcomed the fact that the government is considering a range of options rather than focusing solely on a ban.
“Many charities oppose a ban because they consider it is not in the best interests of children, a concept recognized in international law,” Fava said.
“Proponents of a ban perhaps over-emphasize children’s rights to privacy and safety while downplaying the many benefits that access to social media brings to children, particularly for their learning and creativity, as well as other relevant rights under international law, such as children’s rights to play, leisure and freedom of expression,” he said.
He said: “Implementing a ban comes with challenges, such as how to effectively address the risk of circumvention, how to manage the risks associated with young people accessing, at 16, digital environments they have no previous experience of, and how to ensure that vulnerable children retain access to support services.”
“More targeted interventions into specific features and functions of social media platforms, such as infinite scrolling, autoplay and personalized algorithms, would be a more appropriate response and would build on the obligations platforms already face under the UK framework. For example, under the UK GDPR, platforms must comply with requirements around privacy by design, fair processing and protections against profiling. Where there are concerns about children spending too much time on social media, platforms may also be expected to act on their children’s risk assessments under the Online Safety Act and to avoid the harmful uses of data flagged under the Information Commissioner’s Office’s Children’s Code,” Fava said.
“Many platforms have already moved on the issue of ‘stickiness’ and are making it easier for children and parents to limit screen time, despite the lack of conclusive research into the impact of screen time on children’s mental health. Platforms have an opportunity to urge the UK government to align any new measures with those already taken in the EU under the Digital Services Act, and with further regulatory interventions there, so that the technological measures that need to be rolled out across borders remain consistent,” he said.
“Importantly, children will be given a say on what they think about any future bans and restrictions. While some parents may welcome a ban as a solution to the problem of keeping up with rapidly changing technology and practices, children will have their own ideas about what is in their ‘best interests’ when it comes to balancing the benefits of accessing these services against the potential harms that may result,” Fava added.