A Safer Internet for All – how can it be done?


[Image: a neon blue shield hologram against a dark background of floating binary code in purple hues, conveying cybersecurity, data protection and digital security.]

This year, Safer Internet Day celebrates its 21st anniversary. The day's objective is to promote a safer and more secure online experience for every user, especially children and young people. How much the digital landscape has changed over that time needs little introduction: connectivity has moved from dial-up internet to prevalent Wi-Fi, and today's ubiquitous smartphone has been launched.

Technology's rate of change and accessibility have been exponential, whilst its agility and capacity to disrupt markets and leading operators have been extraordinary. Time and again, new start-ups have swept aside long-standing incumbents.

The positives of technology are staggering, and everyone can benefit from them. Yet the negative aspects of the technical revolution have at times tipped the balance and generated some damning headlines around the world. The Anxious Generation by Jonathan Haidt, which has sat on the New York Times bestseller list for the last 45 weeks, is critical of the impact of smartphones, social media and big tech on children's mental health.

Legislation has faced extreme challenges in keeping pace, but as we adapt to an age of generative AI, has the law finally been able to meet that test, future-proof itself and sustain a safer online environment for all?


One of the major difficulties for the law in regulating technology, whether hardware or software, is technology's global, pervasive nature, with the leading tech developers often headquartered in the US or Asia. In contrast, laws have borders and fixed jurisdictions, which are often shaped by cultural standards. The GDPR (General Data Protection Regulation (EU) 2016/679) caused quite a stir with its extraterritorial effect on businesses outside the region that target individuals in the EU. In practice, those businesses must comply with the GDPR when operating in the UK and EU, and must ensure that data subjects' personal data receives essentially equivalent protection when it is transferred outside the region. The regulation has gone on to influence copycat data protection standards around the world.

One striking aspect of the GDPR was that it gave children "specific protection" for the first time (despite the United Nations Convention on the Rights of the Child (UNCRC) being first adopted in 1989). This explicit reference has put children front and centre of the guidance, statutory codes and legislation tabled since. The UK Information Commissioner's Office's (ICO) Age-Appropriate Design Code (AADC) was "the first of its kind"; it has been replicated across EU Member States and is influencing change further afield.

Online Safety Legislation

In recent years, legal momentum has rapidly gathered behind digital regulation, especially online safety. In tandem with its digital strategy, the EU has brought in the Digital Services Act (DSA) to create a safer and more transparent online experience, whilst the UK has passed the Online Safety Act 2023 (OSA), designed "to make the UK the safest place in the world to be online".

Both Acts address illegal content, hate speech and misinformation, whilst giving online platforms greater obligations to take down, and be responsible for, illegal content. Users are empowered to report inappropriate content more easily, and platforms are obligated to examine and mitigate the risks their services pose, as well as be more transparent about their recommender systems and content moderation decisions. The Acts focus in particular on protecting users under the age of 18. This is distinct from the GDPR's digital age of consent, which ranges from 13 to 16 across the UK and EU Member States, thanks to a derogation allowing each Member State to set the age of consent within that range.

The DSA became fully applicable on 17 February 2024, and companies operating in the EU are prioritising the safety of their services and building age-appropriate systems for their different age ranges of users. The European Commission has opened formal proceedings against half a dozen very large online platforms, with investigations largely focused on platforms' design functions and on the way access is given to researchers. One of the Commission's first formal proceedings led TikTok to suspend the "Task and Reward Programme" of TikTok Lite in France and Spain and to pause its roll-out in other EU Member States.

Ofcom, the regulator for the UK's OSA, is describing 2025 as its year of implementation and action. Since the OSA was passed in late 2023, Ofcom has been undertaking a variety of public consultations, and with the recent publication of its statements on (i) Protecting people from illegal harms online and (ii) Age Assurance and Children's Access, services operating in the UK are beginning to see a timeframe by which compliance actions need to be completed.


| Ofcom document | Obligation(s) | Relevant date |
| --- | --- | --- |
| Statement: Protecting people from illegal harms online | Perform a risk assessment of the risks of illegal harms on your service. | 16 March 2025 |
| Illegal Harms Codes and guidance | Take the safety measures set out in the Codes. | Ongoing from 17 March 2025 (subject to completion of the Parliamentary process) |
| Statement: Age Assurance and Children's Access | All user-to-user and search services in scope must carry out a children's access assessment to determine if they are likely to be accessed by children. | 16 April 2025 |
| Statement: Age Assurance and Children's Access | All services that allow pornography must implement highly effective age assurance to ensure children are not normally able to access pornographic content. | End of July 2025 |
| Protection of Children Codes and guidance (expected April 2025) | Services will be required to conduct children's risk assessments, implement highly effective age assurance measures and be transparent and accountable about their services. | Children's risk assessment complete – July 2025; child protection safety duties enforceable – July 2025 |

Age Assurance

To date, children have often been able to access inappropriate content and services because the age assurance methods implemented have been ineffective. At present, no single age assurance model is guaranteed to identify whether a user is under 18; the task is to adopt a combination of methods appropriate to the risk that your service poses to users. In its Statement on Age Assurance, Ofcom provides examples of methods it considers capable of being highly effective and of methods that are not. The list is non-exhaustive, and to help services determine whether a method is highly effective, Ofcom sets out four criteria together with practical steps on how to fulfil them. There are two additional principles to comply with as well – accessibility and interoperability – which require that the method is easy to use and works for all users.

In setting out the procedure for its highly effective age assurance methods, Ofcom has worked with the ICO, given the overlap between the AADC and the OSA. The two regimes complement each other, and online services that comply with the AADC will have a significant opportunity to leverage and align their compliance efforts. We will provide more information on how to do this in the near future. Increasingly, the interplay between pieces of legislation is becoming apparent, and no piece can be looked at in isolation. In this area, AI legislation and best practice also need to be considered, with civil society and academics extremely active in calling for children to be given particular attention in the development of AI models.

Global adoption

The EU and UK are not alone in the global push to improve online safety: safety laws have also been enacted in Australia, Singapore and Sri Lanka, and states across the US are introducing various bills to increase the protection of children online. Reflecting this common objective, the Global Online Safety Regulators Network has been established, enabling regulators to share information and best practice whilst drawing on each other's expertise.

Practical considerations

Operationally, the nuances of different regulatory regimes create problems for companies with a global presence, which must settle on a strategy: adopt the highest compliance standard for all markets, or tailor compliance by jurisdiction. Governments and regulators are unquestionably prioritising this area, and global online services should continue to monitor developments in the markets where they operate as they meet their existing and imminent obligations.

Given the breadth of online services and the variety of international users, any solution to improve online safety is multifaceted and requires responsibility from all stakeholders. Technology providers have an important role to play, but online safety legislation itself is not a panacea. Education is essential in equipping children with media literacy and critical thinking, yet despite the increased use of technology in schools, the curriculum says little about how to use it safely. Parents and child users themselves equally have their part to play, and online services are increasingly involving children in their design to improve their experience.

This Safer Internet Day, the signs are good that the online experience can change for the better, but it requires a team effort, with everyone playing their part. Fieldfisher's upcoming Q&A panel event with Ofcom, The Online Safety Act: Key issues, practical tips and next steps, and its blogs will aim to help online services navigate these challenges and opportunities.

Fieldfisher actively supports pro bono initiatives in this area and Lorna Cropper is a member of the NSPCC (National Society for the Prevention of Cruelty to Children) Online Safety Taskforce.
