Online Safety Code launched by Coimisiún na Meán

Niamh Hodnett, the Online Safety Commissioner, launched the Code by stating that "the adoption of the Online Safety Code brings an end to the era of social media self-regulation."

Background

The adoption of the Online Safety Code (the "Code") on 21 October 2024 marks another milestone for Ireland's media regulator, Coimisiún na Meán (the "Commission"), and for online safety regulation in Ireland. It forms part of Ireland's Online Safety Framework, which comprises the Online Safety and Media Regulation Act 2022 (the "2022 Act"), which amends the Broadcasting Act 2009 (the "2009 Act"), the EU Digital Services Act ("DSA") and the EU Terrorist Content Online Regulation ("TCOR").

On 29 February 2024, the EU Court of Justice fined Ireland €2.5m for failing to transpose the Audiovisual Media Services Directive ("AVMD") on time, together with a daily penalty of €10,000 for as long as the infringement continues. The adoption of the Code gives effect to Article 28b of the AVMD and means that Ireland has now fully transposed the AVMD into national law.

The statutory basis for the creation of the Code stems from Section 139K of the 2009 Act, as amended by the 2022 Act, which provides that the Commission may make Online Safety Codes to be applied to designated Video Sharing Platform Services ("VSPS") in accordance with Section 139L of the 2009 Act, as amended by the 2022 Act. The formation of the Code has been a key focus of the Online Safety Commissioner, Niamh Hodnett, since she commenced her role in March 2023.

The adoption of the Code follows a lengthy process of engagement carried out by the Commission, involving specific evidence-gathering activities and consultation processes as required by Section 139M and Section 139 of the 2009 Act, as amended by the 2022 Act. This included a public open call in July 2023 for input on the development of the Code and the harms it should address, which received 55 responses from the public and interested stakeholders. The Commission then instructed PA Group in September 2023 to undertake a literature review of the available evidence on online harms on VSPSs, before instructing IPSOS B&A in November 2023 to undertake two surveys on its behalf pertaining to, inter alia, the potential harms caused by video content. In December 2023, the Commission published a consultation document on a draft Code, which received 1,398 submissions from the public and relevant stakeholders.

A revised draft Code was published on 27 May 2024, following consultation with the Commission's Youth Advisory Committee, and was then notified to the European Commission through the Technical Regulation Information System ("TRIS") in accordance with EU Directive 2015/1535. The TRIS Directive requires Member States to notify draft technical regulations concerning information society services to the European Commission before those regulations can be adopted into national law. No comments on the Code were received from the European Commission or other EU Member States as part of this process, which allowed the Commission to adopt the Code.

Who does the Code apply to?

On 16 January 2024, after consultation with the service providers pursuant to Section 139H of the 2009 Act, the Commission published its statutory register of VSPSs to which the Codes may be applied. The ten VSPSs designated by the Commission to date are:

  • Facebook;
  • Instagram;
  • YouTube;
  • Udemy;
  • TikTok;
  • LinkedIn;
  • X (formerly Twitter);
  • Pinterest;
  • Tumblr; and
  • Reddit.

Tumblr and Reddit brought separate, unsuccessful High Court challenges earlier this year to the Commission's decision to designate them as VSPSs. Following the High Court's rulings, the Commission was entitled to retain both Tumblr and Reddit on its statutory register. You can read more about these challenges and the overall outcome in our previous blog here.

For the Code to apply to a designated VSPS, a determination must be made by the Commission and notice provided to the designated VSPS in accordance with Section 139L of the 2009 Act. On 17 October 2024, the Commission issued Notices of Determination to nine of the ten designated VSPSs in accordance with Section 139L of the 2009 Act. The Notices set out that the deadline for implementation of Part A of the Code is 19 November 2024 and the deadline for Part B of the Code is 21 July 2025. Notably, the Commission has not yet made a determination in respect of Reddit.

What is in the Code?

The Code itself is split into two parts. Part A sets out the obligations arising from Section 139K of the 2009 Act and includes the measures that VSPS providers must take to protect the general public and children. These obligations apply on a binding basis to VSPSs, depending on the size and nature of the service.

Part B of the Code makes provision for more specific obligations of VSPS providers and sets out the appropriate measures that they must take to provide the protections for children and the general public.

Part A

The purpose of the Code is to ensure that VSPSs do the following:

  • take measures that are appropriate to provide certain protections, particularly in respect of children;
  • comply with certain requirements regarding audiovisual commercial communications that are marketed, sold or arranged by them; and
  • take appropriate measures to comply with certain requirements regarding audiovisual commercial communications that are not marketed, sold or arranged by them.

In carrying out its functions, the Commission must act in accordance with the following:

  • various statutory objectives and functions;
  • its public law duties to act lawfully, rationally and fairly;
  • the Constitution;
  • the Charter of Fundamental Rights of the EU; and
  • the European Convention on Human Rights.

In doing so, the Commission must endeavour to ensure that the democratic values enshrined in the Constitution, especially those relating to rightful liberty of expression, are upheld, and that the interests of the public, including the interests of children, are protected. In this regard, particular emphasis is placed on the Commission's commitment to the safety of children.

The Code sets out that the Commission must, pursuant to Section 7(2)(d) of the Act, endeavour to ensure that its regulatory arrangements do the following:

  • address programme material, user-generated content and other content which is harmful or illegal;
  • take account of technological and societal change; and
  • operate proportionately, consistently and fairly.

Part A of the Code also restates certain key aspects of compliance and enforcement under Section 139Q of the Act, in particular that a failure by a VSPS to comply with the Code shall be a contravention for the purposes of Part 8B of the Act.

However, the Code also provides that, in considering whether a breach by a VSPS has occurred, the Commission must take into account whether an obligation, or, in the particular circumstances, compliance with an obligation, under the Code would not be practicable or proportionate in its application to the VSPS, taking into account the size of the VSPS and the nature of the service provided.

The Code also obliges VSPSs to have systems and controls in place to demonstrate compliance with the obligations contained in the Code.

Some of the most important aspects of the Code are the restatement of the specific protective measures that VSPSs must take, including:

  • Protecting children from content which may impair their physical, mental or moral development.
  • Protecting against incitement to violence or hatred directed against a group of persons or a member of a group based on:
    • sex;
    • race;
    • colour;
    • ethnic or social origin;
    • genetic features;
    • language;
    • religion or other belief;
    • political opinion;
    • membership of a national minority;
    • property;
    • birth;
    • disability;
    • age; or
    • sexual orientation.
  • Protecting against EU criminal content (child sexual abuse material, terrorist content, and racism and xenophobia amounting to incitement to hatred or violence).
  • Protecting against harmful and illegal audiovisual commercial communications.

Part B

In Part B of the Code, there is a particular focus on the appropriate measures that VSPSs must take to provide protections for children. For example, VSPSs are required to implement age assurance measures to ensure that adult-only video content, such as pornography, cannot normally be seen by children, meaning that self-declaration of age alone will no longer be an effective means of compliance. There is also a requirement for VSPSs which allow the uploading or sharing of adult-only video content to implement a content rating mechanism that allows users to rate the content as not suitable for children.

Parents and guardians will now have much more of a say in how their children are protected. VSPSs which permit users under the age of 16 must provide parental controls that shall, at a minimum, enable parents or guardians to restrict the viewing of video content uploaded or shared by those unknown to the child and to set time limits on viewing content. VSPSs must also ensure that audiovisual commercial communications, which include television advertising, sponsorship and teleshopping, do not exploit children. Additionally, Section 17.2 of the Code provides that personal data of children collected when implementing measures such as parental controls or age verification shall not be processed for commercial purposes.

Transparency is also a key theme of the Code. VSPSs must establish transparent, user-friendly mechanisms for users to report or flag content or communications that are contrary to the Code (Section 15) and must put in place effective procedures for the handling and resolution of complaints made by users in relation to the implementation of measures relating to age assurance, content rating, parental controls and reporting and flagging (Section 16).

What is not addressed in the Code?

Recommender systems, also known as algorithms, are used by VSPSs to determine what content is shown to their users. The Commission's draft Code published in December 2023 included a requirement for social media companies to turn off recommender systems based on profiling by default. However, neither the revised draft Code nor the adopted Code addresses recommender systems. When the revised draft Code was released in May, the Irish Council for Civil Liberties ("ICCL") issued a statement criticising the Commission's "dangerous U-turn", stating that "Recommender systems push hate and extremism into people's feeds and inject content that glorifies self-harm and suicide into children's feeds". This was a view shared by CyberSafeKids. The Commission has stated that it recognises the harmful impact that recommender systems can have on users, particularly children, but that "the proposal to have recommender system safety plans was consulted on as a supplementary measure and was not intended to be in the first Code." The Commission also pointed to the specific provisions in the DSA that bind online platforms in respect of risk assessment, transparency and content moderation. For example, where an online platform uses a recommender system, such as an algorithm to recommend content or products to users, it must set this out in its terms and conditions and include information on any options for users to influence or modify the parameters used.

While disinformation is not specifically covered by the Code, each VSPS must publish, and update annually, an action plan specifying the measures it will take to promote media literacy. The Commission's thinking is that increased media literacy will assist users in recognising disinformation. The Commission is also relying on the obligations in the DSA that require larger platforms to assess the risks their services present in areas such as electoral integrity and public health, which will help to address the impact of disinformation and misinformation.

No time period is specified for reviewing the Code; however, the Commission has said that the Code may need to be reviewed as developments in VSPSs and consumer trends arise.

Enforcement of the Code

Enforcement begins with supervision. The Code sets out that the Commission will actively supervise VSPSs to ensure that they live up to their obligations under the Code, in a similar way to its current supervision under the Online Safety Framework. If the Commission identifies any issues of concern, it has indicated that it will investigate them and, if necessary, proceed with enforcement measures.

The Commission has established a Contact Centre which can receive queries and complaints related to its powers under the Online Safety Framework. The Commission will consider queries and complaints with a view to either resolving them through supervisory dialogue or, if necessary, escalating the matter to its investigation and enforcement team.

Importantly, the Code sets binding rules which hold platforms accountable for how they keep their users safe from harmful video content, backed by strong sanctioning powers and fines of up to €20 million or 10% of a platform's annual turnover, whichever is greater.
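
By way of illustration only, the short sketch below (in Python) shows how the "whichever is greater" cap scales with a platform's size; the turnover figures used are hypothetical and do not relate to any actual VSPS.

    # Illustrative sketch only: under the Code, the maximum fine is the greater of
    # EUR 20 million or 10% of the platform's annual turnover.
    def maximum_fine(annual_turnover_eur: float) -> float:
        """Return the cap on fines: the greater of EUR 20m and 10% of annual turnover."""
        return max(20_000_000.0, 0.10 * annual_turnover_eur)

    # Hypothetical turnover figures, chosen purely for illustration.
    for turnover in (50_000_000, 500_000_000, 5_000_000_000):
        print(f"Turnover EUR {turnover:,}: maximum fine EUR {maximum_fine(turnover):,.0f}")

On these hypothetical figures, the €20 million floor governs only the smallest platform; for the larger two, the 10% of turnover limb produces the higher cap.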

The Code is an important next step in strengthening protections in the online space, particularly for children. The potential penalties and sanctions which may be imposed on VSPSs demonstrate the need for them to act proactively and operate in accordance with the Code, both to avoid the risk of reputational damage and fines and to ensure that the online environment becomes a safer space for everyone.

What next?

Nineteen months after its creation, the Commission, which is also Ireland's Digital Services Coordinator under the DSA, now has its Code and legislative framework in place. The next stage of regulation in the online sphere commences…

The Code is available to view in full here.

If you require any advice or assistance, please do not hesitate to contact Sinéad Taaffe, Damien Watson, or Tom Clarke of this office.

Written by: Sinéad Taaffe, Damien Watson and Tom Clarke

Areas of Expertise

Public and Regulatory