UK: The Online Safety Act is now law; Ofcom’s powers as online safety regulator have officially commenced

posted on 3 November 2023

Ofcom publishes implementation roadmap outlining its systemic approach

News item updated with link to consultation on illegal harms

On 26 October 2023, the Online Safety Bill, which aims to establish a new regulatory regime to address illegal and harmful content online in the UK, received Royal Assent (the final stage by which a bill becomes law in the UK) and thus became an Act of Parliament.

From that point onwards, Ofcom formally took on its role as the regulator for online safety. To that end, also on 26 October 2023, the UK communications regulator published a document setting out its regulatory approach and an implementation roadmap.


Three successive implementation phases: 

  • Phase 1: Illegal harm duties

9 November 2023: consultation on illegal harms – including child sexual abuse material, terrorist content and fraud. This will contain proposals for how services can comply with the illegal content safety duties, together with draft codes of practice setting out measures that regulated services can take to mitigate the risk of illegal harm.
  • Phase 2: Child safety, pornography, and protecting women and girls

December 2023: consultation setting out draft guidance for services that host pornographic content. Online pornography services and other interested stakeholders will be able to read and respond to Ofcom's draft guidance on age assurance.
Spring 2024: consultation on the child safety duties, including draft codes of practice on the protection of children.
Spring 2025: draft guidance on protecting women and girls.
  • Phase 3: Transparency, user empowerment, and other duties for categorised services

NB: A small proportion of regulated services will be designated Category 1, 2A or 2B services if they meet certain thresholds set out in secondary legislation to be made by the Government. The final stage of implementation focuses on additional requirements that fall only on these categorised services. Those requirements include duties to produce transparency reports, provide user empowerment tools, operate in line with their terms of service, protect certain types of journalistic content and prevent fraudulent advertising.
Spring 2024: advice to the Secretary of State regarding categorisation, and draft guidance on Ofcom's approach to transparency reporting. These duties – including the duties to publish transparency reports and to deploy user empowerment measures – apply to services that meet certain criteria related to their number of users or to high-risk features of their service.
End 2024: a register of categorised services (this will require prior secondary legislation setting the thresholds for categorisation).
Early 2025: draft code of practice on fraudulent advertising.
Mid-2025: transparency notices.
End 2025: final codes and guidance published.


A systemic approach:

The Online Safety Act 2023 has similar aims to those of the EU Digital Services Act (DSA): the Act makes companies that operate a wide range of online services (including user-to-user services, search services and pornography services) legally responsible for keeping people, especially children, safe online. The Act places duties on companies to properly assess and manage safety risks arising from content and conduct on their sites and apps (a duty of care). It does not expect harmful and illegal content to be eradicated online, but it does expect services to have suitable measures in place to keep adults and children in the UK safe, in proportion to the size of the risk and to the resources and capabilities available to them. A key aspect of these rules is that they seek changes in services' systems and processes.

In practice, this means that most service providers will need to:

  • carry out risk assessments;
  • take effective steps to manage and mitigate the risks identified by these assessments;
  • be clear in their terms of service about how users will be protected;
  • give users easy means to report illegal content and content harmful to children, and to complain; and
  • consider the importance of protecting freedom of expression and privacy when meeting their new duties.

The online safety regime will be funded by fees paid by providers of regulated services whose qualifying worldwide revenue meets or exceeds a certain threshold, and who are not otherwise exempt.


In implementing the Act, Ofcom will be guided by a systemic approach:

  • Ofcom's role is not to instruct firms to remove particular pieces of content or take down specific accounts, nor to investigate individual complaints, but to tackle the root causes of illegal content and content harmful to children by improving the systems and processes that services use to address them.

  • Seeking systemic improvements will reduce risk at scale, rather than focusing on individual instances.

For a comparison of the services and harms in scope of the UK Online Safety Bill with the Digital Services Act (DSA), the AVMSD and TERREG, see the CERRE report "Overlaps – Services and Harms in Scope: a comparison between recent initiatives targeting digital services" by Michèle Ledger and Sally Broughton Micova (November 2022).

Message for EPRA members: Ofcom will be delivering a webinar in December for any regulators who might wish to find out more about the Act and ask questions. If you are interested, please approach Jordan Ogg.