The Evolution of the UK Online Safety Bill: What’s Next?

Latham & Watkins LLP

The bill has been introduced into the UK’s Parliament with various amendments to the initial draft published in May 2021, reflecting the extensive feedback received and the challenges in reaching a consensus.

In March 2022, the UK government formally introduced the amended Online Safety Bill into Parliament (the Bill). The Bill features a number of substantial amendments to the government’s initial draft of the Online Safety Bill published in May 2021 (the Draft Bill), as explored below. For background on the broader development of the Online Safety Bill, see Latham & Watkins’ blog series, including a post summarising the Draft Bill.

Whilst the overall structure of the Bill does not materially diverge from the Draft Bill, the UK government has introduced a number of amendments to its scope. The Bill would apply to “user-to-user services” and “search services” that are linked to the UK (i.e., services that either (i) have a significant number of users in, or are targeted towards users in, the UK; or (ii) are capable of being used in the UK and give rise to a material risk of significant harm to individuals in the UK). The Bill would impose a number of statutory duties of care and other obligations on the providers of regulated services, broadly to protect users from illegal content, and in certain circumstances from harmful content, generated and shared by other users. These obligations would require providers to implement adequate processes to protect users from such content and to moderate the service to detect certain specific, defined content.

While the Bill maintains the broad principles of the Draft Bill, the government has introduced a number of material amendments, as set out below.

  • Fraudulent Advertising: The Bill would require Category 1 and Category 2A[1] services to implement proportionate systems and processes designed to prevent users from encountering fraudulent adverts on the service; to minimise the length of time such fraudulent advertising is present on the service; and to swiftly take down such content once alerted to its presence. Service providers would also need to make publicly available clear and accessible information about any proactive technology (i.e., content moderation technology, user profiling technology, and behaviour identification technology) used for these purposes. In addition, “paid-for advertising” has been removed from the definition of user-generated content in the Bill to ensure that any paid promotions that use user-generated content (e.g., a boosted post) are still subject to the Bill’s safety duties.
  • User Verification and User Empowerment: The Bill would require Category 1[2] service providers to offer all adult users of the service the option to verify their identity. Verification could not be made a condition of access to the service; rather, the aim is to give adults the option not to interact with unverified users. The Bill would also require Category 1 service providers to implement proportionate features that enable adult users to increase their control over harmful content.
  • Faster Enforcement: The Bill introduces a number of offences that the Draft Bill deferred to secondary legislation, including offences for incitement to and threats of violence, hate crime, harassment, and stalking. With these offences included in the Bill itself, service providers would have clarity about their obligations from the outset, and Ofcom would be empowered to take enforcement action against companies that fail to remove the named illegal content earlier in the implementation process, rather than waiting for such offences to be listed in secondary legislation. Service providers would be required to assess their services for the risk of illegal content, implement systems and processes to mitigate the identified risks, and protect users at an earlier date than would have been possible under the Draft Bill. In particular, criminal sanctions for failure to comply with Ofcom information notices, including senior manager liability, would be introduced once the Bill receives Royal Assent, instead of being deferred to secondary legislation.
  • Proactive Technology: Under the Bill, Ofcom would be able to recommend “proactive technology” and set out its expectations for the use of such technology in codes of practice. This amendment aims to promote the use of accurate and effective tools that meet the standards Ofcom requires for addressing online harm.
  • Harmful Content: The Bill would amend the definition of content that is harmful to adults; all categories of such content would need to be approved by the UK Parliament in secondary legislation. Category 1 services would still be required to report emerging harms to the regulator and, if appropriate, such harms would then be added to the definition via secondary legislation.

Next Steps

The UK’s new Prime Minister, Liz Truss, confirmed in a statement to Parliament on 7 September that her government will proceed with the Bill, albeit potentially with “some tweaks”. As the Bill progresses through the UK legislative process, it will be subject to further debate and amendment. Given the comprehensive nature of the Bill and the novel legal issues it seeks to address, the government faces a potentially challenging and lengthy process to reach a consensus. As a result, the timeline to enactment is not yet clear, though the legislative activity leading up to the introduction of the Bill into Parliament, together with the Prime Minister’s confirmation of the Bill earlier this month, indicates a revival of interest in ensuring its enactment.

The Bill still leaves a number of aspects to be detailed in secondary legislation and/or Ofcom codes of practice, including the threshold conditions for categorising providers into Category 1, 2A, or 2B; the scope of regulated harmful content; and the detail of how Ofcom expects providers to comply with their duties of care and other obligations in practice. Therefore, once the Bill itself is enacted, the relevant regulations and codes of practice would need to be produced before any enforcement activity is seen. The publication of the Ofcom codes of practice would ultimately bring clarity for organisations on the compliance measures expected of them in practice, and on the practical implications and risks of the Bill.

EU Context

In parallel to the UK government’s work on the Bill, EU legislative bodies have reached political agreement on the EU Digital Services Act (the DSA). The DSA seeks to regulate various aspects of online platforms, including liability and responsibility for user content. However, the DSA diverges from the Bill in its focus on illegal content as opposed to legal but harmful content. In relation to illegal content, the DSA will require platform providers (including providers of user-to-user services regulated by the Bill) to implement notice and take-down mechanisms to facilitate the removal of such illegal content. The DSA does not specifically define or directly regulate harmful content. Instead, it will require the largest online platforms (those classified as “very large online platforms” under the DSA) to identify, assess, and mitigate systemic risks relating to their platforms, including risks around the dissemination of illegal content, negative effects on children, and serious negative consequences to a platform user’s physical and mental well-being.

In addition to the DSA, the European Commission has proposed a regulation to prevent and combat child sexual abuse online (the Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse, COM/2022/209). The proposal includes obligations on platforms to assess, mitigate, and report the risks of their platforms being misused for the purposes of child sexual abuse, as well as obligations to detect and remove such infringing material if they receive a “Detection Order” or “Removal Order” from the relevant authorities. This regulation is at an early stage of the EU legislative process and will be subject to further debate and potential amendment before coming into force.

Organisations potentially impacted by both the UK and the EU online safety regimes should start to assess their platforms and services against the emerging requirements, in preparation for the final legislation and more detailed guidance on compliance expectations to be published in due course.

Latham & Watkins will continue to monitor developments in this area.

Endnotes

[1] The exact threshold conditions for Category 1, Category 2A, and Category 2B will be determined by Ofcom “as soon as reasonably practicable” after the first sections of the Bill come into force (see Section 81 of the Bill).

[2] As above.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Latham & Watkins LLP | Attorney Advertising

Written by:

Latham & Watkins LLP
