The EU’s Digital Services Act (DSA), which regulates online intermediaries and platforms in the digital space, came into full effect in February 2024.
The purpose of the DSA is to make the online environment safer, fairer, and more transparent by imposing oversight and control obligations on companies whose platforms intermediate between content distributed online and its users. Its aim is to prevent the dissemination of illegal content, meaning any content that violates EU law or the laws of EU member states. Illegal content includes, inter alia, fraud, incitement to terrorism, child sexual abuse material, and even content involving copyright infringement.
The DSA applies to any company incorporated or operating in the European Union. More importantly, it also applies to any content made available to users located in the EU. In other words, the DSA effectively applies to any company distributing content online (unless access by European users is disabled).
Obligations under the DSA
The obligations imposed, and the degree of control required of such platforms, depend on their volume of activity and whether they retain the content or merely serve as a conduit:
- First Category: Intermediary Services
Companies transmitting or temporarily storing third-party content as a “conduit” without any involvement in the information and content. These companies must, inter alia, include explicit restrictions on illegal content in their terms of use (such as content management policies) and act responsibly when implementing and enforcing these restrictions.
- Second Category: Hosting Services
Companies providing data storage services, such as cloud services and online platforms. These companies must, inter alia, implement a mechanism enabling users to report allegedly illegal content and take action with regard to such reports.
- Third Category: Online Platforms
Companies operating online platforms that facilitate unrestricted interactions between sellers and consumers, such as application stores, marketplaces, and social networks. Such companies are subject to many additional obligations, such as reporting obligations to the European Commission about actions carried out in digital space, transparency in advertising, the obligation to implement an internal system for handling complaints about content management, etc.
- Fourth Category: Very Large Online Platforms and Search Engines
Companies operating platforms and search engines whose average monthly number of active users exceeds 10% of the EU population (currently, more than 45 million people). These companies are subject to more stringent obligations, such as analyzing any systemic risk deriving from the use of their platforms, implementing effective mitigation measures tailored to the systemic risks identified, and appointing a formal compliance function within the company to monitor its compliance with obligations under the DSA.
This means that the DSA essentially obligates companies, whether tech giants like Google and Meta or smaller companies offering website-building or online advertising services, to establish transparent content management policies. These policies must be communicated both to users and to customers seeking to share content through their platforms.
Enforcement Begins
The sanctions the European Commission and EU member states may impose on companies that violate the DSA are substantial and grave: fines can reach up to 6% of a company’s global annual turnover. It is important to note that DSA enforcement actions have already begun against major corporations, including X (formerly Twitter), TikTok, and, most recently, Meta (Facebook, Instagram, etc.), to examine their compliance with the provisions of the DSA.
Drafting a professional and comprehensive DSA compliance policy may well help prevent enforcement actions from being taken against such companies.