What you need to know about the EU’s landmark digital content law

A new piece of legislation passed in the European Union, known as the Digital Services Act, requires digital companies to crack down on illegal and problematic content. Picture: Pexels

Published Feb 16, 2024


The EU's milestone legislation known as the Digital Services Act demands that digital companies crack down on illegal and problematic content.

Since August the law has applied to very large platforms -- those with more than 45 million monthly active users in the European Union -- and the world's biggest tech firms face major fines if they fall foul of it.

The mammoth law kicks in for all companies from Saturday, although with partial exemptions for the smallest firms.

The European Commission has already unleashed a wave of investigations to probe what digital giants have done to comply, with more actions expected.

Here are the regulation's key elements:

Rules for all platforms

Among their obligations, all platforms must quickly remove illegal content, or block access to it, as soon as they become aware of it.

They must also rapidly inform the authorities when they suspect a criminal offence that threatens people's lives or the safety of others.

Every year companies must publish a report that provides details about actions taken on content moderation and how long they took to respond after notification of illegal content. They will also report on the decisions taken in disputes with users.

The law tells platforms to suspend users who frequently share illegal content such as hate speech or fake ads, while online shopping sites must verify the identities of users and block repeat fraudsters.

And there are tougher rules on targeted advertising, with a ban on such ads for children aged 17 and under.

The EU also wants users to see how their data is used, and the law bans targeted advertising based on sensitive data, such as ethnicity, religion or sexual orientation.

The law's more onerous obligations do not apply to small companies -- defined as having fewer than 50 staff and a turnover of less than 10 million euros.

Extra rules for large platforms

The EU has named 22 "very large" platforms including Apple, Amazon, Facebook, Google, Instagram, Microsoft, Snapchat, TikTok and clothing retailer Zalando as well as three major adult websites.

Amazon and Zalando have launched legal challenges to their designations, while Meta and TikTok are challenging a fee to pay for enforcement.

These large platforms must assess the risks linked to their services with regard to the spread of illegal content and privacy infringements.

They must also set up structures internally to mitigate such risks, such as improved content moderation.

And the platforms must give regulators access to their data so officials can see whether they are complying with the rules.

This access will also be shared with approved researchers.

Firms will be audited once a year by independent organisations -- at their own expense -- to ensure compliance, and they must also establish an independent internal supervisor to monitor whether they are in line with the rules.

Complaints, penalties

The DSA aims to make it easier for users' complaints to be heard.

Users will be able to lodge a complaint claiming a platform is in violation of the DSA with their competent national authority.

Online shopping sites may be held responsible for any damage from products bought by users that are non-compliant or dangerous.

Violations can be met with fines that could go up to six percent of a company's global turnover, and for repeated non-compliance, the EU can even decide to ban offending platforms from Europe.

The commission will be able to sanction "very large" platforms.

EU, national coordination

Under the law, the EU's 27 member states must assign a competent authority with the powers to investigate and sanction any violation by smaller companies.

These authorities must work with each other and with the commission, the EU's executive arm, to enforce the regulation from February.

If a digital platform provider is located in one member state, that country must enforce the rules, except for very large platforms, which come under the commission's supervision.