The European Union is on the verge of rewriting the rules of the road governing how search engines, online marketplaces, social media and other web platforms find and remove illegal content.
The Digital Services Act (DSA) has one overarching principle, according to a European Council press release: “What is illegal offline should also be illegal online.”
This is a wide net, covering everything from child sexual abuse images and terrorist content to the sale of dangerous goods and copyright violations. New age verification requirements and parental controls to protect children are in the mix, too.
The legislation is a companion to the Digital Markets Act (DMA) overseeing the collection and use of personal data by tech giants. While the DSA does impose special responsibilities on the likes of Google, Facebook, Apple and Amazon, it is far broader, impacting any company — such as an internet service provider (ISP) — that is an intermediary between web users and the content they seek.
Like the DMA, the DSA has sharp teeth. Financial penalties for the “very large online platforms” used by 10% or more of the EU’s citizens can be as high as 6% of global revenue. But, while the law is more stringent for large platforms, it applies to all but the smallest.
However, its path has not been smooth, and various interest groups are still fighting for changes.
Safe Harbor
As ever, the biggest battle is over platforms’ responsibility for removing illegal and copyright-infringing content.
While ISPs and content platforms will remain free from liability for content posted by users, the speed with which they must act is likely to change. Depending on who you ask, the new pace is either far too fast or far too slow.
According to the open-web focused Electronic Frontier Foundation (EFF), the standard could shift from the current requirement that infringing content be removed expeditiously once a platform is made aware of a violation to a de facto 24-hour deadline for platforms that want to keep their liability limited. That window is so short, the EFF said, that it would force platforms to rubber-stamp every infringement complaint using automated filters.
Unsurprisingly, the International Federation of the Phonographic Industry (IFPI), a recording and content industry umbrella organization, sees things differently. The group is particularly worried about how the DSA treats search engines, which it fears will become "beneficiaries of a broad and unjustified 'safe harbor.'" That, the IFPI said in a statement, "would remove all incentives for search engines to stop enabling access to illegal or harmful content — and make money on the back of such activity."
More Disclosures
Despite these fights, the DSA legislation, at this point, also lets in more light. Notably, the largest platforms will be required to disclose how content is moderated and in what languages. They will also have to publish information about the algorithms behind their recommendations and give researchers access to data on how those algorithms work.
Platforms would also have to tell content creators when they have limited the visibility of a post or suspended the monetary payments it generates. This would be particularly important for ad-supported creators on YouTube, TikTok and other social media sites.
The EU also said the DSA will make it easier for users to challenge the big platforms’ content moderation decisions and force large commerce platforms to vet third-party sellers more carefully.
The responsibility to notify authorities of suspected serious criminal offenses will be expanded from online platforms to all hosting services. The DSA will also shift enforcement duties for the largest platforms from national regulators to the European Commission, while leaving the rest to local enforcement authorities.