EU Investigates Facebook’s and Instagram’s Handling of Disinformation Ahead of Elections
The European Commission has initiated an investigation into Meta Platforms, the parent company of Facebook and Instagram, over alleged failures to curb disinformation and deceptive advertising in the lead-up to the European Parliament elections. The move follows concerns regarding potential sources of disinformation both within and outside the EU.
According to a report by Reuters, EU tech regulators have raised alarms over the proliferation of misleading information — not only from external actors like Russia, China and Iran, but also from political parties and organizations within the EU. These concerns have prompted the European Commission to take action amid preparations for the upcoming elections scheduled for June 6-9.
The investigation is rooted in suspicions that Meta Platforms may be in breach of EU online content rules, particularly the Digital Services Act (DSA), which came into effect last year. Under the DSA, major tech companies are obligated to take more robust measures to combat illegal and harmful content on their platforms, with potential fines reaching up to 6% of their global annual turnover.
One focal point of the probe will be the activities of a Russia-based influence operation network known as Doppelganger, which was previously exposed by Meta in 2022. People familiar with the matter, as cited by Reuters, state that the EU investigation aims to assess Meta’s compliance with DSA obligations, particularly regarding the dissemination of deceptive advertisements, disinformation campaigns and coordinated inauthentic behavior within the EU.
In response to the investigation, Margrethe Vestager, the EU’s digital chief, voiced concerns about Meta’s practices, stating: “We suspect that Meta’s moderation is insufficient, that it lacks transparency of advertisements and content moderation procedures.”
Meta Platforms, with over 250 million monthly active users in the European Union, defended its approach to risk mitigation, asserting that it has a well-established process for identifying and addressing risks on its platforms. A spokesperson for Meta emphasized the company’s commitment to cooperating with the European Commission and providing further details of its efforts to mitigate risks.
The Commission’s investigation signals a concerted effort to ensure that tech giants like Meta comply with EU regulations aimed at safeguarding the integrity of elections and combating the spread of misinformation and deceptive advertising online.
Source: Reuters