The European Union has intensified its scrutiny of Mark Zuckerberg's Meta, launching an investigation to ensure the tech giant bolsters its defenses against misinformation and foreign interference, particularly in the lead-up to pivotal EU elections, Breitbart reported.
The European Commission, the EU's executive branch, has targeted Meta, expressing concerns over insufficient safeguards on its Facebook, Instagram, and WhatsApp platforms. These worries encompass the spread of misleading ads, AI-generated deepfakes, and other deceptive content aimed at influencing political outcomes.
EU officials announced the inquiry on Tuesday and underscored their determination to compel Meta to adopt more robust measures to counteract malicious activities targeting the integrity of the upcoming European Parliament elections, slated for June 6 to June 9.
This investigation reflects the EU's stance on holding major tech firms accountable for perceived shortcomings in content moderation — an approach starkly different from the U.S., where free speech protections limit government intervention in online discourse. With the enactment of the Digital Services Act, European authorities now possess significant powers to scrutinize and penalize major platforms like Meta.
Ursula von der Leyen, President of the European Commission, emphasized the imperative for big digital platforms to fulfill their obligations, stating, "Today's decision shows that we are serious about compliance."
At the heart of the inquiry are concerns about whether Meta's content moderation systems can effectively detect and remove harmful content. Regulators cited findings from AI Forensics, a research organization, which uncovered a Russian disinformation network buying deceptive advertisements through fake accounts across Meta's platforms.
Additionally, officials allege Meta may be suppressing the visibility of certain political content, raising transparency concerns regarding content propagation.
While defending its policies and claiming proactive efforts to combat disinformation, Meta expressed willingness to cooperate with the European Commission.
"We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work," Meta said.
The inquiry represents the most recent action by EU regulators under the Digital Services Act, following analogous investigations into TikTok and X. Violators face fines of up to 6% of a company's worldwide revenue, and regulators have the authority to conduct office searches for evidence.
The Digital Services Act, which took full effect on Feb. 17, 2024, requires internet giants to provide users with information on content recommendations, offer opt-out options, and ensure transparency in advertising. The rules also aim to mitigate risks of election misinformation and manipulation, CNBC reported.
Meta falls under the DSA's Very Large Online Platform (VLOP) category, subjecting it to stricter controls and potentially heavier fines for non-compliance. The European Commission will continue gathering evidence from Meta, including requests for information and interviews, with further enforcement steps possible if necessary.
Jim Thomas
Jim Thomas is a writer based in Indiana. He holds a bachelor's degree in Political Science, a law degree from U.I.C. Law School, and has practiced law for more than 20 years.