The Reuters Investigation: Profits from Fraudulent Ads
According to Reuters, internal documents suggest Meta has profited significantly from scam advertisements targeting consumers across the EU and beyond. The investigation reveals that Meta’s platforms served an estimated 15 billion high-risk scam ad impressions daily, including fake investment schemes, counterfeit goods, and illegal gambling offers. The same documents indicate Meta earned up to $7 billion annually from these ads, despite its public claims of aggressive anti-fraud measures. The investigation underscores a systemic issue: platforms monetising harmful content and exploiting the trust users place in social media, while consumers bear the risk. Meta has denied the allegations and stated that it actively combats fraudulent advertising.
Beyond reputational damage, such practices erode consumer confidence in digital advertising and expose businesses to legal and financial risks when their brands are misused in fraudulent campaigns.
This issue is not abstract for Malta. Local reports show that online scams have surged dramatically, with individuals and businesses suffering significant financial losses. Recently, deep-fake-driven scams hit Malta, using fabricated videos of politicians on Facebook to promote bogus investment platforms. In one case reported by local media, a victim transferred €53,000 of her life savings to such a scheme.
The intersection of global platform failures and local consumer vulnerability underscores the urgency of robust compliance and consumer protection measures.
The EU’s Regulatory Response: Digital Services Act in Action
The European Commission’s recent press release confirms that enforcement under the Digital Services Act (DSA) is no longer theoretical—it is happening. The DSA imposes strict obligations on Very Large Online Platforms (VLOPs) such as Meta and TikTok, including:
- Risk Assessments: Platforms must identify and mitigate systemic risks, such as the dissemination of illegal content and fraudulent advertising.
- Transparency Requirements: Clear reporting on content moderation and advertising practices.
- Accountability Measures: Independent audits and compliance reporting to ensure adherence to EU standards.
Failure to comply can result in fines of up to 6% of global annual turnover, alongside reputational harm and potential operational restrictions within the EU.
In its preliminary findings, the Commission found:
- Lack of user-friendly and easily accessible mechanisms for users to report illegal content
- The use of “dark patterns”, or deceptive interface designs, in such reporting mechanisms, making it confusing for users to submit reports
- Decision appeal mechanisms on Facebook and Instagram do not provide reasons for account suspensions or content removal
Over the past year, there has been a notable increase in the suspension of Facebook business accounts, with many such cases resulting from automated actions on Meta’s part, taken without clear explanations. This leaves businesses facing an immediate halt to advertising, e-commerce and client engagement.
Legal Implications
The allegations against Meta are not merely reputational; they carry potential civil and regulatory liability both for the platform and for businesses whose brands are exploited in fraudulent scams, or whose customers fall victim to them. Key legal implications include:
- Consumer Protection Breaches: Fraudulent ads may constitute unfair commercial practices under EU law, exposing platforms to liability for facilitating misleading content.
- IP Infringement: Businesses whose brands are misused in scam ads may pursue claims for trademark infringement and reputational harm.
- Financial Losses: Businesses may suffer financial harm if fraudulent ads impersonate their brand or divert customers.
- Reputational Damage: Association of the business with fraudulent ads can erode consumer trust.
- Contractual and Compliance Risks: If businesses advertise on platforms that fail to suspend fraudulent accounts, they may face contractual disputes over service quality.
- Potential Claims Against Platforms: Businesses harmed by a platform’s failure to implement effective notice-and-action and account suspension mechanisms may have grounds to bring claims against the platform.
Actual liability will depend on the specific circumstances, contractual terms, and applicable legal framework.
What this Means for Businesses
For companies leveraging online advertising, these developments are a wake-up call. While platforms face regulatory pressure, businesses must also exercise due diligence:
- Monitor Ad Placements: Ensure your brand is not associated with fraudulent or misleading campaigns.
- Implement Internal Controls: Adopt robust monitoring systems to detect misuse of intellectual property in digital ads.
- Secure Account Access: Restrict account access to authorised personnel and use strong authentication methods, such as multi-factor authentication, to prevent unauthorised logins.
- Cybersecurity Training: Equip staff with the knowledge to recognise phishing attempts, fraudulent ads, and suspicious activity related to online advertising and social media.
The Meta case illustrates a broader trend: regulators are determined to hold platforms accountable for systemic risks. For businesses, this is an opportunity to strengthen compliance frameworks and safeguard consumer trust.
How we can help
Our team of technology and data lawyers recognises the growing complexity of digital compliance and leverages its specialised knowledge in these fields to advise businesses on:
- DSA Compliance Strategies: Helping clients understand obligations and mitigate risks.
- Regulatory Audits: Assessing exposure to fraudulent advertising and platform liability.
- Brand Protection: Legal remedies against misuse of trademarks and intellectual property online.
Our recent achievements in lifting suspensions and restrictions on Facebook accounts for clients illustrate that viable solutions exist for businesses encountering challenges on online platforms.