Meta has removed a series of advertisements from its platforms that were placed by law firms seeking clients for potential lawsuits related to social media addiction, marking a new flashpoint in the ongoing legal and reputational battle surrounding its services.
The decision follows mounting legal pressure on the company. Meta recently lost two major cases in the United States, including a high-profile California trial in which a young woman successfully sued the company, claiming that social media contributed to her childhood addiction.
In response to the adverts, Meta drew a firm line on how its platforms can be used, saying: “We will not allow trial lawyers to profit from our platforms while simultaneously claiming they are harmful.”
Law firms affected by the move have pushed back, arguing that the decision reflects an attempt to shape public perception rather than address underlying concerns.
Emily Jeffcott, an attorney at Morgan & Morgan, described the action as “another example of Meta trying to control the narrative and avoid accountability”.
She added: “The resources Meta is devoting to blocking these ads would be better spent improving user safety through functional tools to reduce problematic use and to detect and remove users under age 13.

“Blocking the ads doesn’t make the harms go away. It just makes it harder on victims.”
According to reports, firms including Morgan & Morgan and Sokolove Law saw dozens of their adverts removed across platforms such as Facebook, Instagram, Threads, and Meta’s Audience Network. Despite the crackdown, some adverts remain visible in Meta’s Ad Library, including campaigns that highlight potential harms of social media use while promoting legal action.
Meta’s advertising policies give the company broad discretion to remove content that could harm its relationship with users or conflict with its business interests — a provision that appears to underpin this latest move.
The backdrop to this decision is a shifting legal landscape. Courts in the United States are increasingly willing to scrutinise the impact of social media on younger users. In March 2026, a New Mexico court ordered Meta to pay $375 million for misleading users about the safety of its platforms for children, citing exposure to explicit content and contact with predators.
In a separate California case, a jury awarded $6 million in damages to a woman over social media addiction claims, with Meta expected to cover the majority of that sum. Other companies initially named in the lawsuit, including Snap and TikTok, reached settlements before the case went to trial.
Meta has said it disagrees with both rulings and intends to appeal them.
The clash raises a broader question for the industry: when platforms host both the alleged harm and the legal response to it, who controls the narrative — and where should the boundary be drawn?
Author: George Nathan Dulnuan
