The Digital Complication

The Silencing

Interactive exhibit — Who controls the megaphone

On 25 March 2026, a Los Angeles jury found Meta and YouTube liable for intentionally building addictive platforms that harmed a young woman's mental health. She was awarded $6 million.

On 9 April 2026, Meta began removing advertisements from law firms seeking to represent other victims of social media addiction. The ads ran on Facebook, Instagram, Threads, and Messenger. The platforms that caused the harm silenced the people trying to fix it.

This is not new. This is how power has always worked.

Source: Axios, 9 April 2026. Dan Primack. Kaley v. Meta Platforms Inc. & Google LLC, Los Angeles Superior Court, March 2026.
01
The Statement
Meta spokesperson, 9 April 2026
"We will not allow trial lawyers to profit from our platforms while simultaneously claiming they are harmful."
Meta Platforms Inc., via Axios

Read it again slowly.

They are not denying the harm. They are not disputing the verdict. They are not arguing that their platforms are safe for children. A jury already decided that.

They are saying: if you accuse us, you don't get to use our megaphone.

The platform that a jury found liable for addicting a child is now deciding which lawyers are allowed to reach other children's parents. Not a court. Not a regulator. The defendant.

02
Fifteen Days
From verdict to silencing
25 Mar 2026
Verdict. A Los Angeles jury finds Meta and YouTube liable for intentionally building addictive platforms. A young woman who started using Instagram at age nine is awarded $6 million.
26 Mar - 8 Apr
Recruitment. Law firms across the country begin running ads on Facebook, Instagram, Threads, and Messenger, seeking clients who were harmed by social media as minors, among them major firms such as Morgan & Morgan and Sokolove Law.
9 Apr 2026
Silencing. Meta deactivates more than a dozen such advertisements. The ads were running on the same platforms the jury found caused the harm. Meta took the advertising revenue, then pulled the ads.

Fifteen days. That's how long it took to go from "found liable" to "controlling who's allowed to talk about it."

03
The Clause
Written by Meta. For Meta.
Meta Terms of Service
"We also can remove or restrict access to content, features, services, or information if we determine that doing so is reasonably necessary to avoid or mitigate adverse legal or regulatory impacts to Meta."

This is not a content policy. It is not about misinformation, or hate speech, or child safety. It is a self-preservation clause.

They wrote the terms. They own the platform. They decide what "adverse" means. And when a jury found them liable for harming children, they used their own terms to remove the ads that might lead to more verdicts against them.

There is no equivalent restriction in their advertising standards. They didn't need one. The terms of service sit above everything, and they wrote those too.

04
The Megaphone
Try speaking into it

Type a message below. Any message. See what happens to it.
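The mechanic behind this exhibit can be sketched in a few lines: the platform, by its own criteria, decides whether a message is ever heard. The term list and function names below are hypothetical illustrations of that logic, not Meta's actual moderation code.

```typescript
// Illustrative sketch of the megaphone exhibit's mechanic. The "adverse"
// term list is invented for demonstration; real platform rules are opaque.
const ADVERSE_TERMS = ["lawsuit", "liable", "addiction", "plaintiff"];

function broadcast(message: string): string | null {
  const lowered = message.toLowerCase();
  // "Reasonably necessary to avoid adverse legal impacts": any message
  // touching the platform's own liability is silently dropped.
  const adverse = ADVERSE_TERMS.some((term) => lowered.includes(term));
  return adverse ? null : message;
}

console.log(broadcast("Happy birthday, Grandma!"));
console.log(broadcast("Were you harmed by social media addiction?"));
```

The second message returns `null`: no error, no notice, no appeal. The sender never learns it was dropped, which is the point the exhibit makes.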

05
The Pattern
The mechanism changes. The pattern doesn't.

Meta is the freshest example. It is not the first. Whoever controls the channel of communication controls which truths get heard. The technology changes. The instinct doesn't.

Print
The Editor
Stories that threaten advertisers get buried on page 34. Investigations get spiked. Uncomfortable truths wait until the timing is convenient for someone other than the people who need to hear them.
The story existed. The front page didn't carry it.
Broadcast
The Producer
The satellite feed drops at convenient moments. The interview gets cut short. The follow-up question doesn't air. The panel is stacked. The segment runs at 11pm when nobody is watching.
The interview happened. The broadcast didn't carry it.
Institutional
The Inquiry
The Post Office knew its Horizon software was convicting innocent people. Subpostmasters went to prison. Families were destroyed. The truth was available. The institution that held it chose silence for over a decade.
The evidence existed. The institution didn't release it.
Regulatory
The Regulator
The watchdog funded by the industry it oversees. Staffed by people who used to work there. Investigations that take years. Fines that amount to a rounding error on a quarterly earnings call. Reports published long after the damage is done and the executives have moved on.
The regulator existed. The industry it watched set the pace.
Platform
The Algorithm
Meta removed the lawyers' ads from the platforms the jury said caused the harm. They didn't need a phone call to an editor. They didn't need a D-notice. They wrote it into the terms of service and let the system do it at scale.
The ads existed. The platform decided you couldn't see them.
The difference
The Scale
A newspaper editor could bury one story. A TV producer could cut one interview. A government could classify one document. A platform can silence a message across 3 billion accounts in a single afternoon.
Same instinct. Industrial capacity.
06
The Volume
Control the channels. Control the truth.

Every channel of information has a volume knob. Someone is always holding it. Drag the sliders and watch what happens to the headline below.

Social reach: 100
Press coverage: 100
Legal access
Public record: 100
Meta and YouTube found liable for intentionally building addictive platforms that harmed children. Jury awards $6 million. Law firms seek additional plaintiffs.
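The slider mechanic can be sketched as a simple model: each channel's level (0 to 100) scales how much of the headline still reaches an audience. The weighting below is an illustrative assumption, not a real reach model, and all names are invented for this sketch.

```typescript
// Illustrative sketch of the volume-slider exhibit. Assumption: the loudest
// remaining channel sets what can still be heard, so silencing one channel
// does nothing while another stays open.
interface Channels {
  socialReach: number;
  pressCoverage: number;
  legalAccess: number;
  publicRecord: number;
}

function audibleFraction(ch: Channels): number {
  const levels = [ch.socialReach, ch.pressCoverage, ch.legalAccess, ch.publicRecord];
  return Math.max(...levels) / 100;
}

function renderHeadline(headline: string, ch: Channels): string {
  const words = headline.split(" ");
  const visible = Math.round(words.length * audibleFraction(ch));
  // Words past the audible fraction are redacted, not deleted: the truth
  // still exists, you just never see it.
  return words
    .map((w, i) => (i < visible ? w : "█".repeat(w.length)))
    .join(" ");
}
```

With every slider at 100 the headline reads in full; drag them all to zero and only redaction blocks remain, which is exactly the quiet removal the next section describes.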
07
The Truth

Silencing isn't a red stamp on a document. It isn't a man in a suit saying "you can't print that." It isn't dramatic. It isn't cinematic.

It's the quiet removal of something you never knew was there.

It's the ad that doesn't appear. The story that runs on page 34. The inquiry that takes twelve years. The document released after everyone who could be held accountable is dead. The algorithm that simply stops showing it to you.

You can't be outraged by something you never saw. That's the point.

The most effective silencing doesn't feel like silencing. It feels like nothing happened at all.

Previous
The Addiction Factory
They built the machine. They knew what it would do.
Next
I Invited Nobody
The proof. It was never AI. It was the humans behind it.