On 27 March 2026, I created a Facebook business page. I posted some work. I did not invite a single person. I did not share it to my personal profile. I did not follow anyone. I did not pay for promotion. I told no one it existed.
The algorithm delivered views anyway.
This page documents what happened. It updates over time with real data. If Facebook's algorithm actively pushes a small web design page from Ramsbottom to strangers who never asked for it, the question is simple: what does it do to a teenager?
My page has zero followers. I invited nobody. I spent nothing. The content is unremarkable: a small business sharing its work. There is no reason for anyone to see it.
But the algorithm showed it to people anyway. It decided who should see it. Not them. Not me. It.
Now consider this: the same algorithm manages what roughly 3 billion people see every day. It optimises for engagement, meaning time on platform. It doesn't distinguish between a web design portfolio and content that triggers eating disorders, self-harm, or radicalisation. It only knows what keeps people scrolling.
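To make that point concrete, here is a minimal sketch of a content-blind engagement ranker. This is an illustration, not Meta's actual system: the `Post` fields, the weights, and the scoring function are all hypothetical. The point it demonstrates is structural: if the only inputs are engagement predictions, the ranker literally cannot tell a portfolio from harmful content.

```python
# A minimal sketch of an engagement-only ranker. Hypothetical fields
# and weights -- NOT Meta's real system. The ranker never reads what
# a post is about; it only sees predicted engagement numbers.
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    topic: str                      # present, but never used in scoring
    predicted_watch_seconds: float  # predicted time on platform
    predicted_interactions: int     # predicted likes, comments, shares

def engagement_score(post: Post) -> float:
    # Hypothetical weighting: the score is built entirely from
    # engagement predictions, never from the content itself.
    return post.predicted_watch_seconds + 5.0 * post.predicted_interactions

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by score. Two posts with equal predicted engagement
    # are interchangeable, whatever their subject matter.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under this sketch, a post about anything at all outranks a harmless one the moment its engagement predictions are higher; the `topic` field changes nothing.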
In March 2026, a Los Angeles jury found Meta and YouTube liable for intentionally building platforms that addicted a young woman who started using Instagram aged nine. She used it for 16 hours in a single day. The head of Instagram called that "problematic".
I invited nobody. They came anyway. She was nine. The algorithm came for her too.