Why this exists
Every day, billions of people open a screen and are immediately hit with a wall of noise. Progress bars that go nowhere. Metrics designed to look important. Notifications engineered to create urgency where there is none. And underneath all of it, an endless stream of content that exists for one purpose: to get you to click.
It isn't accidental. It's by design.
Sales teams, bad actors, governments, corporations. They all have something in common. They've worked out that attention is the most valuable commodity on earth, and they will do whatever it takes to capture yours. Clickbait headlines, manufactured outrage, fake urgency, dark patterns, engagement bait. The internet was supposed to be the greatest library ever built. Instead, we filled it with noise.
The tragedy isn't that the information is worthless. It's that it's become nearly impossible to tell the difference between what matters and what doesn't.
There's a word for this. Enshittification. The deliberate, systematic degradation of something that used to be good. Make it useful first. Get people dependent on it. Then slowly, quietly, make it worse. Charge them to get back what they had for free. Buy any alternative so there's nowhere else to go. Got a favourite website? Fill it with pop-ups. Enjoying a video? Ad break. Bought an expensive phone? Here's an update to make it slower. Got a thousand photos of your mum? That's cloud storage now. Pay up.
It's not a conspiracy theory. It's a business model. And it's everywhere.
The Forbrukerrådet (the Norwegian Consumer Council) made a short film that captures this better than I ever could. It should matter to everyone, everywhere, always. They're also behind Breaking Free, a campaign fighting back against exactly this.
And then there are the people who still remember what the internet was for. Mozilla Foundation, the non-profit behind Firefox, haven't forgotten. While everyone else is chasing what comes next, they're still fighting for what was promised at the start. Their editorial platform, Nothing Personal, is one of the few places still asking the right questions about surveillance, algorithmic manipulation, and what the internet is actually doing to the people who use it.
And here's the thing that should really worry you. It isn't just humans drowning in this. Every AI model being trained today is learning from the same polluted pool. The clickbait, the misinformation, the sponsored content pretending to be journalism, the SEO-stuffed garbage written by machines for machines. AI isn't learning from the best of humanity. It's learning from the loudest.
The Digital Complication is a working model of this problem. Every widget on the page is real. The progress bars progress. The knobs turn. The meters measure. The waveforms wave. It all works beautifully. And none of it means anything.
That's the point.
Below the functional, impressive-looking dashboard sits the chumbox. The "Recommended Content" section. The bit that every website has, tucked underneath the article you actually came to read. "You Won't Believe What This Loading Bar Did Next". It's absurd when you see it in isolation. But this is what the bottom of every news site, every blog, every article on the internet looks like. We've just stopped noticing.
I built this as a reminder. For myself as much as anyone else. The next time you see a dashboard full of impressive-looking numbers, ask what they actually mean. The next time a headline tries to make you angry or curious or afraid, ask who benefits from that emotion. The next time something looks complicated, ask whether the complexity serves you or distracts you.
In watchmaking, a "complication" is any function beyond telling the time. A date display, a moon phase, a chronograph. They're called complications because they add mechanical complexity. But with a good watchmaker, that complexity has purpose. Every gear exists for a reason.
The digital world has complications too. But nobody is checking whether the gears connect to anything.
Making the invisible visible. That's the point of all this. The main piece does it with meaningless dashboards and the metrics that control the news and media. The additional generators (Inked Lin, Consent Theatre, The Ghost Job Machine, The Guru Machine, The Screening Machine, The Addiction Factory, The Silencing, and The Targeting Experiment) do it with the language and patterns people encounter every day but have stopped noticing. The tragedy is that most people fall for it because they depend on it being true. They need the job to be real. They need the course to work. They need the cookie banner to mean what it says. They need the hiring process to be fair. And the people who build these things know that.
This is not a technology problem. It is a human problem.
Every system on this site was built by a person who made a choice. The algorithm that filters your CV didn't decide to penalise your age. A human trained it on biased data and chose not to check the output. The cookie banner didn't design itself to hide the reject button. A product team made that decision in a meeting. The guru sales page didn't write its own countdown timer. Someone sat down and calculated exactly how much pressure a desperate person can take before they hand over their savings.
We blame the algorithm. We blame the AI. We blame the platform. But algorithms don't make choices. People do. The technology is the delivery mechanism. The cruelty is the product. And when the system harms people at scale, when it denies claims, filters out candidates, manipulates the vulnerable, and harvests data, someone somewhere decided that was an acceptable outcome. They just made sure their name wasn't on it.
That's the real complication. Not the technology. The humans who hide behind it.
Every time something terrible happens, the response is the same: restrict, remove, control. Not the person. The information. A teenager radicalised on forums uses an AI chatbot to ask a question he could have found in any library, and the answer is to make the library smaller. The books are not burned anymore. They are quietly removed from the index and buried under noise. The result is the same. Less knowledge for everyone. More control for the few. And the person who was going to do the terrible thing does it anyway.
In 1933, Nazi Germany burned 25,000 books in a single night. During the Chinese Cultural Revolution, entire libraries were destroyed and intellectuals were sent to labour camps for owning the wrong texts. These are not ancient history. They are within living memory.
Fahrenheit 451. 1984. The Memory Police. Bradbury, Orwell, and Ogawa each imagined a world where knowledge was controlled, restricted, or erased entirely. We read them as warnings. We teach them in schools. Then we watch the same patterns play out in real time and call it safety.
The method has evolved. Nobody needs to burn a book when they can bury it in an algorithm. Nobody needs to ban a library when they can put it behind age verification, identity checks, and terms of service that nobody reads. Nobody needs to silence a voice when they can drown it in noise. The fire looks different now. But the shelves are still getting emptier. And every restriction sold as protection is also a transfer of power. Every piece of knowledge removed from public access is a piece of knowledge that only the privileged can still reach.
The websites, applications, tools, and chatbots I build at laurencelai.co.uk are not this. Never will be.
Everything you need. Nothing you don't.
A note on intent
The Digital Complication is a work of art and social commentary. The words, phrases, and patterns featured across this site are presented as examples of language commonly used in media, politics, advertising, and online culture. Their inclusion is not an endorsement, an accusation, or a political statement. It is a mirror.
Where specific events, court verdicts, or corporate actions are referenced, they are sourced from published journalism and public record. This is not speculation. It is observation.
The satirical tools on this site parody systemic patterns, not specific individuals or organisations. No claims of fact are made about any named person. Public figures appear only as examples of how names become noise in the modern media cycle.
If any of it made you uncomfortable, that was the point. Not to offend, but to make the invisible visible.
Ramsbottom, 2026
Now see how it works.
Begin with Inked Lin →