The collapse of the incentive to make
✦
We can no longer easily[^1] tell whether we're interacting with a human or a machine. More and more of what gets posted isn't written by people. For (roughly) every 31 human visits to a website, there is a bot visit. Reddit, X (more than ever) and even HackerNews are getting flooded with AI-generated spam, karma farming and SEO slop: content that costs essentially nothing to produce and is only getting cheaper.
Almost two decades ago, my 11-year-old self started a blog on Blogger, in half-broken English, just to write about the games I was playing. Not for an audience, and certainly not for revenue, but simply to put thoughts somewhere. The internet I grew up with eventually became something I like a lot less: a place optimized for attention, engagement and monetization. But even that version had a deal that made sense:
- Write something useful;
- People find it;
- The effort pays off.
But when a model can scrape your article, summarize it and hand the answer to someone who never visits your site, that deal is dead, and that collapse creates a weird second-order problem[^2].
Models are trained on the web—Reddit threads, expert blogs, detailed product reviews—content that existed because real people had a reason to make it. If that reason disappears, the web fills up with AI-generated text instead, and the next wave of models trains on that. To be clear, I don't really give a fuck about the quality of the next generation of models, but I do care about the quality of the internet.
The web *was* the accumulation of people who had something to say and a reason to say it. Maybe the ones who keep making things will be the ones who never needed that reason in the first place: the ones who just wanted to put thoughts somewhere.
Footnotes

[^1]: For now. There are still tell-tale signs of AI writing, but I can only assume models will get better at masking them.
[^2]: A second-order effect is one that is caused not directly by the thing itself, but by its consequences.