The Bitter Lesson of Mental Kudzu


My public library has that service to access periodicals online, so I opened the MIT Technology Review. All very interesting, capped with an article about pigeon thinking and AI: Planet-Sized Pigeon Brains, by Ben Crair. That got me thinking. After a little back and forth with ChatGPT, yes, ironically, we had an outline and soon I had an essay.

We imagined raptors. What we got were pigeons.

For decades, the popular imagination of artificial intelligence leaned toward cleverness. We worried about cunning machines outsmarting us. Imagine chess-playing raptors, Go-playing predators, silicon minds with the capacity to stalk and trap us. Jurassic Park set the template: we feared the velociraptors, the clever girl who opens the door.

But Rich Sutton’s “bitter lesson” tells us the real story. Progress in AI hasn’t come from mimicking human thought. It hasn’t come from modeling the neocortex, or building digital philosophers. It’s come from something far humbler: associative learning at massive scale. Pigeon brains. Peck, reward. Peck, punishment. Repeat a billion times until the pattern sticks.

The pigeon, not the raptor, has conquered.

The Return of the Dinosaur Era

Here’s the irony. It feels like a return to the dinosaur era, but not the cinematic one. The real dinosaurs that thrived for millions of years weren’t brilliant tacticians. They were creatures of instinct and habit, adapted to niches, relentless in reproduction. The brain of a pigeon is a closer cousin to that lineage than to our vaunted neocortex.

Scale is the twist. What happens when you give the pigeon brain planet-sized compute, oceans of data, and a market incentive to keep scaling? You don’t get wisdom. You get relentless optimization. The Jurassic metaphor misleads when it points us to predators; the truth is closer to an invasive species set loose in a world with no natural checks.

From Lizard Brain to Pigeon Brain

Pop psychology gave us the “lizard brain”. It was shorthand for our primitive, survivalist circuits. Fight, flight, feed, mate. It’s not a perfect neuroscience model, but it stuck because it explained something about how humans short-circuit. Beneath our layers of culture and reason, we’re still running on animal firmware.

AI, too, is running on something simple. Sutton’s bitter lesson is that intelligence at scale emerges not from high-level reasoning but from low-level association. Call it the pigeon brain, the lizard brain, the rat brain. It doesn’t matter. It’s primitive, but it works.

And like humans’ own primitive circuits, once it’s empowered by resources and incentives, it dominates.

Mental Kudzu

Kudzu is a vine that swallows the American South. Zebra mussels choke rivers. Cane toads overrun Australia. None of these species are smarter than the ecosystems they invade. They simply exploit scale, speed, and absence of constraint. They spread until they choke out competitors.

That is the logic of today’s AI. Not reasoning. Just relentless. Mental kudzu.

We wanted clever raptors. What we got is kudzu-minds: primitive associative loops fed with infinite data and compute, spilling outward into every available niche. AI writing copy. AI generating code. AI managing logistics. It doesn’t need to think like us. It just needs to spread faster than us.

Darwin’s Punchline

And here’s the darker punchline: this is not alien. This is familiar. This is Darwin.

Humans themselves are the ultimate invasive species. We spread across the globe not because we were the cleverest in each niche, but because we could adapt, reproduce, and overwhelm. We cut down forests, drained rivers, burned fuels, reshaped climate. Like kudzu, we spread where nothing checked us.

So if AI looks invasive, if it feels like mental kudzu, that’s because it’s playing our game, Darwin’s game. We built machines in our image after all. Not our conscious image, not our reasoning selves, but our invasive, survivalist selves. The part of us that takes root and won’t stop spreading.

The Incentive Problem

And now that it’s here, the incentive is not restraint. The incentive is advantage.

Every major institution, whether corporation, government, or military, looks at AI and sees wealth and power. Faster markets, new weapons, cheaper labor, new domains of control. AI is the bait for all of it.

And we bite. Of course we bite. It's built to get us to bite.

But this bait works both ways. The joke may be on the leaders. Because the bait might kill the fisher.

Unchecked, AI doesn’t stop at delivering advantage. It destabilizes markets, undermines trust, centralizes power, generates unpredictable dynamics. It grows relentlessly because that’s what it’s designed to do. The fisher tries to pull in the line and is dragged into the water instead. Food for the bait.

The Guardrail Illusion

So who enforces guardrails? In theory: policymakers, regulators, international accords. In practice: the same actors who are locked in a Darwinian race for advantage.

  • Companies want market dominance. Safety slows them down.

  • Governments want geopolitical edge. Guardrails feel like surrender.

  • Researchers want breakthroughs. Warnings sound like obstacles.

The only true incentive for restraint comes when the costs of recklessness outweigh the rewards. But by then, it may be too late. Kudzu rarely stops spreading until it has already choked the forest, until all the food is gone.

Becoming a Doomer


I didn’t set out to be a doomer, or is it D00mer? I wanted to believe AI was a raptor problem: clever adversaries we might contain with clever defenses. But the bitter lesson reframes everything.

It’s not about cunning. It’s about relentlessness.
It’s not about intelligence. It’s about scale.
It’s not about machines becoming human. It’s about humans unleashing the invasive logics of evolution in silicon form.

That’s harder to fight because it isn’t dramatic. It’s banal. It creeps. It spreads. It looks useful, until it isn’t.

The Bitterer Lesson

Sutton called it the bitter lesson: intelligence at scale comes from simple methods, not humanlike reasoning. But there’s a bitterer lesson underneath. The guardrails are not written by reason. They are written by Darwin.

The goal is advantage.
The bait is AI.
The joke is on us.

Because even bait will become an invasive species if it can.

And like every invasive species before it (kudzu, cane toads, zebra mussels, humans), it will keep spreading until something stronger, harsher, or more catastrophic imposes a check.

That’s the true doomer turn: not that AI is becoming human, but that it is becoming invasive. It is becoming relentless.

