So... what happens next?

Don't ask the AI boosters. Or the doomers. Can someone else help me understand?

I use AI every day. I'm genuinely excited about it. I'm also deeply concerned about what happens next - and I don't think enough people are talking about it honestly.

The tech optimists just say "abundance." As if wealth automatically distributes itself. As if ownership structures just... change. As if governments act preemptively. (They don't. They never have.)

The doomers say "extinction risk." Which may be real, but it's also conveniently abstract. Hard to organise around. Easy to dismiss. And it puts you into fear paralysis.

I'm sick of the boosters. I'm sick of the doomers, too.

Far fewer people are talking about the boring, brutal middle.

What actually happens if (or when) 30% of jobs get automated? Not replaced with new ones, just... gone.

As for the 'learn a trade' pitch: we don't need a million more plumbers if the work shrinks by a third. So what do 300,000 plumbers do? Not 'learn to code'.

Previous transitions took 50-100 years. Long enough for a generation to retire out. Long enough for kids to grow up in a different economy.

This one might take 15. Maybe 5.

The obvious answer is UBI. Until you do the maths.
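Here's a rough sketch of that maths, using illustrative US figures (roughly 258 million adults, a modest $1,000/month, and annual federal spending of around $6 trillion) — these are back-of-envelope assumptions, not a forecast:

```python
# Back-of-envelope UBI maths. All figures are rough, illustrative
# assumptions: ~258M US adults, $1,000/month per person, and total
# federal spending of roughly $6 trillion per year.

ADULTS = 258_000_000           # approximate US adult population
UBI_PER_MONTH = 1_000          # a deliberately modest monthly payment
FEDERAL_BUDGET = 6.0e12        # approximate annual federal spending, USD

annual_cost = ADULTS * UBI_PER_MONTH * 12
share_of_budget = annual_cost / FEDERAL_BUDGET

print(f"Annual UBI cost: ${annual_cost / 1e12:.1f} trillion")
print(f"Share of current federal spending: {share_of_budget:.0%}")
```

Even a modest payment works out to about $3 trillion a year — on the order of half of everything the federal government currently spends, before you've funded anything else.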

And that's before landlords raise rent to capture it :)

Maybe AI just becomes a nice productivity gain, a nice set of tools. But maybe it doesn't.

I run a company that helps people develop skills. I'm supposed to have answers here. I do have some. But I'm not wholly convinced about the ones I've got.

I'd love to see more people - especially the ones building this stuff - actually grapple with what happens next, instead of hand-waving toward "abundance" like it's a weather forecast.

Because I'd genuinely like to know.

Can someone please explain this to me?
