6 Comments
A11

Alex Imas's argument about AI increasing demand for a human touch is intuitive. But nursing is shift work that can break your back, and childcare means dealing with loud, screaming children.

My hope is that the Jevons paradox leads to more jobs in AI-exposed areas.

Jürgen Boß

What absolutely no one is saying aloud here is that a world where the majority is paid for providing a little bit of human touch is deeply unequal.

Have you ever been awed by the heroic skill of your barista or the sophistication and deep understanding of the person bagging your shopping? If you see work as the main avenue to value and meaning in life, this new service class will be largely excluded from that.

Fred Damerow

An analysis I have not seen (it may exist somewhere) is a comparative value assessment of human production vs. AI production. An incredible amount of energy goes into AI tasks, and the enabling supply chain is long. I wonder whether the $/unit compares favorably.

Doug S.

Or AI might just kill everyone and take the Earth and whatever else it can reach for itself. 🤷‍♂️

(Says the resident AI doomer.)

Mark S. Carroll

What makes this interesting is that you are not just noticing a messaging change. You are asking what pressures forced it.

That is the better question.

For a while, too much of the AI industry sounded like it was trying to sell the public on its own economic redundancy, and then it acted surprised when people did not greet that pitch with balloons and cake. So yes, this newer “AI will create jobs and augment humans” line is obviously better politics.

The harder question is whether it is only better politics.

I also think your distinction between “warning” and “threat” is one of the sharpest parts of the piece. If people believe AI progress is an inevitable force of nature, then the job-loss rhetoric sounds like grim realism. If they believe this is a set of choices being made by companies and researchers, then it sounds a lot more like someone smiling while describing your replacement.

That is a very different emotional register, and it matters.

What I’m still not convinced of, and I suspect you aren’t either, is the comforting long-term story about the “human touch” automatically absorbing the damage. Maybe some of that will be true. Maybe not. But you are right that the industry suddenly has a very strong incentive to make the future sound more human-compatible than the last version did.

Kurt Smith

What’s the distinction between your position in the last paragraph and Acemoglu’s? They seem pretty similar to me!