24 Comments
jeff

I swear there's not an econ/tech writer on the planet who has any idea what engineers actually do.

A software engineer is most importantly a *type of engineer*. And by and large engineers are people who define and solve problems. "Coding" is incidental to the job. It's like thinking mechanical engineers were all going to lose their jobs because CAD came along; after all, didn't they spend their days at a drafting table?

You keep mixing up the technological toolset of the job with, like, the actual substance of the job in this frustratingly reductive way.

I'm sympathetic, a bit, because there was this period of time where software engineers got a little high on their own supply and convinced themselves and everyone else that "coding" was this God-like skill set that was just *different* than everything else, and so I don't mind seeing them taken down a peg. But fundamentally coding was never engineering.

Sense

To say nothing of the fact that in most organizations most engineers stop actually using these tools (drafting in CAD, coding etc.) once they become senior enough. Generally at that point the job has less to do with production and more to do with design, coordination, tradeoffs etc.

Not that LLMs aren't getting better and better at that too, but it will likely take a bit longer and always be a bit messier.

edit: On the "learn to code" front. One thing that became clear to me as a hiring manager in the industry was that there was an enormous difference between people who "learned to code" and engineers. One does not make the other.

Has nothing to do with their academics either. In my experience there were plenty of people with non-STEM degrees who "learned to code" in a bootcamp or via YouTube or something and ended up brilliant engineers because they had, at their core, an engineering mind. Likewise, plenty of folks with engineering degrees who at their core do not.

PaulD

As an engineer, I agree with you somewhat, although I think Noah generally has a good handle on things. Until I actually started working on a Masters in Computer Science (which I did not finish), I had real issues with the term "software engineer".

Pittsburgh Mike

I think this is mostly right. In an AI-heavy environment, programmers may spend more of their time doing planning and architectural design than coding.

But I have to say that I'm more skeptical of the overall benefit of AI coding tools. I've only tried Gemini, and only for toy applications, and it didn't do very well writing code from scratch. It could toss out some excerpts that covered some of the functionality, but FWIW, the architecture for the application was all wrong.

What it could do was relatively simple debugging. When I was playing around with an Internet Radio player, I could upload code, give a loose description of what I saw ("I see a crash at line XXX of this file when I press the '+20' button" or "I hear noise every few seconds when I'm playing a station"), and it found the problems near instantly.

https://siddhantkhare.com/writing/ai-fatigue-is-real describes at a level of detail I hadn't seen before what it's like to program by prompts. It sounds horrible, frankly: AI generating different approaches when the same prompt is used at a different time; having to check the generated code after each iteration; keeping up with tools that change from week to week; having to puzzle out how the generated code works, since if something goes wrong you, not the AI, are going to get that 3 AM phone call.

jeff

From my experience, it will quickly reproduce common apps that have been made a million times. It will do fun little projects.

But if you're (1) creating a *software product* that will need to be extended and maintained or (2) doing anything fairly bespoke, you can't have it produce a bunch of black-box code and hope for the best. You need to keep your hands on the architectural reins, laying out exactly what you want and letting it save you some time by filling in snippets. It's that process of definition that's connected to the entire motivation for the project in the first place. To have AI do it, you still need to explain it to AI; it's circular.

This is an incredibly useful tool. It's not a magic genie that ends engineering.

Erik Mattheis

“Software engineers, for whom ‘writing code’ was a big part of the job description just a few months ago, are now mainly checkers and maintainers of code written by AIs.”

This gets a lot of press and as a software engineer it drives me a little crazy every time I see it. Even if we accept that the coding part has been or will be handled exclusively by AI, the engineer’s actual role hasn’t changed. The ability to competently produce working code is still in the job description and I would argue that it was never the most important part in the first place. Using AI doesn’t turn a software engineer into a “checker and maintainer of code written by AI” any more than pneumatic nail guns turned carpenters into checkers and maintainers of nailed together pieces of wood.

jeff

Many people are saying!

Peter Defeel

This does sound horrible. It's also unlikely. More likely, the code reviewer or maintainer (to stick to coding) is a specialist who is needed when the AI messes up. I'm fairly dubious that coding has really been fully replaced; only the companies selling the AI are saying that, especially given the results we've seen from Claude recently.

Programmers aren't, except for trivial apps, writing prompts like "build me an app to show a dog barking", but are instead getting AI to write what they could write themselves, with strong architectural input, frameworks, and system knowledge.

More like pilots and autopilots.

Bryan Alexander

For education, this suggests a stronger emphasis on interdisciplinary study. Liberal education.

Kathleen Weber

I think that basic level REAL courses in the sciences are part of a liberal education. The scientists of the 17th and 18th centuries certainly thought so. Even though I got my PhD in history, I would feel naked without my basic understanding of biology, chemistry, and physics.

Kenny Easwaran

The humanities and sciences are the same here, both focusing on general knowledge and skills, rather than more career-oriented majors that try to teach you skills for a specific job.

Kathleen Weber

Where do you live?

Bryan Alexander

That's the sense of liberal education I have in mind: across the disciplines, and not just the humanities.

Kevin Z

I'm going to help all the "but now you don't need to learn to code" people (who probably studied liberal arts). In the age of AI we will still need people who "know how to code", or stated more precisely: who study computer science. I think we'll need more of them.

What non-tech people don't seem to get is that Computer Science is, at its core, the study of how problems are solved using computers, and of problem solving in general. Modern coding is itself done on layers of abstraction. Java programmers write code that is compiled to bytecode, which is then translated again into machine instructions. AI is just another layer of abstraction. Computer science education will of course evolve, but it is far from redundant.
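The same source-to-bytecode layering that the comment describes for Java can be seen in any bytecode language. A minimal sketch in Python (the `add` function is a made-up example), using the standard-library `dis` module to show the layer one step below the source:

```python
import dis

# A trivial function standing in for "the code the programmer writes".
def add(a, b):
    return a + b

# The interpreter has already compiled this source to bytecode; dis
# prints the instructions the virtual machine actually executes.
# AI-generated code is, in the same sense, one more layer above source.
dis.dis(add)
print(add(2, 3))
```

Nothing about the function's meaning lives in any one layer alone, which is the point about abstraction: each layer is a tool for expressing the one above it.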

Want to be useful in the age of AI? Study Computer Science.

Conor

I don't see any mention of the service industry here. What do you expect to happen to stockers and fast food workers? Or the trades? Is half the American economy just not worth a mention?

Peter Defeel

When people are talking about job replacement with regard to the present AI systems, it's office work. If you are using your legs and body to work, you are safe, at least from the first-order effects.

Snailprincess

I'm a software engineer. I can only speak for myself and what I see in my own company and team. But at least where I am, we're a fair bit off from software engineers no longer writing code themselves. Code-gen tools are definitely impressive, but they're still not super great at making large changes to established code bases themselves. At least not without extensive prompting that's as time-consuming as doing it yourself. They may very well get there (and sooner than some think), but the transition hasn't happened yet. They're better at standalone prototypes and for certain boilerplate implementations (tests, etc.).

But beyond that, the actual writing of code (as in the actual typing of lines of code into an editor) has never really been the lion's share of the work of a software engineer. A lot more of my time has always been spent determining what needs to be built, how it will interact with things other people are building, and how to build it to be debuggable and editable by others in the future. Much of the rest is spent evaluating how things are performing, debugging, etc. AI is making inroads into SOME of those spaces, but I think it's quite a bit slower there.

I suspect some massive changes are on the horizon, though it's difficult to predict the scale and the timeline (6 months? 2 years? Longer?), but at least where I'm sitting the change isn't dramatic YET.

NoahpinioNOT

The salaryman analogy is interesting but the institutional gap is the elephant in the room. Japan's system rested on lifetime employment guarantees (30-35% of workforce), seniority wages, enterprise unions, and government retention subsidies. American median tenure is 3.9 years vs Japan's 12.5. The "no-hire no-fire" economy isn't salaryman culture taking root; 76% of employers cite difficulty finding replacements as the reason they're hoarding (Indeed, 2024). That's pandemic trauma, not a new labor model.

Also, the Humlum and Vestergaard study is excellent but Denmark spends 20x more per capita on active labor market policies than the US. Citing Danish outcomes as a forecast for American workers is a stretch.

Wrote a longer response with FRED charts here: https://noahpinionot.substack.com/p/salarymen-without-the-safety-net

Scott Williams

There are also those of us who will protect our jobs because we can. Bars, in particular, will preserve jobs in fields like Law and Medicine.

M. E. Rothwell

I would have thought bars would have preferred to preserve jobs in fields like bar tending

Mary

I feel like we should probably actively try to prevent this terrible future from taking place!

Greg Steiner

I managed projects for years and you always had budgets (people) and timelines as nearly fixed constraints. Scope was typically your variable. If AI can allow us to deliver more in the same time and same budget, then it is a win for everyone.

Also, hard working smart people who show up on time and aren’t distractions to others will always be able to find work. They always have. Maybe we encourage people to develop those traits. Seems like we encourage the opposite.

Joe

Feels like a lot of whistling past the graveyard. For one thing, the assumption that "AI" will not improve faster at the skills required of "generalists" than the labor market will respond with a new category of salarymen seems highly suspect. The problems of managing around "jagged" AI skills seem eminently addressable with additional AI, specifically trained to interrogate, challenge and correct first pass or "confidently erroneous" answers. Humans provide error correction in organizations as a function of hierarchical review and revision. It's not at all clear why "AI" could not do the same.

PaulD

I am still getting lots of inquiries from headhunters for controls engineers. There has even been a bit of a flurry in the last couple of days.