Jan 15, 2023 · Liked by Noah Smith

This previous post by Noah is one I keep thinking about more than most... especially the history of electricity adoption, which I wasn't familiar with.

In my job, we deal with construction plans, and it took 25+ years after it was possible for the whole industry to go completely electronic. Engineers were still FedExing paper plans 10 years ago.

I think the medical/health care industry is another example of taking forever to get digitally efficient, due to privacy concerns, older staff resisting change, the costs of changing systems, etc.

Even when the tech already exists, the lag before it takes over can be so long in some sectors.


Agree about the technology, healthcare, and the roadblocks that are consistent and definitely cause a lag. And many medical scholars have learned and achieved so much that they are used to things being done at their direction and personal preference. One area I'd love to be part of building, though, is the use of VR for nursing and medical students: VR scenarios for practicing and learning to diagnose medical conditions for physicians, or for training nurses to identify the best nursing practices to use, would be an amazing addition to any medical college program. Or would have been when I went through the nursing program at Kent State. I am thankful for the internet and all the glorious technology that's evolving with it. But I agree, it sure takes a while for it to catch up everywhere.


One problem with medical diagnosis is that doctors often pick up a lot about a patient when they're in the same room, because they can change their point of view easily and unobtrusively. It isn't always the primary element that reveals what is going on. It's the overall 3D gestalt, especially for watching movement, breathing, attention and so on.


Great point, Kaleberg, I agree. There are a lot of factors that are difficult to code into 1s and 0s in a software program: the things our other senses feel or pick up on, and that come into play, but aren't easy to reduce to black and white. Agree, definitely 4-D.


I still occasionally get drawings from one old gentleman at a subcontractor who clearly hand-draws the plans before they're scanned in.

On a related note, I see BIM being used on a few more projects, but not often on smaller ones.


It takes quite a few years before enough parts of the puzzle are in place to be the catalyst for real change. We have a couple of activities that are only now becoming digital all the way from initial data collection to end use. Forty years ago these activities were done by transcribing handwritten notes with a typewriter onto paper. Twenty to thirty years ago that changed to transcribing those handwritten notes by typing them into something like Excel. However, the data was still not in a true database that could be accessed by other programs; that took yet more transcription.

We are finally getting to the point where we can start to collect that data digitally in the field on a small, robust, programmable tablet. That data can then be uploaded to the cloud, either over a cellular network or at night over a Wi-Fi internet connection. The data is now in a database, immediately available to multiple programs and users. This is just starting to happen (one of the people below mentioned BIM, which is part of the ecosystem where this is happening), but I expect we will finally see a substantial reduction in the man-hours required for these tasks in the coming decade. For it to happen, you needed the Internet, user-programmable data collection devices small and robust enough to be usable in the field, and end-user software to tie it all together.

People like me find shows like NCIS and Criminal Minds hysterically funny, where in five minutes they whip through all of these various data sources and put things together on some big screen. Maybe the NSA has those capabilities, but usually you spend hours and days just trying to get two database architectures to line up properly and to figure out where all the data quality errors and omissions are, unless it is a common application somebody has already solved.
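To make that concrete, here is a minimal sketch of the kind of schema alignment and data-quality triage that eats those hours. The file names, column names, and unit conversion are invented for illustration; nothing here comes from a real system.

```python
import pandas as pd

# Hypothetical exports from two systems describing the same assets,
# disagreeing on column names, units, and identifier formatting.
site_a = pd.read_csv("system_a_export.csv")   # columns: AssetID, InstallDate, LengthFt
site_b = pd.read_csv("system_b_export.csv")   # columns: asset_no, installed_on, length_m

# Step 1: map one schema onto the other by hand.
site_b = site_b.rename(columns={"asset_no": "AssetID",
                                "installed_on": "InstallDate",
                                "length_m": "LengthFt"})
site_b["LengthFt"] = site_b["LengthFt"] * 3.28084   # metres -> feet

# Step 2: normalize identifiers and dates before joining.
for df in (site_a, site_b):
    df["AssetID"] = df["AssetID"].astype(str).str.strip().str.upper()
    df["InstallDate"] = pd.to_datetime(df["InstallDate"], errors="coerce")

# Step 3: surface the omissions and conflicts.
merged = site_a.merge(site_b, on="AssetID", how="outer",
                      suffixes=("_a", "_b"), indicator=True)
print(merged["_merge"].value_counts())            # records missing from either side
print(merged.loc[merged["InstallDate_a"] != merged["InstallDate_b"]])  # rows where the two systems disagree
```

Every one of those mapping decisions is manual, which is why it takes days rather than the five minutes on TV.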

Many of these potential applications are small compared to the number of Facebook users, computer game players, etc., so there needs to be a conscious decision to invest resources in something that may never have more than a million total users worldwide (and that may be generous). One benefit of the recent tech-sector crash is that many of the programmers behind games and social media apps (as an example) will now be looking for new jobs, so the companies looking to develop these specialty uses should find staff more readily available. They probably won't offer pool tables, free cafeterias, and dog-walking, though.


Of course. IBM already had a computer (and the software) that could out-diagnose any doctor before the year 2000. It is obvious that they could have mass-produced them and reduced costs by about 80% (at least). Then there is all the administrative stuff you mentioned, which they drag their feet on digitizing and speeding up with AI in order to continue "Health" Insurance Corporations' "justification" for their 20%-plus ridiculous "administration" costs.

The "Medical" Insurance Complex just does not want to accept the fact that their medical "services" are massively overpriced. If you haven't read this book, perhaps you should:

Culture of Death: The Age of Do Harm Medicine, by Wesley J. Smith (2016).


A really interesting and novel take. It feels intuitively right, but I'm gonna play devil's advocate anyway.

WFH: my feeling is that this doesn't necessarily increase productivity so much as increase the variability in it. I've been WFH (software dev) for nearly a decade now, and they've been some of my most productive years, as acknowledged even by my management. So I was naturally pretty positive towards it and felt strongly that it has the great benefits described in the article.

COVID has changed my views on this quite heavily. I keep meeting people who "work" from home but will happily admit they hardly do any work and are scamming their own employers. They'll work a few hours a day and spend the rest of the day watching Netflix, sleeping, etc. Others work two jobs. Others set up and run their own company on their employer's time. The frequency with which I encounter this behavior without even looking for it, and the total lack of shame that goes with admitting to it, make me think there are good reasons why executives try to force everyone back to the office. For those of us lucky enough to enjoy our work, we'd be doing it anyway, so WFH feels like a pure productivity bonus. But for the clear majority, work is just work; it's not enjoyable, and if not closely supervised they'll slack off.

All this makes me think that WFH will lead to a noticeable drop in productivity, and the execs are probably right to try and fight it. Yes, people who are naturally productive and happy at work may get an hour or two more per day (best case; lots of people don't have commutes that long). But you'd need a whole team of them to offset the lost productivity of just one person who phones it in and then spends most of the day checked out on the sofa.

The other thing on my mind is wondering how this productivity is really calculated, and how robust that calculation is. WFH fraud happens because companies struggle to understand the output and productivity potential of their employees. How can economists measure this so accurately when institutions seem to find it so hard? Noah mentions surveys asking people how much they work, but there's also the output side to consider. We're sort of taking it as read that the productivity problem is with how people use new tech, rather than something wrong with the measurement. I mean, I for sure feel like I'm more productive because of computers - my job wouldn't exist at all without them! That's like an infinite productivity boost, right? Well, they can't actually sum it that way, so how does it count?
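For reference, the standard measure here is labor productivity: real output divided by hours worked, so measured productivity growth is roughly output growth minus hours growth. This is the textbook definition, not the methodology of any particular survey mentioned above.

```latex
\text{labor productivity } P = \frac{Y}{H}
\quad\Rightarrow\quad
\frac{\Delta P}{P} \approx \frac{\Delta Y}{Y} - \frac{\Delta H}{H}
```

where Y is real output (e.g. real GDP or value added) and H is total hours worked. The "my job wouldn't exist without computers" effect shows up inside Y, not as an infinite ratio.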


Wow, this is mind-boggling. I think this highlights how important it is to increase the skill level of managers in this new WFH age. This has often been a problem in tech, because great technical people are often horrible managers but get put in those positions via promotion. Without mindless staff meetings and micromanagement by walking around, managers will have to build actual relationships with their staff and find ways to motivate them and monitor their progress. If the managers are watching Netflix or working their own side gigs, they won't take the time to do this. Poorly managed companies, and there are a lot of them, will struggle and want to bring people back into the office.


That's the sort of Goldilocks ideal approach to management, yeah.

I've been on both sides of this. I've managed small teams and bigger teams-of-teams, and I've been an IC. Even with really cool and interesting work, most workers just want to turn up, get their salary, and go home. Nothing wrong with that. The ones who are intrinsically motivated are the minority, and that motivation can easily be lost if a decision doesn't go their way.

And that's in software, where there are huge possibilities and intellectually stimulating work. People routinely do it for fun. There was a story recently about an accountant in Canada who was defrauding her employer. They caught her using monitoring software, she confessed, they fired her, and then she sued them for wrongful dismissal! Luckily the judge tossed her case and made her pay legal fees. It's probably really hard to create intrinsic motivation for accountants. Nobody does accountancy as a hobby, right? I have no idea how you motivate them and monitor their progress when WFH except by using that sort of surveillance software.

In an (open plan) office, it's just harder to slack off. If you're watching Netflix, lots of people can see your screen; it's gonna get noticed fast. If you're at home, nobody can see. Getting managers to fire low performers is a nightmare even in times of crisis, and if the company is doing well you have no chance. So central monitoring and control of WFH is likely to become the norm, and then employees will complain they're monitored more heavily at home than they ever were at the office. All the arguments about this consume huge amounts of time and energy, which must also be a drag on the productivity stats.


There are people who have an intrinsic love for accountancy. My wife is one of them. She had a career in forensic accounting, and took a pure joy in finding where the fraud was hidden in some company's books. There are more accountants like that than you imagine.


I can believe that for forensic accountancy, absolutely. I think that's pretty rare compared to normal accountancy though?


Forensic is a subspecialty, for sure. But numbers-loving obsessives are found in regular accounting too. Through my wife - thus adjacent to the profession - I've met them. People who love to see things balance; who love a beautiful presentation of the results. Of course I have no statistics on the distribution of such people throughout the profession.


Maybe part of the societal reorganization will be safety nets (medicare for all, UBI, etc) that make it less necessary to have a job, and thus easier to fire someone who doesn't really want to be there in the first place.


Those safety nets already exist. Yes, governments make it hard to fire people, at least in Europe, but the problem is more often on the management side. The average person would rather have their fingernails pulled out than fire someone on their own initiative just for not working hard enough. Even at big US tech firms that pay very well, getting rid of low performers has to be wrapped in huge amounts of process and second chances, and it has to be triggered semi-automatically (by perf reviews), because otherwise managers just won't do it.

That's why execs hate WFH. They know that if they plop an employee in an office where that person's working state is easily visible, they'll get at least *some* hours of productivity out of them. Maybe it won't be 100%, but some work will get done, and they won't have to watch too carefully. If someone watches Netflix all day, the manager has seen it with their own eyes and can say "you were slacking." If they can't see someone directly, then they suddenly have to fire someone purely on the basis of insufficient output, and that's really hard and subjective. Especially if it's a woman or a Black person or both, you're gonna immediately get sued for racism or wrongful dismissal or whatever, and if the cause was just generic lack of output your position is much weaker than if you have direct evidence of cheating. The Canadian firm got sued for wrongful dismissal even though the employee got caught red-handed and even admitted to it on video!


I think WFH during COVID has revealed how little a handle managers have on what their company and employees are supposed to be doing. If someone is making shoes, you can just count the shoes and check the quality. If someone is writing software, no one has a clue. Is the coding coming slowly because the problem is very hard or because someone is slacking off? How can you tell?


What is your source for your claim that a 'clear majority'... will 'slack off'? Is this purely anecdotal, or are you referencing studies? My anecdotal experience, across numerous industries, is the exact opposite.


It’s entirely anecdotal. Never trust anyone, online or off, who tells you that other people have opened up to them about being lazy, unproductive or scamming the system. That’s not likely behaviour.


I rely on facts and data, not anecdotal experience. There are numerous reports like this one: https://www.techrepublic.com/article/deloitte-report-finds-employees-more-productive-during-pandemic/


I didn’t mean to say you were being anecdotal but that the guy you were replying to was.


Thank you, agreed.


There are no facts in that article; it's just a pile of anecdotes and hot takes - quite possibly a paid advert for Deloitte, judging by the disclaimer at the top. It even starts by saying that before COVID they didn't know how to measure productivity, so, by implication, they can't really compare it.


Oh to have lived in such a blessed world. I really gotta laugh at the "academic study or it doesn't happen" brigade.

There is very little social stigma involved in getting the upper hand over an employer, especially if the employer is big and rich. Unions are an institutionalization of that very social belief. There is doubly little stigma if the employee doesn't feel their work is actually appreciated or important. Pretend it's not happening if you like, but I've had two people admit this to me in the past two weeks alone, and again, I'm not asking them about this directly! It happened to come up in conversations about work. Yes, you would hope people would feel a bit of shame about it, but no, they really don't.

Jan 15, 2023 · Liked by Noah Smith

"Economist Paul Krugman wrote that by 2005, it would become clear that the Internet's effect on the economy is no greater than the fax machine's."

See https://www.snopes.com/fact-check/paul-krugman-internets-effect-economy/

author

hehe


This is Krugman through and through. Why is this surprising? Lol.


Because he's right more often than wrong. But in that piece, there clearly wasn't as much thought put into it.

I mean, a prediction in an article about how economists' predictions can often end up wrong? Versus supporting capital controls in the Asian crisis, a measure lambasted by the IMF and most conventional economists? Or predicting that fiscal stimulus wouldn't end in hyperinflation after the GFC?


The idea that American towns/cities are going to either try to attract workers or plan for more services implies they have ever tried to plan for how anything currently works, which seems wildly optimistic. American city planning is mostly about actively hurting yourself via NIMBYism in hopes it'll stop anyone else from moving in. You can't even have a corner store in your R1 neighborhood.

There will be even more pressure for larger housing because of this, though, since extra bedrooms can be used as home offices. And if you work from home, you've now found a way to use your home much more productively - speaking of NIMBYism, residential zoning was literally invented to stop immigrants from becoming richer by being able to do this. But home programmers are a little harder to ban than home Chinese laundry services.


Bigger productivity drivers to me would be:

---Medicare for All: do away with thousands of government subsidy programs (i.e., Medicaid, Obamacare, etc., maybe even the medical portion of Workers' Comp) and eliminate a workforce of paper pushers in the private insurance industry. Plus it would make it all cheaper.

---Modernize the tax code, simplify, eliminate loopholes for the rich.

---Emphasize trains for freight instead of trucking, which is tearing up our roads without a commensurate gas tax to compensate.

---Get corporate America out of Washington and take the current crop of Congress with them (yeah I know, good luck, lol).


- Medicare for all: who believes grouping all HHS bureaucrats into one new department would make them more productive than private insurance company workers in a competitive marketplace?

- tax code simplification - better to get rid of all loopholes as well as social engineering with tax credits, but start with the mortgage interest deduction, the SALT deduction, and the employer-provided healthcare deduction

- trucks don’t pay gas taxes, but do pay diesel fuel tax, tire excise tax and weight taxes. I do see that at least in the west, weigh stations have become more automated which improves productivity. I don’t think logistics companies try to avoid freight trains over trucks other than for cost or schedule reasons

- agree on getting the current congressional crop out of Washington, as well as rent-seeking by corporations, but letting Congress make regulations and industrial policy without input from corporations would create a huge decrease in productivity


As a software developer, designer, and architect, I've lived online longer than most. Online productivity increases come from several directions. The most basic is the elimination of commute time. My gut feeling is the pandemic moved us 60% of the way to its potential benefit.

Another area is instant information access. This is much harder to assess because it depends on individual habits. Folks complain about the distraction of the network, but a disciplined person looking stuff up instead of relying on vague and faulty memory is much more productive.

A third area is refactoring work. Many delegation decisions used to be based on who was in physical proximity. A shift to who is most prepared to execute increases productivity.

Yet another vector is the more rapid dissemination of expertise.

Other areas exist in which the reduced need for physical proximity makes work more efficient, perhaps as yet unnoticed.

These are complex and slow-moving phenomena that will take time to become significant.


I agree with all of those. I also worked as a software engineer from home for over twenty years, but I retired in 2019, so I didn't see the pandemic changes. Another productivity boost I got came from escaping the open-plan office: before I worked from home, many workplaces had switched from offices or cubicles to open-plan workspaces. This was supposed to enhance collaboration (and reduce real estate costs), but because of distracting conversations and phone calls most developers wore headphones, so collaboration actually went down. Many thought it was more uncomfortable to approach someone coding with headphones on than it had been to walk into their office or cube. Working from home I got more done, and once Slack and MS Teams came in I got better collaboration than being in the office, although I did miss out on lunch and break-room conversations. I think a virtual break room would help with this in WFH situations. Also, we worked on TV software, so you couldn't assume that someone who had Netflix open on their monitor was slacking off.

Jan 15, 2023·edited Jan 15, 2023

Two elements of note:

1. The gig economy is anathema to governments that exist to take your money. Independents (not talking about employees at home) are likely to have many side gigs and to be paid in a variety of ways. In almost every blue jurisdiction, there is an insistent push to make these otherwise newly productive folks just like employees, with union bosses to waste money, dead time that cannot be used for anything, work-to-rule, and all the rest. This will give a substantial advantage to the red states in moving down this curve. People abandon California, Illinois, and New York for just these reasons.

2. I have spent my life in healthcare IT -- in fact, I wrote perhaps the first paper on this subject as the lead article for Science (at their request) in 1980. (Yes, it is that old, sadly.) There are a number of health-specific issues that contribute to the woeful lack of progress in this space. At some point I will write extensively about this -- it deserves its own long-form piece. But most fundamentally, financial systems took 50 years to really fall into place -- and those systems have exactly ONE data type (the dollar or some equivalent), and everyone agrees on what it is. In health care there are thousands of data types, and it is unusual for any two people to agree on what they are. (Unless they just force compliance/lying, as was done during the covid disaster.)

The IT solutions in healthcare are virtually all still billing-system centric (even if they say otherwise). Most individuals have many disparate providers of care, each with their own mental view of what an individual constitutes. NONE of the systems in popular use are able to reconcile these -- the solution is to just sort and dump the data for doctors, nurses, and other health care professionals to figure out, over and over, for each patient every time. Most of the IT has just increased the stack of things through which one is supposed to sort, without impacting the effectiveness of the process at all, because the underlying systems do not UNDERSTAND the data and therefore cannot be largely assistive.

Health care automation has gone virtually nowhere in the past 50 years. There are still fundamental issues that are woefully ignored (they are hard -- at the edge of what Description Logics can handle). So as a betting man, I would not be looking to that sector for at least another 25 years for any automation to make a meaningful difference in efficacy, quality, or efficiency.

Jan 15, 2023·edited Jan 15, 2023

Does anyone abandon California because of this? I think it's mostly housing costs.

Certainly I've never had to wait for an Uber anywhere in the state - I once waited hours for one in Ohio, but that's not exactly strong evidence.

Note that Californians voted for Uber's own regulation prop, which the company wrote for itself.


I think modern IT can easily improve health care delivery if utilized intelligently. My father, a Stanford-educated pediatrician on the SF Peninsula, read medical journals for hours every week (almost all while "working from home" when on call; he and his partner shared call 365/24/7 for over a decade). Today an MD can do far more focused reading of the literature on an iPad than with the medical journals that filled his office and parts of our home. A modern information worker (I'm including MDs in that broad category) has incredible tools to obtain and manipulate information and data that were not possible before.

Jan 15, 2023·edited Jan 15, 2023

John,

That is in part true. But as someone (a hematologist) who has been reading journals for years (and who got a lifetime subscription to the New England Journal of Medicine at the still-exorbitant medical student rate; they write me once a year to ask if I have died yet...), I can say this is a two-edged sword. The volume of things to read has burgeoned. Being focused, as you suggest, may increase the chance that you do not miss one of those articles in an area of special interest to you. But what you miss on the Internet is the "almost accidental" exposure to an article you did not know you needed to read until you saw it and it captured your interest.

For instance, I read extensively in hematology, microbiology, and immunology, but there are important things in other areas that have been of great help to me where I might not originally have thought of looking. So this tool gives you enormous new capabilities, but it also means you likely lose some that cannot be replaced.

But my point is not about general information retrieval which, as you note, has been substantially enhanced over the past 25 years. My point (and I know it is subtle -- I just did not want to waste too many people's time) is that taking the words and numbers in a patient record and making them machine-readable has meant very, very little for making practice better, for a long list of reasons. The default, used virtually everywhere, is to build a giant data dumpster for each patient, throw all the data that you might find into it (still usually missing some, often quite important), and then let each health care professional rummage around in it to see if they find something important. Often important things are missed. Building the data dumpster faster is not helpful. Because every source of data has its own underlying model (there is no "dollar" to standardize things) and because each patient is unique (this is, after all, the automation of the biologic), the tools are astonishingly ineffective. So we are still manually drawing the conclusions on each patient based on whatever data we happen to see or find. The real problem is that this is NOT really an information technology issue. This is a Medical Informatics issue, and that is a very different sphere.
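As a toy illustration of the "no dollar" problem -- the field names, codes, and values below are invented, not drawn from any real EHR schema -- the same patient can look like this in two providers' systems, and nothing in the records themselves tells software how to reconcile them:

```python
# Hypothetical records for the same patient from two different providers.
clinic_record = {
    "mrn": "A-00912",                     # clinic-local medical record number
    "bp": "142/88",                       # blood pressure as one string
    "dx": ["HTN"],                        # abbreviated, free-text diagnosis
    "hgb": 13.2,                          # hemoglobin in g/dL
}

hospital_record = {
    "patient_id": 55873,                  # unrelated hospital identifier
    "systolic_mmHg": 142,
    "diastolic_mmHg": 88,
    "diagnoses": [{"code": "I10", "system": "ICD-10"}],   # coded diagnosis
    "hemoglobin_g_per_L": 132.0,          # same measurement, different units
}

# Matching the identifiers, mapping "HTN" to ICD-10 I10, and converting
# g/dL to g/L all require domain knowledge the records do not carry --
# which is why dumping both into one pile does not make them usable.
```

Multiply that by thousands of data types and dozens of sources per patient and you have the reconciliation problem described above.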

As I said, there is a very long way to go to get any kind of improvement in this sector. And, not surprisingly, thousands of articles show virtually no benefits from electronic systems in health care (when you cut through the tiny, limited academic cases and/or vendor-skewed reports). Every practitioner I know wishes for better -- we are just a long way away, still.


“Remote work could allow companies to distribute their workforces to low-cost locations.” Great, meet the new outsourcing boss, same as the old outsourcing boss. It’s marvellous that, having exported manufacturing until recently, the capitalist classes now want to export whatever services they can. Good for productivity, bad for wages in rich countries. Same old story.


It’s not exporting services - that would be something like how many people in the world use Microsoft cloud apps or Google ads - it is importing labor. And importing labor raises wages above what those workers would get from local employers, and because of competition those local employers will be incentivized to raise their wages too.


Today (1/15/23) the WSJ has a fine article updating 3D printing adoption and advances. It sounds like a lot has happened, much of it with implications for on-shoring: speeding prototype production of complex parts with a sand-deposition technique, provisioning out-of-stock or rare parts, and more. One can almost imagine a 3D printer on an aircraft carrier repairing fighter plane parts and all kinds of plumbing without going into port, or at least greatly speeding turnaround and battle readiness.

The '20s may bring more than internet-based productivity gains.


3D printing is very limited.


Did you read the article or do you find it necessary to school people while you offer little?


The WSJ article? No. 3D printing has been the Next Big Thing for two or three decades now, and every so often there’s a surge in belief again. Then it dies down again.


Should she fax it to you? Working from home over the internet was the Next Big Thing for decades, but now 30% of people in the US do it.


On remote work -- one of the challenges of remote work is to balance individual interests with those of the larger organization. In a world where work often involves collaboration across organizational boundaries, a certain amount of physical face-to-face contact is needed to build trust. Remote work can save companies money in terms of office space and computers (BYO) and save employees money in terms of commuting and where they live. I would add that in the early 2000s I worked in Southern California while my boss worked in Texas. During that time he would usually find a reason for me to fly out to Texas at least one week a month. So travel costs need to be added into the mix.


I remember when the original post came out; it was a time of great optimism. Three years later, the pandemic, inflation, and the wars seem to have sapped most of that optimism... but I still think that you were right, Noah, and we might just have a Roaring '20s after all.


I'm not sure if there really is a net productivity boost from WFH, but I think we will know more over time. Many CEOs seem to think there isn't one: https://www.google.com/search?q=back+to+the+office&source=lnms&tbm=nws

founding

Most companies that can will go to a hybrid model, which won’t allow too much moving to the Philippines.


Hit the ball out of the park with this one.


If there weren't a catastrophe brewing for industrial civilisation and human survival, this'd be great. Maybe the shift needs to be to support a different direction.

https://www.scientificamerican.com/article/the-delusion-of-infinite-economic-growth/#

https://www.youtube.com/watch?v=YnEXEIp5vB8&ab_channel=CanadianAssociationfortheClubofRome


Noah has a lot of posts explaining why these sorts of arguments are wrong; I recommend searching for "degrowth" on his Substack.


Scientific American is increasingly a joke publication.


As is the Club of Rome


They did a pretty good job of predicting things up until now. I can only assume you have a very dark sense of humor.


We see the Internet everywhere but in the productivity statistics.
