
I'm in the 1×10⁻⁸ slice of the heterosexual meet-your-spouse category. Maybe alone. I met my wife on the side of the road. Fixed her car. Nighttime. Married her 9 months later, 46 years ago.

I still have the 1950s one-transistor AM radio I used to listen to the World Series.

Being an early-1970s tech nerd at the three-letter geek school in Boston, I see the smartphone as the key leap. I think you're right to highlight screen time. That's me now!

But for me (69) and my wife (73), the access to information, to learn things, to satisfy curiosity, is fantastic! We have thousands of books and still research and read from them. Turns out not everything is on the internet.


Excellent article! The one thing I'd quibble with is this line: “Wisdom and know-how were profoundly valuable personal attributes. Now they’re much less of a reason for distinction.” I think if anything, the explosion of ubiquitous information has placed a *premium* on wisdom over superficial facts.

You provide a few good examples of how experience can be superior to online activity (e.g., learning the violin), but I think it goes deeper than that. The critical skill is no longer just... knowing a thing, but rather knowing how to connect it to the 8 million other facts and narratives fighting for attention. The ability to construct meaning from data is what people increasingly crave, and I think you can see this in the rise of Vox explainers, Substackers who focus on technical areas, and subject matter experts becoming demi-celebrities on Twitter and TikTok.

A great example of this came during COVID. There was a glut of information, even in those early months, but it was all disconnected fragments, and people struggled to make sense of it. A then-unknown genomics professor from the Hutch in Seattle named Trevor Bedford rose to prominence by not just posting his own data and other research, but crafting long, methodical Twitter threads that walked lay people through the data in a clear, fun, and non-inflammatory way. He later won a MacArthur genius grant, and while he’s clearly a bright scientist, I highly doubt he would have been on the committee’s radar if not for his public science communication work.

Similar things seem to be happening right now with AI. When every person on the planet can use it to become a decent writer, the truly great writers will stand out above a sea of uniform mediocrity rather than a bell curve of quality. When anyone can ask ChatGPT for factoids about something random like microRNA or particle physics and get a B-minus answer, the people who can weave a story, and more importantly use the data effectively, will be in high demand as consultants, speakers, and employees, while other knowledge workers who can't rise above the crowd are at risk of automation.


This is all true. My grandfather was born a few months after the Wright brothers made their first flight and drew his state pension the year of the first moon landing. He lived through two world wars and the Depression. But I still think he had an easier time of it than I did, because he didn't have to learn how to use computers in the '90s. We lost so many! Computers hang, and people with them.


This brings up another point, which I read recently in a discussion about chip fabs: technology benefits from multiple feedback loops. Improvements in materials science result in finer precision chip fabs, resulting in cheaper computers, resulting in more ubiquitous computing, resulting in better chip verification techniques and better engineering education, resulting in progress in materials science. With feedback loops and spinoffs from all of these stages and others.

A simple thought experiment shows how important this is: imagine going back to the founding of the Republic in 1776. What parts of modern life could you actually teach your new neighbors?

For me, perhaps primitive anesthetics (ether) and the basics of germs and sanitation, all of which would be invaluable but wouldn't get you to a world remotely like 2023. I know that penicillin comes from bread mold, but I have no idea how you isolate it, and I don't know where any other antibiotics come from. I doubt I know enough about how to create a polio vaccine to be of any assistance. Birth control changed our world, but I can't tell you how to isolate estrogen or use it in a pill.

I could probably teach them how to build an electromagnet and maybe an electric motor (and a dynamo is just a motor run backwards), though it would be iron-based, since I don't remember which metals go into the really strong magnets we have today. I could leave hints for later generations about what a transistor looks like, but I don't know how to actually create the materials required. I could probably draw the basic outline of a refrigerator, but I don't know enough mechanical engineering to describe how to build a compressor, much less a reliable compressor that doesn't leak freon. And of course, I have no idea how to make freon, or more importantly if we're starting in 1776, an ozone-safe coolant. No way could I explain GPS, how to build a rocket that doesn't explode, or how to put a billion transistors on a quarter-inch square of silicon.

Etc., etc. Most technology builds on a base built by previous generations. Today's ubiquitous computing and better understanding of cell biology and genetics will lead to other technologies we can't really see today, but you can see glimpses of them in mRNA drugs, CAR-T treatments, and CRISPR in the medical arena alone.

And I'm skipping over all the changes from computer-mediated communications. I work for a Gang of Four (no, Facebook doesn't count) tech company, remotely from, you can probably guess, Pittsburgh, in a role that would be impossible to perform without cheap broadband and cheap cloud computing. Everyone I work with is at least three time zones away, and it pretty much doesn't matter.

I can chat with anyone in the world, although only in English, or in French of such poor quality that it would convince a French speaker who initially denied knowing any English to surrender.

I can look up how to fix any appliance by just googling the model and a short description of what's wrong. I've fixed my refrigerator's ice maker and built-in computer, extending the fridge's lifetime by a couple of years, which, AFAICT, essentially shows up in economic stats as *reducing* GDP, since there's no maintenance call and no new fridge purchase. Similarly, there are a number of simple plumbing jobs I've done by watching YouTube videos, which also show up as a hit to GDP.

TL;DR: there are lots of feedback loops in productivity in today's modern world of the future, and you often can't see where something will pay off until many years later. But just because it's hard to see doesn't mean it isn't real.


First space: family and friends

Second space: work

Third space: church, taverns, Starbucks?

Third spaces serve to ground us in lived reality and are especially needed after a bout of disorienting social media. Technology has served to weaken or destroy our third spaces. Virtual spaces will never be an adequate third space. Mental illness, deaths of despair, and weird egregores have been the result.

If you have a third space, hold onto it.


I spent most of the weekend "playing" with ChatGPT. I have given my kids (ages 14, 15, and 16) access to it as well. That may not sound weird to you, but we home educate our kids; they have no smartphones, no social media, and we filter their Internet access. So for me to green-light this tool is a sign I think it's VERY important. In fact, I've even given them an essay assignment for next week (compare Roosevelt's Four Freedoms with those listed in the Declaration; it's a fairly standard American civics assignment) and told them I want them to use ChatGPT to do it.

How does that sort of machine learning collaboration work? Here's a specific example from this weekend...

Among other things, I teach Lego robotics at our homeschool co-op, and one of my students is building a drawing robot (essentially a plotter). The native Lego programming language can't do what he needs, but it's possible to program Lego robots in Python, a different language that I am not particularly good at. Since he doesn't know it at all, I told him I would rewrite his program in Python and also write a program on the PC to generate the sorts of data files he needs to run on his plotter. This was going to eat 20-40 hours of my time at least, but it's a cool project and I get to make one of my students happy... so it's worth it. (It's a homeschool co-op; we're all volunteers anyway.)

Last night, I asked ChatGPT if it knew how to write Python. "Yes". OK. Build me a simple game in Python. It spat out code for a simple dodgeball game -- in about 10 seconds. I ran the code unchanged; as a game it was terrible, but it did work. Hmm... I wonder...

Can it write Lego robotics Python code? "Yes". I then spent 10 minutes carefully specifying (in English) exactly what I wanted this student's program to do: paper sizes, motor rotation, coordinate ranges... all of it. The first time I missed a few things, so I added a few more specs. The second time, it generated about 100 lines of code. It's not done -- that's important -- but it gives me enough to start from. My knowledge of this language is good enough to tweak ChatGPT's code, but building from scratch would take a long time. Best guess, this thing did in 30 seconds what would have taken me about 4-8 hours.
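
For a sense of scale, here's a minimal sketch of the kind of plotter routine involved. The port assignments, gearing constant, and Pybricks-style API calls are my assumptions for illustration, not ChatGPT's actual output:

```python
# Minimal sketch of a Lego plotter's pen-movement routines (Pybricks-style).
# Ports, gearing, and pen-lift angle are illustrative guesses, not real specs.
from pybricks.pupdevices import Motor
from pybricks.parameters import Port

DEG_PER_MM = 20  # motor degrees per mm of travel; depends on the gearing

x_motor = Motor(Port.A)    # moves the pen carriage left/right
y_motor = Motor(Port.B)    # feeds the paper forward/back
pen_motor = Motor(Port.C)  # raises and lowers the pen

pos = [0.0, 0.0]  # current pen position in mm

def move_to(x_mm, y_mm, speed=200):
    """Drive each axis to an absolute (x, y) position, one axis at a time."""
    x_motor.run_angle(speed, (x_mm - pos[0]) * DEG_PER_MM)
    y_motor.run_angle(speed, (y_mm - pos[1]) * DEG_PER_MM)
    pos[0], pos[1] = x_mm, y_mm

def pen(down):
    """Lower (True) or raise (False) the pen."""
    pen_motor.run_angle(100, 30 if down else -30)
```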

At this point I was pretty excited, so I asked it to create the PC program (also in Python) that my student needs to create the data files for his robot to process. Again, a few iterations of spec time later, I had a simple Python script that could convert SVG vector graphics files (which are creatable in Adobe Illustrator or AutoCAD) into the plotter files my student needs. Total time: maybe 15 minutes. Time it would have taken me (someone who has never worked with vector graphics files): at least 12 hours. This sort of data integration work is tedious and time-consuming, and no one likes it, but it pays lots of geeks' bills.
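
For flavor, here's a stripped-down sketch of what such a converter can look like. It handles only SVG `<polyline>` elements (real Illustrator exports mostly use `<path>`, which takes more parsing), and the PEN_UP/PEN_DOWN/MOVE output format is a hypothetical stand-in for whatever format the student's robot actually reads:

```python
# Sketch: convert SVG polylines into a simple text file of plotter commands.
import xml.etree.ElementTree as ET

SVG_NS = "{http://www.w3.org/2000/svg}"  # SVG elements live in this namespace

def svg_to_plot(svg_path, out_path):
    root = ET.parse(svg_path).getroot()
    with open(out_path, "w") as out:
        for poly in root.iter(SVG_NS + "polyline"):
            # The points attribute looks like "0,0 10,10 20,5"
            pts = poly.get("points", "").replace(",", " ").split()
            coords = [(float(pts[i]), float(pts[i + 1]))
                      for i in range(0, len(pts) - 1, 2)]
            if not coords:
                continue
            x0, y0 = coords[0]
            out.write("PEN_UP\n")
            out.write(f"MOVE {x0:.2f} {y0:.2f}\n")  # travel to the start point
            out.write("PEN_DOWN\n")
            for x, y in coords[1:]:
                out.write(f"MOVE {x:.2f} {y:.2f}\n")  # draw each segment
        out.write("PEN_UP\n")

svg_to_plot("drawing.svg", "drawing.plot")
```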

I would estimate I have about 3-4 hours of debugging left to make this whole project work. From 20-40 hours of work down to 3-4 hours: last night, ChatGPT made me roughly 700-1,000% more productive as a programmer. That is what machine learning collaboration looks like. It's the cognitive equivalent of the steam engine, and I don't want my kids to be the John Henrys of the world.

I suspect novelists and poets are pretty safe for a long time. We had it write a few poems and stories... my cellist daughter put it best: "There are robots that can play cello perfectly, but there's no emotion -- it's dead playing. ChatGPT writes poems like that." Logical creatives -- lawyers, data analysts, programmers -- will not fare as well. Most of those data integration geeks need to find a new line of work.

Tools like ChatGPT are not going to REPLACE these folks, but they are going to make the ones who embrace the tools far more productive, which means there will need to be far fewer of them. And we should all celebrate that, not just because Shakespeare was right about lawyers, but because these folks are usually educated and pretty smart, and freeing up their cognitive abilities to do other things can only be a benefit to us overall.

How to distribute the economic bounty that such a technology will create is going to be the challenge going forward. And it's a big one. I suspect another American economic redesign is in the offing soon (yes, Noah, I'm reading Concrete Economics -- I like it.)


Agree, but Noah missed one important thing: the drop in the price of computing. In 1974, the computer I used, a Honeywell 200, could do perhaps 10 kFLOPS and cost $20K/month to rent at the time (about $13 per FLOPS per month in 2023 dollars). We won't even talk about how much space, air conditioning, and electricity it required, how many trained operators it needed (not included in the rental price), or that I had to travel to use it.

Today, I can rent 8 TFLOPS from Azure for less than $400/month. That's 260 *billion* times cheaper.
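
A quick sanity check of that ratio, using only the figures above (the $130K/month number is my back-conversion of the $20K/month 1974 rent into 2023 dollars, consistent with the $13/FLOPS figure):

```python
# Back-of-the-envelope check of the cost-per-FLOPS comparison above.
honeywell_flops = 10e3            # ~10 kFLOPS (Honeywell 200, 1974)
honeywell_rent_2023usd = 130_000  # ~$20K/month in 1974 (assumed 2023 conversion)

azure_flops = 8e12                # ~8 TFLOPS rented from Azure
azure_rent_2023usd = 400          # ~$400/month

cost_then = honeywell_rent_2023usd / honeywell_flops  # $/FLOPS-month in 1974
cost_now = azure_rent_2023usd / azure_flops           # $/FLOPS-month today

print(f"1974:  ${cost_then:.2f} per FLOPS-month")     # ~$13
print(f"Today: ${cost_now:.2e} per FLOPS-month")      # ~5e-11
print(f"Ratio: {cost_then / cost_now:.1e}")           # ~2.6e+11, i.e. 260 billion
```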

There's probably nothing in engineering or biomedicine that doesn't depend on complex simulations that are unimaginably cheaper now than 50 years ago. I'd be shocked if engineering the COVID vaccines, for example, didn't require simulating the behavior of various drug candidates on the spike protein. Gene sequencing itself requires a lot of computation, matching up partial sequences to assemble a genome.


This is all pretty much one thing: the smartphone.


I imagine myself reading a newspaper back in the early 2000s. As I focus in on an article, the text suddenly shifts up and down as an ad is placed in the middle of the reading. I refocus and begin reading again. Oh! A pop-up ad with a timer blots out the article. Finally an 'X' appears, and I close the ad and try to regain my footing in the text. Oh! It's another pop-up box asking if I want to be on the daily newsletter list. Swiping that away, I continue, now having completely lost the gist of the reading. Oh! It's another pop-up asking if I want to read in the app or continue as is. Oh! It's another box reminding me it's time to renew the subscription....

How can this be a good thing for humanity?


Ahhhhh, Alvin Toffler. Natsukashii (that takes me back). The Tofflers were my white adoptive LA parents. (I have a pair of white adoptive parents in SF too.) When I moved from California to Japan in '01, the only thing I missed was my frequent discussions with the Tofflers around their kitchen table. RIP Karen, Mr. and Mrs. Toffler.


Get this: I was born in 1973 in an advanced country. I lived 2 hours from Toronto, and my community did not get landline phones until 1991. We had 1 radio payphone in town. The few of us who had TVs received 2 channels, only 1 with good reception.

We didn't know what DAY family members would arrive.

I completely agree with Noah: year to year, little seems to change. But the change over the years is staggering. My grandparents experienced more change in their first 50 years (1901-1951), but my era has easily been the second- or third-most change-filled half-century.


There was no treatment for HIV when you were young. Very little for cancer. Don’t forget Biotech and Pharma.


Technology changes, and so does your brain. I remember when "stacks" meant diving into remote areas of university libraries, searching for the books relevant to your research. You felt like a mouse tunneling around in a silo of grain. But it gave me something that is lost scrolling articles, ebooks, etc.: cognitive patience.

A neuroscientist at UCSF Medical Center was frustrated because she could no longer read books; her work required that she sit in front of a computer reading and researching. She decided to become her own lab rat. Every morning before leaving for work and every night before going to bed, she read books for about one hour. (Her go-to book was The Complete Essays of Michel de Montaigne, inventor of the personal "assay," or essay.) Her ability to read books returned. One term she uses for e-media damage to the brain is loss of "cognitive patience."

Some might deny this effect, but I've seen it ruin retail stock investors' portfolios and reduce coworkers' ability to think one, two, three, or more iterations ahead when performing a post-mortem on multi-million-dollar projects or initiatives. I had one recommendation for coworkers: if it's important, print it out, read it, and keep it in the project file for future reference.

As for getting lost, it's still possible, and an awesome experience. I was briefly lost in the Ecuadorian rainforest. Because I remembered subtle changes in topography (rainforests generally are very flat) and knew how water behaved (tiny creeks lead to bigger creeks, which lead to rivers, the highways of indigenous tribes), reorienting wasn't difficult. The average person would panic (for good reason, if they lack cognitive patience) and likely end up SOL. I still spend days in a national wilderness (one mile from where I live off the grid), with no cell signal, performing silviculture field research on tree-line species. This horrifies family members, because the wilderness experience comes with wildlife amenities: cougars and bears. Caveat emptor: approximately 1,400 people have gone missing, never to be found, on U.S. public lands.

Recommended reading: A Field Guide to Getting Lost, by Rebecca Solnit.


The implicit debate between you and Tyler Cowen (or Paul Krugman) boils down to "No, we didn't" vs. "Yes, we did." But you're both missing a key point. The period Krugman looks back on saw a revolution in practical technologies--atoms, not bits. Cars, planes, kitchen appliances, and so on. Your revolution is in the creation, organization, and transmission of information. Equally important, just different.


I think it's notable that nearly all the change you mention here came post-2000. If you compare 1900-1950 to 1950-2000, I think the former period has more change along nearly every dimension. From 1950-2000 the big change was the modern suburban lifestyle moving from a small fraction of society to the majority of it, but the internet, digital, and GPS stuff really wasn't much available to anyone in 2000. Sure, we could Google, but Wikipedia had only just been founded, and even images were slow to come by (though Napster and RealPlayer had just started suggesting the potential of digital distribution of media).


I enjoyed the article. But "for example, if I actually go to the Matterhorn, I can see it from a variety of angles and in a variety of lighting" seems like a terrible example of real experiences being better than web searches.

I'm pretty confident that in one hour of Googling someone could find more distinct angles and lighting than they could by walking around for the same amount of time. And if you want to dispute that, they could probably just find a point-of-view video of someone walking around, so it's a wash. (Of course, I'm not saying seeing it in person isn't better. It is, just not for the reasons given.)
