122 Comments
Apr 22·edited Apr 22

It's not just in teaching that universities suffer extreme allocation inefficiencies. It also shows up in the difficulties they have in organizing even very basic cross-field or cross-institutional research collaboration. A couple of examples from recent years that stick in my mind:

1. Public health researchers recommend extremely destructive policies, then turn around and say "you can't expect us to think about the economic impacts of our suggestions/demands, we aren't economists". Alternatively, profs like Ferguson write code full of bugs so it doesn't model what they think it's modeling, and then say "you can't expect us to write working code, we aren't computer scientists". Imperial College London even said to the British press "epidemiology is not a subfield of computer science" in defense of Ferguson's crap work.

If you say, but you work down the hall from economists/computer scientists, so why didn't you talk to them ... well academics just look at you blankly. The thought that they might go build a team with the right mix of specialisms to answer a specific question just doesn't seem to occur to them, even though they easily could. This is a big driver of the replication crisis too, because universities are full of "scientists" who struggle so badly with statistics and coding that they produce incorrect papers all the time, and it becomes normalized. They can't recruit professionals to help them because they don't pay enough and they don't pay enough because they allocate their resources to maximizing quantity (measurable) over quality (not measurable).

2. In computer science it's not all roses either. Billions of dollars flow into universities via grants, yet when it comes to fields like AI they constantly plead poverty and that it's too hard to compete with corporate labs. Why don't they pool resources, like the physics guys do in order to build particle accelerators? Again, cultural issues. There's a limited number of names that go on a paper, a limited amount of reputation to go around, and pooling resources means most researchers will toil on unglamorous work. So they prefer to split up their resources over gazillions of tiny teams that mostly produce useless or highly derivative "me too" output.

The result is that universities *appear* to be highly productive and important, if you measure by their preferred metrics like paper production. If you measure them by actual real world impact of implemented research outcomes, they are well behind and falling further.

China is no better by the way. They produce more papers than America does now but nobody reads them, not even in China. Way too much outright fakery.


Former Googler here. Competition is definitely part of the picture, but scale- and corporate age-driven culture change is also part of it and would have happened anyway.

Google in the 2000s and 2010s had a very, very university-like culture. That was partly because the founders came out of grad school and hired a bunch of academics, partly because the smartest and most productive people of that time appreciated the non-material returns of a collegiate atmosphere, so pairing that with a fat paycheck made for an exceptionally good recruiting pitch. And as we know, collegiate culture has upsides and downsides. Allowing for open-ended exploration that can lead to amazing innovation is part of the upside; tolerating full-of-BS entitled activists is part of the downside.

In the late 2010s senior execs began systematically eroding the good parts. 20% projects got less respected and less common, internal informational openness got restricted in response to leaks, and the performance review/promotion cycle led people more and more to do box-checking things instead of innovative things. Pressure from competitors (and poorly executed fear responses to competitors) was indeed a major motivator for these changes, but I think some change like this was probably inevitable as the scale of the company went from ~10K to ~100K. It's just too hard to weed out freeloaders, rest-and-vesters, leakers, and other abusers reliably enough at that larger scale to maintain the extraordinarily high level of trust that made the "good parts of college" work so well before. I'm amazed they did it at 10K; Google ~2010ish was the most astonishingly functional and humane institution I've ever been a part of and one of the best I've ever known to exist in history.

Anyway, the point is: once the good parts went by the wayside, there was no longer any upside to keeping the bad parts. The activist clowns were no longer a price that had to be paid for letting people think and explore freely, because people weren't allowed to do those things anymore anyway. So the ROI balance tipped in favor of a crackdown.


The (rightfully) fired Googlers are a product of universities allowing and encouraging performative hissy fits and tantrums over whatever the woke du jour thing happens to be (#MeToo, BLM, anti-Israel, etc.). The power to cancel other people (e.g., that James Damore guy) and implement their woke thought into products (e.g., the AI fiasco with the Black Nazi soldiers) likely emboldened the (now former) employees to actually believe they were entitled to cancel a $1.2B contract.

Apr 22·edited Apr 22

"But a refusal to police streets and lock up criminals" -- you don't even necessarily need to "lock up" criminals. But there needs to be _some_ consequence that makes it so "crime doesn't pay". If you shoplift, you need to be nabbed, have the stuff you stole taken away, and then maybe you get an ankle monitor. Maybe you get locked up briefly, because they need to take you down to the station and book you or whatever. But long-term incarceration isn't necessarily required. The punishment of crime needs to be swift and certain, so people understand that if they do crime, they're gonna get caught and they won't like the consequences. It doesn't necessarily need to be severe, though.


Why more pedestrian deaths at night in the United States?

My first hunch was inattention, distraction by cell phones, but that should be the same, day or night.

My second hunch is intoxication— alcohol, cannabis, and or other drugs, which is almost certainly more salient at night. It should be pretty easy to see what percentage of fatal accidents involve intoxication, and how this differs from daytime pedestrian deaths.
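That comparison is straightforward to sketch. A minimal illustration, assuming a hypothetical list of crash records (the field names, the 8 p.m.–6 a.m. "night" cutoff, and the numbers are all invented for this example):

```python
# Hypothetical sketch: given fatal-crash records tagged with hour of day and
# whether any party involved was intoxicated, compare the intoxication share
# of daytime vs. nighttime pedestrian deaths. Data is made up for illustration.
crashes = [
    {"hour": 14, "intoxicated": False},
    {"hour": 15, "intoxicated": True},
    {"hour": 22, "intoxicated": True},
    {"hour": 23, "intoxicated": True},
    {"hour": 2,  "intoxicated": False},
]

def is_night(hour: int) -> bool:
    # Treat 8 p.m.-6 a.m. as "night" (an arbitrary cutoff for this sketch).
    return hour >= 20 or hour < 6

def intoxication_share(records, night: bool) -> float:
    # Fraction of crashes in the day/night subset that involved intoxication.
    subset = [r for r in records if is_night(r["hour"]) == night]
    if not subset:
        return 0.0
    return sum(r["intoxicated"] for r in subset) / len(subset)

day_share = intoxication_share(crashes, night=False)
night_share = intoxication_share(crashes, night=True)
```

In practice one would run something like this over NHTSA's FARS records, which code both time of day and alcohol involvement, so the day/night difference the commenter asks about is checkable.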


Econ 102 sounds like it would be great, IFF it had a transcript. The podcast format is very low information density. (I don't dispute its utility as a way of getting information while doing other things; that's fine for passive reception. But if one potentially wants to engage, as we do with the Substack, a podcast is not useful.)


Apropos of the article on “woke” policies impacting companies’ bottom lines, Elon Musk seems to have turned off 60% of customers who identify as Democrats, reducing Tesla sales.

This is the reason corporations adopted these policies in the first place: to avoid losing customers.

https://www.wsj.com/business/autos/elon-musk-turned-democrats-off-tesla-when-he-needed-them-most-176023af


> "Not only does every department head and dean in every university lobby the administrators for more resources, but many profs and lecturers in the increasingly unpopular humanities have tenure and can’t be laid off. Faced with that constraint, universities naturally have trouble finding the money to hire new faculty to teach the STEM subjects that students increasingly want to learn."

I don't believe this is a reasonable argument, although I have no specific data to offer. The idea is that the deadweight of tenured faculty in humanities, where there is little student demand for courses, is choking growth in STEM department faculty. However, over the past twenty years the number of faculty in humanities has plummeted, both in total terms and in terms of the percent who are tenured--non-tenure-track faculty teach higher course loads and can be dismissed at the end of their short-term contracts and thus have minimal drag on university flexibility. Humanities courses are not empty: distribution requirements continue to steer students into lower-level courses in large numbers.

It seems to me more likely that limits on the growth of STEM departments reflect a general view that universities should respond to more than "customer demand." There is, indeed, very high demand for STEM training, but if a university were to reduce its course offerings to STEM training it would no longer meet the common definition of a university (which involves a multiplicity of schools and disciplines), particularly in the case of research universities like UCSD. So we see the continued existence of distribution requirements, which accord with a model of education that the "customers" may not desire, but which the "corporation" regards as its mission to deliver (the corporation in this case being individual universities in the context of the peer institutions with which they are ranked and the accreditation agencies that legitimize the enterprise according to accepted standards). Students are like customers in some ways, but in other ways they are not, which is why there is the high administrative overhead of academic and non-academic advisors and counselors (which also limits the budgets of STEM departments).

If supply responded perfectly to STEM demand, other problems could emerge (and probably already have). The hype around STEM is so high that the number of graduates would exceed market demand, lowering STEM-related job availability and salaries and diminishing the value of degrees and the demand for training. (Which may happen because of AI alone, although we'll have to see.) Significant percentages of students who demand lower-level STEM courses will discover that their abilities don't allow them to compete successfully, and universities will need to continue to offer alternative paths those students can turn to in pursuit of degrees and careers that better fit their abilities.


As far as the X/Twitter phenomenon goes, I suspect it will resolve itself as society matures with the technology.

People were glued to their radios in the 1930s, listening to President Roosevelt, supposedly (data?). And Hitler rose to power using radio and film to project himself as super-human, the way movie stars seemed to the population.

My grandfather used to say I couldn’t wrap my brain around how powerful movies were when he was a boy in the 20s and 30s. The old movie palaces suggest there was something to that. (Again, data? Complicated by the fact that movies have grown into a very expensive art form, that can generate billions in revenue now).

And while people still look up to movie and TV stars — there are still gossip circles in our society about them — we are less surprised, I think, when we discover these stars are also human.

So powerful was Hitler’s image that the way I was taught about him in the 80s, 40 years after his death, was still in mythological terms.

After seeing more of the world with my own two eyes and reading an article which referred to the history of amphetamines, it became as plain as day to me that Hitler was a meth addict and the Blitzkrieg was executed by dudes on meth.

I told this to my dad, and he immediately dismissed it. He couldn’t shake the story of his childhood. But over the past 15 years more and more people are starting to see some truth in this perspective. More articles and books and even documentaries are being made about it.

We are finally peeling back the veneer that was created in everyone’s minds by the media 90 years ago. And this is a stage of maturity around a medium. We are doing this with the music industry now and children’s TV (too much sex and too many power trips).

This is a part of at least American culture. We don’t hold our past traditions so sacrosanct that they don’t get questioned and debunked. This is baked into the founding of the country — with rules for Constitutional amendments and separation between church and state and early recognition that the press needed protected freedom.

This will happen to social media, too. It is happening already. Part of it comes with the sophistication not to believe everything you read and hear or see online. Those that do — that get turned into viral clips as they try to prattle on about their QAnon beliefs — look insane and very low social status. To be better than them, people are building mental checks & balances, so they don't suffer ridicule.

The term “netiquette” is known in the Bay Area. I’m not sure how widespread it is. I don’t think it is yet being wielded effectively in the culture, but its day in the sun is inevitable. All the “woke,” “cancel-culture,” victim aggrandizing is a reaction to unmitigated disrespect and vulgarity that was unleashed with anonymity.

People already block and mute one another on social media. They share tips on Threads for how to deal with porn. X keeps advertising to me that I can pay them to stop seeing advertisements. We get that managing our information intake is a bonus for our mental health.

And eventually this will get refined into default behavior. Most people don’t talk during movies. Most don’t blast their music at 4 in the morning. “Serious” news is reported without profanity.

Yes, there are laws about this, and there are tedious reminders after movie trailers about proper theater etiquette (as well as the f***ing post-pandemic ad at AMC theaters that still plays, with Nicole Kidman encouraging us to risk Covid and get out of the house to watch movies.) But those exist in a social context, with support.

Lawmakers don’t just fart legislation after lunch. It’s argued about and crafted and bribed and cajoled into existence, with lots of screaming on Xitter — some of which I am a chorus to (mandate landlords install 220V outlets for EV charging in renters parking spots in CA!) It’s part of how we see the world should become after experiencing it.

Just a couple weeks ago a law was passed in Florida restricting children’s access to social media the way we do liquor, pornography, cars and, in some places, even guns. (Gotta get the important stuff under control first, right?) We’re figuring out how to regulate it. And we’re starting (?) to tell one another online, “This is inappropriate. Communicate without using derogatory language. Back that up with a reliable source. Is that image photoshopped? This clip is too short. It needs context” etc.

“Noah Smith, more examples! More data!”

Even Trump’s trials seem to be related to this maturation process. He built himself up through manipulating the media, calling newspapers and radio talk shows in the 80s and 90s to plug himself — sometimes anonymously. He got on TV and got on Twitter to build on that and develop his following.

And part of the legal pressure he’s being subjected to are tweets as evidence of what he said and did, particularly as President. We are reckoning with the power this medium gave him (or at least pretending like we’re reckoning with it. I mean WTF is wrong with the justice system that it didn’t figure out it wouldn’t ever get around to trying him if he became President again and it needed to get this sorted out in less than four years? This is just the tip of the iceberg, too. 6th amendment folks: Google it. 😤)

At some point there may even be widely understood netiquette about responding to blogs with long screeds, and I may become embarrassed by this…

Apr 22·edited Apr 22

Wokeness will naturally come to an end as the world becomes more realist. The US/China competition is in some ways very healthy for us; sadly, it's also the highest-risk competition there is. But you can see what happens to the psyche of the people within a superpower nation-state when they live in a unipolar world: we start attacking ourselves.

The human mind wasn't made to sit by itself in a room alone. Social isolation makes us go bonkers. In a way The West sat by itself alone in a room for the past 40 years.


I really want these as separate posts! The "No, U.S. pedestrian deaths aren’t due to bigger vehicles" is SO GOOD - and I want it to have its own headline / post.


To the point on crime.

1. NYC is still much more dangerous than an average European city.

2. This research proves what people have probably known intuitively forever.

Instead of making the usual "thanks for telling us something the average person already knows" argument, I'll ask a question. What other interesting examples can you think of where economists have proven through research a point that the average person on the street could be assumed to know intuitively already (or, even better, that the populace was proving through its actions long before economists "proved" it)?


No stockholder wants the company they own shares of to pollute rivers, kill humans with its products, falsify accounting numbers, or commit fraud. In the case of Boeing (in which I own stock), I would prefer that it put bolts on doors so they don't fly off the plane. So let's just say none of the above has anything to do with "being a good citizen."

I don’t care if my stocks are in companies that recycle. I don’t care because there are federal, state and local laws that govern environmentalism. I don’t want my corporation to be concerned with social problems. That is the job of the government, specifically.

The job of companies with shareholders is to make them wealthier, to increase the value of the stock they hold. They do that by being well-run companies that sell goods and services other companies and people want. It really is not more complicated. They have no social duty to cure AIDS or end homelessness. Do I care that they give money to the local homeless shelter or philharmonic? Not really, but it should serve a purpose. A big factory in a factory town needs to be a good neighbor. That is just smart: the last thing you want is bad standing with local political leaders. Charitable giving is fine but certainly not necessary. Make money and preserve the investment made by stockholders. That is the mission. The only mission.


It looks like the cap at UCSD is related to changing one's major, not to incoming students.

"Beginning in Summer 2025, currently enrolled students who want to switch into a selective major will be able to apply to selective majors once per year (between Summer and Fall quarters). The selection criteria for entry to the major will consider academic achievement in the specified screening courses and will also be aligned with UC San Diego’s priorities of serving California residents, first-generation college students, and students from low-income families. Continuing students who apply to switch to a selective major must have completed the required screening courses for that major and be in good academic standing. They will then be considered for the major using a point system that awards one point each for having a 3.0 GPA or higher in the major screening courses; California residency; Pell Grant eligibility; and first-generation college status (as determined by information received at the time of initial admission to UC San Diego).

Students with the highest number of points will be admitted until all available spaces within the major have been filled."

While it looks like spots are capped, this really isn't what the tweet is talking about and is I think somewhat misleading. I think your comments still hold true in that schools should be working to increase the sizes of those departments, but it's not like UCSD is telling the kids of college graduates that they can't be computer science majors.
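The point system in the quoted policy is simple enough to sketch. A minimal illustration (the field and function names are mine, not UCSD's, and the policy doesn't specify how ties at the capacity boundary are broken, so this sketch just takes the first names in sorted order):

```python
# Sketch of the quoted point system: one point each for a 3.0+ GPA in the
# screening courses, California residency, Pell Grant eligibility, and
# first-generation status; admit the highest scorers until capacity is filled.
def points(applicant: dict) -> int:
    # Python booleans sum as 0/1, so this counts the satisfied criteria.
    return sum([
        applicant["screening_gpa"] >= 3.0,
        applicant["ca_resident"],
        applicant["pell_eligible"],
        applicant["first_gen"],
    ])

def admit(applicants: list[dict], capacity: int) -> list[str]:
    # Rank by points, highest first, and fill the available spaces.
    ranked = sorted(applicants, key=points, reverse=True)
    return [a["name"] for a in ranked[:capacity]]
```

Note that academic achievement beyond the 3.0 threshold earns no extra points under this scheme; a 4.0 California resident and a 3.0 California resident score identically.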


On Noah's first thing -- the decline of DEI/ESG activism as competition increases:

This may be a stretch, but I'm struck by the parallels in biology, where we find increasing incidence of flamboyant display among species in environments where predators are few, as with the colorations and mating displays of birds of paradise. The more competition for survival, the more focus on that one thing, and the less variation in behavior and performative display.
