122 Comments

It's not just in teaching that universities suffer extreme allocation inefficiencies. It also shows up in the difficulties they have in organizing even very basic cross-field or cross-institutional research collaboration. A couple of examples from recent years that stick in my mind:

1. Public health researchers recommend extremely destructive policies, then turn around and say "you can't expect us to think about the economic impacts of our suggestions/demands, we aren't economists". Alternatively, profs like Ferguson write code full of bugs so it doesn't model what they think it's modeling, and then say "you can't expect us to write working code, we aren't computer scientists". Imperial College London even said to the British press "epidemiology is not a subfield of computer science" in defense of Ferguson's crap work.

If you say, but you work down the hall from economists/computer scientists, so why didn't you talk to them ... well academics just look at you blankly. The thought that they might go build a team with the right mix of specialisms to answer a specific question just doesn't seem to occur to them, even though they easily could. This is a big driver of the replication crisis too, because universities are full of "scientists" who struggle so badly with statistics and coding that they produce incorrect papers all the time, and it becomes normalized. They can't recruit professionals to help them because they don't pay enough and they don't pay enough because they allocate their resources to maximizing quantity (measurable) over quality (not measurable).

2. In computer science it's not all roses either. Billions of dollars flow into universities via grants, yet when it comes to fields like AI they constantly plead poverty and that it's too hard to compete with corporate labs. Why don't they pool resources, like the physics guys do in order to build particle accelerators? Again, cultural issues. There's a limited number of names that go on a paper, a limited amount of reputation to go around, and pooling resources means most researchers will toil on unglamorous work. So they prefer to split up their resources over gazillions of tiny teams that mostly produce useless or highly derivative "me too" output.

The result is that universities *appear* to be highly productive and important, if you measure by their preferred metrics like paper production. If you measure them by actual real world impact of implemented research outcomes, they are well behind and falling further.

China is no better by the way. They produce more papers than America does now but nobody reads them, not even in China. Way too much outright fakery.


"2. In computer science it's not all roses either. Billions of dollars flow into universities via grants, yet when it comes to fields like AI they constantly plead poverty and that it's too hard to compete with corporate labs. Why don't they pool resources, like the physics guys do in order to build particle accelerators? Again, cultural issues."

This statement is false. The reason AI scientists don't pool together within academia to do big research is because:

1. Important AI scientists get bought immediately to run big divisions within industry.

2. Or, AI academics pool together in a company and then get bought by industry.

3. 90% of the funding AI academics receive is from Big Tech anyway.

Basically, the reason AI academic labs don't get billions of dollars of compute from academia is that the AI/industry connection is extremely deep. So deep, in fact, that there is basically no divide.

My experiences as a former AI scientist in a T1 lab are the following:

1. AI researchers often pooled themselves together, called the result 'a company', and then Google/Facebook bought that company along with its team and its research. This happened so much that it got to the point where an academic group would just wrap itself in a company, Google would buy the company, and now they do research for Google. Rinse, repeat. Every. single. time. It was almost a cliche.

2. There was not a single top-tier academic who hadn't worked extensively in industry on salaries worth millions. Some came back to academia, some stayed in industry forever. And even after they came back to academia, they'd often still hold part-time positions at Big Tech somewhere.

3. All the research you are doing is funded by Big Tech anyway. After you finish your PhD you walk into a team at DeepMind made up of the same people you were working with at Oxbridge, who just graduated a bit earlier.


I'm well aware of all that, it's my point! Academia is a socialist-style reputation economy that's inherently very bad at resource allocation, so it constantly suffers mass "defections" as the people who can escape from the poverty and squalor do so. Yet those resource allocation policies are the justification for academia's existence.

Every example you give is like that:

1. Universities do not react to market signals, so lose people due to insufficient pay.

2. Universities do not foster collaboration and frequently disincentivize it, so researchers collaborate by setting up companies/moving to the private sector.

3. Universities do not shift funding as breakthroughs are made. How much value is being created by AI vs social studies, right now? There's been a huge shift in just a few years. Now ask how much funding have universities reallocated from social studies to AI? Answer: almost none, hence why academic labs are all hardware/data starved. And now they are blocking people from even studying CS at all because of DEI/wokeness.

Universities are really, really bad at resource allocation, so, why do we give them so many resources?


ah good point. Yeah agreed. Universities are awful at resource allocation.


Interdisciplinary work is hard. Disciplines exist because the way to develop specialized knowledge involves developing a community of people who work together (both in collaboration and competition) to figure out ways to push the field forward. Interdisciplinary work means not just the ordinary problems of trying to figure something out that no one has figured out before, but the additional difficulties of doing so without established common language and research methodologies.

When interdisciplinary work is successful, it’s extremely valuable. But it is very hard for it to be successful. If it is successful enough that it continues, it often develops into a new discipline, with its own siloed community and specialized language and methodology that are hard for outsiders to work with.


It's not that hard. Companies manage it all the time and don't sweat it. The new discipline of AI is a mix of people with a maths/stats background, systems engineering, user interface design, hardware designers and more. These are traditionally different (sub)fields, but if you look at what problems are faced by AI projects and what AI companies talk about when they talk about challenges, inter-disciplinary collaboration never comes up.


Companies don't actually do long-range foundational research the way that academia does. There was a period when Bell Labs did a bit, but that was fleeting.


This is a common claim but I don't believe it anymore. A lot of it revolves around what exactly basic or foundational research actually means, as it's very easy to play No True Scotsman and shift definitions around such that anything that turns out to be useful is immediately redefined to be non-foundational.

Still, here are 10 arbitrary examples of long-term commercial "foundational" research projects that aren't Bell Labs:

1. Google's whole AI effort leading up to transformers and ChatGPT. Years of pure algorithms and hardware research.

2. SpaceX reusable rockets, Starship, Starlink inter-satellite beaming. Fundamental physics and engineering challenges, solved.

3. Meta's and Apple's investment in virtual and augmented reality. All done way ahead of market demand.

4. Epic Games' work on Nanite and Lumen. Major algorithmic breakthroughs in computer graphics.

5. Microsoft Silica. At least 7 years of work now on developing entirely new storage tech (writing data to glass plates! very sci fi)

6. Boston Dynamics' work on agile robots

7. IBM's work on homomorphic encryption

8. Intel/AMD/Microsoft did ~all the core research on confidential computing

9. ASML's development of EUV lithography

10. Pretty much anything coming out of the defense industry, e.g. the development of titanium processing required big breakthroughs in materials science, all done by the private sector (sometimes under military contracts, but the work wasn't done by academia).

Now the usual response is to say, well, none of those examples are "foundational". But I disagree. Technologies as diverse as AI, titanium and reusable space rockets are all far more foundational to human progress than, say, studies into intersectionality (a favorite of academia, almost nowhere to be found in commercial research).


I'll grant that transformers, and homomorphic encryption, are foundational in the relevant sense. But none of the rest of these things are as foundational as learning about the tree of life, or understanding dark matter, or coming up with the idea of computable functions, or understanding the ideas of intersectionality. As you say, these are pretty close to immediately commercializable ideas. Academia exists primarily to do the level of research that *isn't* immediately commercializable, that later enables commercializable (or at least, actionable - some things are done through government rather than commerce) research to finish things up.


Sigh, definition games. As expected. It's the standard response.

Can you give a precise definition of what foundational means to you? From your examples I can't derive it. The transformers paper is all about how to improve Google Translate, and within a couple years of it being released it was being commercialized for profit. Why do you say that is foundational but Meta VR isn't?

Are you sure your feeling of "foundationalness" isn't driven by superficial aesthetics, like whether the work smells of chalkboard rather than engine grease? Whether it's in the name of entertainment or medicine? Because I don't think these are important distinctions.

> Academia exists primarily to do the level of research that isn't immediately commercializable, that later enables commercializable

So what's the game plan, then? It's been decades, any interest in commercializing intersectionality yet? No? I think it's because this field has no ideas that are useful for anything. Likewise for most of what academia outputs. Where's the commercial demand for Marxist sociologists and why does academia produce so many? These fields aren't lacking in commercial interest after decades of work because universities are so incredibly far-sighted; they're just genuinely worthless.


Good points. Agree on China (and it applies to India). The mass of low- to mid-tier engineers is not concerning to me. We should certainly create more of our own mid-tier engineers at state universities to be manufacturing foremen and work in fabrication (in addition to doing program/project analysis, replacing low- to mid-tier foreign workers brought in on H-1Bs), but this would require some investment in apprenticeships and training by the tech companies.


> Public health researchers recommend extremely destructive policies, then turn around and say "you can't expect us to think about the economic impacts of our suggestions/demands, we aren't economists"

We expect them not to recommend policies at all. Give others the epidemiological expertise they have and let THEM make the policies.


In my experience as an academic scientist, the major blocker for interdisciplinary research is funding agencies. Program managers are very defensive of their turf and set up bureaucratic hurdles that make novel collaborations not worth it.


Yes, the funding model is at the root of it all.


I had almost forgotten the Imperial College code "scandal", which was part of the general axe-grinding against public health that was all the rage during the 2020 lockdown times. I guess I had hoped that after the pandemic this nonsense would fade away. For what it's worth I *am* a computer scientist and I looked at the code critique at the time it was offered: it was mostly a nothingburger [1]. I am surprised to see someone bringing this up four years later. However I am not particularly surprised to see it in a post that rails against public health researchers.

As a general rule I (a University professor) am thrilled to see people criticize the academic research system, since there are many things to say that might result in improvements. I do think it's harder to take the criticism seriously when it clearly comes with a political agenda, and justifies itself with facts that any expert would dismiss immediately.

[1] https://news.ycombinator.com/item?id=23222683


Yeah I'm a computer scientist too, but unlike you, I'm not an academic. I work in an industrial lab. Nice link. It's full of people trying to explain to you why you're wrong! You said:

> Do you tell the government you can't get them any preliminary data because your multithreading system doesn't work

... as if that's even a sane question to ask! Yeah if you can't do your calculations correctly and know it then YES you should tell the government you can't help them. Like, you realize that in the commercial world people can go to prison for claiming their numbers are correct when they know they are wrong?

There were bugs in the PRNG, man. Memory corruptions. No tests. How can anyone take your cryptography work seriously if you blow off people pointing out PRNG bugs as having a "political agenda"? What has politics got to do with that? Should academics be expected to solve equations correctly or not? The only people with a political agenda here are you, and Ferguson, and Imperial College, and Claudine Gay, and every other academic out there who ALWAYS seem to claim their critics aren't arguing in good faith, even when they're pointing out completely objective problems like model bugs, plagiarism or fraud.


In cryptography code, which is typically part of a security system, memory corruptions really matter. They matter because security-critical code is subject to adversarial inputs, and memory issues can lead to exploitation. This is obviously *not* an issue for research code run on a private computer by scientists (which is always full of memory issues when it's written by academics in non-memory safe languages.) Comparing the two situations is ridiculous, and totally out of context. Now if the memory issues led to widespread incorrect simulation results, that would also be interesting. But the authors of the criticism were unable to substantively make that case, so they settled for a weak claim and hoped that nobody with knowledge would look more closely.

Similarly, why in the world would you care that code is running single-threaded or multi-threaded? If the single-threaded code is more stable and produces results in an acceptable timeframe, then run it single threaded. "Your apparently correct code runs too slow," is something you might write if you set out to criticize a piece of code for any reason you can find, and then you realized that your substantive criticisms weren't impressive enough. It reads as an attempt to pull the wool over non-experts' eyes by disguising an efficiency question as a correctness one.
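To make the single-threaded vs. multithreaded point concrete, here is a minimal sketch in Python (hypothetical, nothing to do with the actual Imperial code) of why thread count alone can change a simulation's numbers even when nothing is buggy: floating-point addition is not associative, so changing how partial sums are grouped changes the rounded total, and thread scheduling changes that grouping.

```python
import random

# Hypothetical illustration: the same million numbers summed in two
# different groupings give slightly different totals, because
# floating-point addition is not associative. A parallel reduction
# effectively picks a different grouping than a serial loop, and
# scheduling or data races can vary that grouping from run to run.
random.seed(0)
values = [random.uniform(-1e9, 1e9) for _ in range(1_000_000)]

serial_total = sum(values)  # one fixed left-to-right order

# Split into 8 "worker" chunks and combine their partial sums,
# the way a multithreaded reduction would.
chunks = [values[i::8] for i in range(8)]
parallel_style_total = sum(sum(chunk) for chunk in chunks)

print(serial_total == parallel_style_total)      # usually False
print(abs(serial_total - parallel_style_total))  # small but nonzero
```

Neither total is "wrong", which is why a bit-for-bit mismatch across thread counts is, on its own, a determinism question rather than a correctness one.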

As far as "good faith", people who are arguing in good faith don't write phrases like "on a personal level, I’d go further and suggest that all academic epidemiology be defunded." I'm glad they wrote it though: it's much cleaner when people just come out and admit their biases.


The Ferguson papers were highly influential in shutting down Europe and the USA, especially the hospital systems, and you think it's just private research code? How is that private? They presented the outputs to governments and demanded that sick patients be ejected from hospitals. Show us one crypto library that is 1/100,000th as impactful to people's health.

You don't give the impression you read the original criticisms. It sounds like you'd accept someone publishing a new cryptographic algorithm with a set of test vectors that nobody can reproduce, where the reference implementation leaks your private keys due to bugs, and where the authors didn't bother explaining why they think it's secure in the first place. A threading bug was the first bug talked about, but there were also typos in RNG constants, buffer overflows, dependencies on CPU features, and accumulated floating point uncertainty. Zero tests of any kind for 15,000 lines of C! Go read the article your HN comment was about again. A bunch of third parties ran the code in single-threaded mode and nobody could get it to spit out even close to the same numbers Ferguson reported. Nobody was talking about efficiency; the criticisms were all about correctness bugs and the Ferguson lab's attitude to bugs (saying they don't matter!)
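And the determinism check those third parties ran is about the simplest test there is. A rough sketch of the idea in Python (the binary name, flags and output path below are hypothetical placeholders, not the real model's command line):

```python
import hashlib
import subprocess
import sys

def run_once(seed: int, out_path: str) -> str:
    """Run one simulation with a fixed seed and return a hash of its output file."""
    # Hypothetical command line, for illustration only.
    subprocess.run(
        ["./simulation", "--seed", str(seed), "--threads", "1", "--out", out_path],
        check=True,
    )
    with open(out_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

if __name__ == "__main__":
    a = run_once(42, "run_a.csv")
    b = run_once(42, "run_b.csv")
    if a != b:
        sys.exit("Same seed, same inputs, single-threaded: outputs differ. Not reproducible.")
    print("Reproducible for this seed:", a)
```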

But ignore that for a second. The simulation results were "scientifically wrong" too, due to bad assumptions and bad data. I remember that period. In the same window when the bugs were revealed, other people were pointing out that the models were unvalidated and the predictions weren't matching reality, and academics didn't care about those criticisms either. For example, a team in Sweden did an independent reimplementation of the model and ran it on Swedish data; it predicted as many deaths in 2020 as Sweden has in total from all other causes in normal years. The Swedish government ignored them, providing a counterfactual. Now pull up total deaths in Sweden in the years around 2020 and compare to that prediction. The models were way off. Academics didn't care and just complained that they shouldn't be judged by non-"experts". So then programmers said, well, we're experts in coding and your code is full of bugs, and they got upset about that too.

> I'm glad they wrote it though: it's much cleaner when people just come out and admit their biases.

Demands for Ferguson's lab to be shut down should have been coming from academics themselves. Where are your standards? Expecting scientists to do math right is a biased political agenda wtf.


Former Googler here. Competition is definitely part of the picture, but scale- and corporate age-driven culture change is also part of it and would have happened anyway.

Google in the 2000s and 2010s had a very, very university-like culture. That was partly because the founders came out of grad school and hired a bunch of academics, partly because the smartest and most productive people of that time appreciated the non-material returns of a collegiate atmosphere, so pairing that with a fat paycheck made for an exceptionally good recruiting pitch. And as we know, collegiate culture has upsides and downsides. Allowing for open-ended exploration that can lead to amazing innovation is part of the upside; tolerating full-of-BS entitled activists is part of the downside.

In the late 2010s senior execs began systematically eroding the good parts. 20% projects got less respected and less common, internal informational openness got restricted in response to leaks, and the performance review/promotion cycle led people more and more to do box-checking things instead of innovative things. Pressure from competitors (and poorly executed fear responses to competitors) was indeed a major motivator for these changes, but I think some change like this was probably inevitable as the scale of the company went from ~10K to ~100K. It's just too hard to weed out freeloaders, rest-and-vesters, leakers, and other abusers reliably enough at that larger scale to maintain the extraordinarily high level of trust that made the "good parts of college" work so well before. I'm amazed they did it at 10K; Google ~2010ish was the most astonishingly functional and humane institution I've ever been a part of and one of the best I've ever known to exist in history.

Anyway, the point is: once the good parts went by the wayside, there was no longer any upside to keeping the bad parts. The activist clowns were no longer a price that had to be paid for letting people think and explore freely, because people weren't allowed to do those things anymore anyway. So the ROI balance tipped in favor of a crackdown.


I was there in the <10k employee days. Funny to hear how it was supposedly like a university. I loved it there because it was nothing like my university experience, beyond the appearance of "lectures" and "coursework" in the first "semester" of training.

For example, the work mattered. You got paid to be there, instead of paying to be there. The people were competent (and hardly any former academics, who are you thinking of there?). Most of my colleagues were grizzled veterans actually, not as many young people as was often claimed. You weren't forced to work on or study irrelevant things. No artificial deadlines, only a few natural ones. A notable _absence_ of clownish activists - beyond a bit of background level 2005-era feminism it wasn't a particularly political culture. That's a post 2015 problem. Yes there were a few entitled brats but they complained about trivial stuff like food or being required to wear shoes in the office.

It was a big upgrade.


I think that's a valid perspective too-- people have different university experiences that they bring to that comparison, for sure. Re: former academics, I'm thinking of folks like Amit Singhal, Udi Manber, and Peter Norvig. Veterans indeed, but folks who still brought a researcher's mentality and curiosity to their roles, and that plus the prevalence of academic paper study groups (not just in CS either, I once helped bring a local pure math conference to campus) and prominent guest speakers and people socializing around their nerdy hobbies-- all things I'd come to love about the academic environment-- were some of the things that made it feel university-like in a positive sense to me.


Yeah fair enough. Tech talk culture is something I miss from those days.


Good comment from a former insider. Perhaps the college dynamic has also gotten way less tolerant. Their way or the highway. I can't tell for certain, but it appears in the few pictures I've seen that the fired employees are all pretty young. Perhaps they felt they could actually get what they want - or at least were "entitled" to their hissy fit, because this is what they are used to from college culture. IDK.


The (rightfully) fired Googlers are a product of universities allowing and encouraging performative hissy fits and tantrums over whatever the woke du jour thing happens to be (#metoo, BLM, anti-Israel, etc.). The power to cancel other people (e.g., that James Damore guy) and implement their woke thought into products (e.g., the AI fiasco with the black Nazi soldiers) likely emboldened the (now former) employees to actually believe they were entitled to cancel a $1.2B contract.


"But a refusal to police streets and lock up criminals" -- you don't even necessarily need to "lock up" criminals. But there needs to be _some_ consequence that makes it so "crime doesn't pay". If you shoplift, you need to be nabbed, have the stuff you stole taken away, and then maybe you get an ankle monitor. Maybe you get locked up briefly, because they need to take you down to the station and book you or whatever. But long-term incarceration isn't necessarily required. The punishment of crime needs to be swift and certain, so people understand that if they do crime, they're gonna get caught and they won't like the consequences. It doesn't necessarily need to be severe, though.


Why more pedestrian deaths at night in the United States?

My first hunch was inattention, distraction by cell phones, but that should be the same, day or night.

My second hunch is intoxication (alcohol, cannabis, and/or other drugs), which is almost certainly more salient at night. It should be pretty easy to see what percentage of fatal nighttime accidents involve intoxication, and how this differs from daytime pedestrian deaths.


My hypotheses would be (1) reduced enforcement of traffic laws, which we have clear evidence for in cities like San Francisco; (2) drugs and alcohol in both drivers and pedestrians, exacerbated by non-enforcement of DUI laws; and (3) cell phones. The latter could be worse at night because the bright screen probably messes up your peripheral vision. (2) would be particularly unfortunate because there was a hot minute there where it seemed like Uber/Lyft were going to solve the “getting home from the bars” problem.


The fact that Uber/Lyft are no longer artificially cheap may have something to do with the bar problem.


The issue raised is not why there are more pedestrian deaths at night, but why there has been an increase in pedestrian deaths at night since 2009. The outlier status of the US seems entirely due to the nighttime increase since 2009 (although Noah's charts don't show night/day breakdowns for other countries, so we can't be sure of the degree). Night deaths fell steadily here for 30 years and then abruptly grew steadily for the next dozen.

I think the timing (2009) does suggest cell phone use, which the Financial Times article to which Noah links notes is more prevalent in the US than elsewhere. It's not that cell phones are not used at all times of day, but the danger they pose in combination with reduced visibility could account for why dusk is the threshold when their lethality emerges.


It could also be headlights that are too bright and mounted too high destroying the ability of drivers to see far enough to the side of the road to notice pedestrians. And oncoming bright and high headlights can destroy the ability to notice literally anything other than said headlights.

Though intoxication does sound like a promising explanation. Notably, both Japan and South Korea had larger improvements in road safety than most European countries over the 2010s, and are both much more anti-drug. Anecdotally a lot of the people I knew who were heavy users of weed in the US would drive while high, often at night to satisfy the munchies.


Yes! headlights!


I think one contributing factor to night-time pedestrian deaths is pedestrians failing to wear white or reflective garments. I am occasionally horrified by the sight of pedestrians wearing dark clothing emerging from the darkness just a few feet away from my car.


I don’t think people’s clothing choices have changed in any relevant ways over this past decade though. It’s not like white clothing used to be more popular, and deaths rose when it fell in popularity.


Maybe it was just a local thing, but where I grew up, “Wear White at Night.” was a ubiquitous PSA. I haven't seen or heard it anywhere on TV or the Internet in many years.


The country famous for businessmen in asphalt-black suits passed out drunk in the street at night had 2.2 traffic-related deaths per 100k people in 2023, one of the best rates in the world. 13 of Tokyo(-to)'s 132 traffic deaths in 2022 were of people asleep on the carriageway, so passing out drunk on the street is clearly not a safe hobby, but dark clothing can't really explain US under-performance in road safety.

If anything I saw a lot more people walking and biking at night covered in retroreflectors when I lived in California than here in Tokyo, so clothing choices only widen the road safety gap that has to be explained by the real reasons.


Interesting! High-income people in Houston use reflective clothing; low-income Houstonians do not. A friend of mine on a motorcycle almost hit a man walking a dog in the middle of the street. The only thing he saw was the reflective leash a second before impact. The pedestrian survived, but the dog did not.


In my county it seems to be a combo of intoxication, immigrants (inexperienced and sometimes drunk drivers hailing from countries with dangerous roads), and young people out and about doing stupid things (lack of enforcement plus lack of other outlets during covid made cruising and looking for fun or trouble more popular?). This is unscientific, just from reading the local paper every day.

Some of it appears to be due to bad judgement by pedestrians (a staggering homeless person wearing dark clothing in the lane). Drivers in CA need to be aware of the risk of an unexpected person standing in the lane or on the shoulder at night, just as drivers in rural areas are always looking for deer and elk on the side of the road.


Meth too, I guess.


I suspect it is an interaction effect between people getting worse at rule-following, drug/alcohol use, and police deciding to limit enforcement for low-level offenses. If you jaywalk during the day and are sober, the risks are probably very low and you can get away with it. At night, and under the influence, the odds catch up to you.

Nuanced rules are difficult to enforce, and police have largely stopped enforcing low-level offenses (or, in lawyer speak, the police are supposed to weigh the governmental interest in a specific action). All of this combines to create a situation where we see a marked increase in deaths at high-risk places/times but in low risk situations nothing much changes.


Econ 102 sounds like it would be great, IFF it had a transcript. The podcast format is very low in information density. (I do not dispute its utility as a way of getting information while doing other things, and it is fine for passive reception of information. But if one potentially wants to engage, as we do with the substack, a podcast is not useful.)


Apropos of the article on “woke” policies impacting companies’ bottom lines, Elon Musk seems to have turned off 60% of customers who identify as Democrats, reducing Tesla sales.

This is the reason corporations adopted these policies in the first place: to avoid losing customers.

https://www.wsj.com/business/autos/elon-musk-turned-democrats-off-tesla-when-he-needed-them-most-176023af


Also, most companies that do "woke" ad campaigns still try to give a pretty bland, focus-grouped, kumbaya kind of message to offend the least number of people, while Musk's tone is very abrasive.


He peed in the Howler Monkey pool, without considering that the Howler Monkeys buy Teslas. Probably most Democrats are not the Twitter Howler Monkeys, but they are still tribal and have been told by the MSM that Musk is a bad man, since the MSM are very much those monkeys.

I have a neighbor (now approaching 70) who has been extremely successful selling gold/coins. He never graduated from high school. But he's one of those can-sell-snow-to-Eskimos types of person. He knows how to read people: always smile, never insult, etc. He does not care if you are a climate crazy or a flat earther, Christian or atheist; whether you are someone who wants to prohibit all abortions or someone who wants abortion perfectly legal and free until the kid is 7. He just wants to close the deal.


> "Not only does every department head and dean in every university lobby the administrators for more resources, but many profs and lecturers in the increasingly unpopular humanities have tenure and can’t be laid off. Faced with that constraint, universities naturally have trouble finding the money to hire new faculty to teach the STEM subjects that students increasingly want to learn."

I don't believe this is a reasonable argument, although I have no specific data to offer. The idea is that the deadweight of tenured faculty in humanities, where there is little student demand for courses, is choking growth in STEM department faculty. However, over the past twenty years the number of faculty in humanities has plummeted, both in total terms and in terms of the percent who are tenured--non-tenure-track faculty teach higher course loads and can be dismissed at the end of their short-term contracts and thus have minimal drag on university flexibility. Humanities courses are not empty: distribution requirements continue to steer students into lower-level courses in large numbers.

It seems to me more likely that limits on the growth of STEM departments concern a general view that universities should reflect more than "customer demand." There is, indeed, very high demand for STEM training, but if a university were to reduce its course offerings to STEM training it would no longer meet the common definition of a university (which involves multiplicities of schools and disciplines), particularly in the case of research universities like UCSD. So we see the continued existence of distribution requirements, which accord with a model of education that the "customers" may not desire, but which the "corporation" regards as its mission to deliver (the corporation in this case being individual universities in the context of peer institutions with which they are ranked and accreditation agencies that legitimize the enterprise according to accepted standards). Students are like customers in some ways, but in other ways they are not, which is why there is the high administrative overhead of academic and non-academic advisors and counselors (which also limits the budgets for STEM departments).

If there were perfect supply in response to STEM demand other problems could emerge (and probably already have). The hype around STEM is so high that the numbers of graduates would exceed the market demand--lowering STEM-related job availability and salaries--diminishing the value of degrees and the demand for training. (Which may happen because of AI alone, although we'll have to see.) Significant percentages of students who demand lower-level STEM courses will discover that their abilities don't allow them to compete successfully, and universities will need to continue to offer alternative paths to which those students can turn to pursue degree success and careers that better fit their abilities.


Especially because in the case of someone selling an education, by definition the customers don’t know what they’re getting and generally don’t have the best judgment about what is valuable about it.


That’s interesting: pedestrian deaths are mostly urban, at non-intersections, at night, and higher on weekends. Which kind of pedestrians are those most likely to be? Also, most of the nighttime deaths were in areas with streetlights, so more streetlights may not help much, which is important to note for Tucson since it is a dark skies area with limited street lighting.


Well, we are supposed to be a dark skies area with limited street lighting. It is better than it could be, but far, far from a "dark sky" area. I live on the far east side, and do, well, did, some astrophotography. The "sky bubble" over Tucson has steadily reached upwards and eastwards towards me.

Downtown, where most of our sad pedestrians are killed, is well lit. We also have a very active street bicycling community - including Tour training. We have a larger than normal number of bike fatalities as well.

I still rest on my two factors:

a. distracted walkers on cell phones

b. older drivers: nighttime visual acuity and target detection (i.e. seeing things moving) decline as contrast sensitivity drops with age


As far as the X/Twitter phenomenon goes, I suspect it will resolve itself as society matures with the technology.

People were glued to their radios in the 1930s, listening to President Roosevelt, supposedly (data?). And Hitler rose to power using radio and film to project himself as super-human, the way movie stars seemed to the population.

My grandfather used to say I couldn’t wrap my brain around how powerful movies were when he was a boy in the 20s and 30s. The old movie palaces suggest there was something to that. (Again, data? Complicated by the fact that movies have grown into a very expensive art form, that can generate billions in revenue now).

And while people still look up to movie and TV stars — there are still gossip circles in our society about them — we are less surprised, I think, when we discover these stars are also human.

So powerful was Hitler’s image that the way I was taught about him in the 80s, 40 years after his death, was still in mythological terms.

After seeing more of the world with my own two eyes and reading an article which referred to the history of amphetamines, it became as plain as day to me that Hitler was a meth addict and the Blitzkrieg was executed by dudes on meth.

I told this to my dad, and he immediately dismissed it. He couldn’t shake the story of his childhood. But over the past 15 years more and more people are starting to see some truth in this perspective. More articles and books and even documentaries are being made about it.

We are finally peeling back the veneer that was created in everyone’s minds by the media 90 years ago. And this is a stage of maturity around a medium. We are doing this with the music industry now and children’s TV (too much sex and too many power trips).

This is a part of at least American culture. We don’t hold our past traditions so sacrosanct that they don’t get questioned and debunked. This is baked into the founding of the country — with rules for Constitutional amendments and separation between church and state and early recognition that the press needed protected freedom.

This will happen to social media, too. It is happening already. Part of it comes with the sophistication not to believe everything you read and hear or see online. Those that do — the ones who get turned into viral clips as they try to prattle on about their QAnon beliefs — look insane and very low-status. To be better than them, people are building mental checks & balances, so they don’t suffer ridicule.

The term “netiquette” is known in the Bay Area. I’m not sure how widespread it is. I don’t think it is yet being wielded effectively in the culture, but its day in the sun is inevitable. All the “woke,” “cancel-culture,” victim aggrandizing is a reaction to unmitigated disrespect and vulgarity that was unleashed with anonymity.

People already block and mute one another on social media. They share tips on Threads for how to deal with porn. X keeps advertising to me that I can pay them to stop seeing advertisements. We get that managing our information intake is a bonus for our mental health.

And eventually this will get refined into default behavior. Most people don’t talk during movies. Most don’t blast their music at 4 in the morning. “Serious” news is reported without profanity.

Yes, there are laws about this, and there are tedious reminders after movie trailers about proper theater etiquette (as well as the f***ing post-pandemic ad at AMC theaters that still plays, with Nicole Kidman encouraging us to risk Covid and get out of the house to watch movies.) But those exist in a social context, with support.

Lawmakers don’t just fart legislation after lunch. It’s argued about and crafted and bribed and cajoled into existence, with lots of screaming on Xitter — some of which I am a chorus to (mandate landlords install 220V outlets for EV charging in renters parking spots in CA!) It’s part of how we see the world should become after experiencing it.

Just a couple weeks ago a law was passed in Florida restricting children’s access to social media the way we do liquor, pornography, cars and, in some places, even guns. (Gotta get the important stuff under control first, right?) We’re figuring out how to regulate it. And we’re starting (?) to tell one another online, “This is inappropriate. Communicate without using derogatory language. Back that up with a reliable source. Is that image photoshopped? This clip is too short. It needs context” etc.

“Noah Smith, more examples! More data!”

Even Trump’s trials seem to be related to this maturation process. He built himself up through manipulating the media, calling newspapers and radio talk shows in the 80s and 90s to plug himself — sometimes anonymously. He got on TV and got on Twitter to build on that and develop his following.

And part of the legal pressure he’s being subjected to are tweets as evidence of what he said and did, particularly as President. We are reckoning with the power this medium gave him (or at least pretending like we’re reckoning with it. I mean WTF is wrong with the justice system that it didn’t figure out it wouldn’t ever get around to trying him if he became President again and it needed to get this sorted out in less than four years? This is just the tip of the iceberg, too. 6th amendment folks: Google it. 😤)

At some point there may even be widely understood netiquette about responding to blogs with long screeds, and I may become embarrassed by this…


netiquette is a term from the 90s. It had its day in the sun a long time ago! Maybe it'll come back into fashion though.


Wokeness will naturally come to an end as the world becomes more realist. The US/China competition is very healthy for us in some ways; sadly, it's the highest-risk competition there is. But you can see what happens to the psyche of the people within a superpower nation-state when they live in a unipolar world: we start attacking ourselves.

The human mind wasn't made to sit by itself in a room alone. Social isolation makes us go bonkers. In a way The West sat by itself alone in a room for the past 40 years.


I agree that people do tend to stay out of certain kinds of trouble when focused on an absorbing task such as competing with China.

It's a shame that we cannot be more content when absorbed in a less critical exploration of the diverse nooks and crannies of the universe.

John Adams, one of the architects of the American experiment, wrote:

"I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy. My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry and Porcelaine."


That’s an interesting way to frame it. I tend to frame it oppositely, which is that having a common enemy lets people put their differences aside. Then again, we still went through a lot of violent, internal strife in the 60s and 70s, during the height of the Cold War.

I recommend the book “Season of the Witch,” which is about San Francisco during those decades (after a very compelling opening about one guy’s life in the decades prior, which serves to show a bit of the contrast). It’s a vivid counter-example to this thesis you and I have in our heads.


I really want these as separate posts! The "No, U.S. pedestrian deaths aren’t due to bigger vehicles" is SO GOOD - and I want it to have its own headline / post.


To the point on crime.

1. NYC is still much more dangerous than an average European city.

2. This research proves what people have probably known intuitively forever.

Instead of making the usual “thanks for telling us something the average person already knows” argument, I'll ask a question. What other interesting examples can you think of where economists have proven through research a point that an average person on the street could be assumed to know intuitively already (or even better, that the populace was proving through their actions way before economists “proved” it)?


No stockholder wants the company they own shares of to pollute rivers, kill humans with their products, falsify accounting numbers or commit fraud. In the case of Boeing (in which I own stock) I would prefer it put bolts on doors so they don’t fly off the plane. So let's just say none of the above has anything to do with “being a good citizen”.

I don’t care if my stocks are in companies that recycle. I don’t care because there are federal, state and local laws that govern environmentalism. I don’t want my corporation to be concerned with social problems. That is the job of the government, specifically.

The job of companies with shareholders is to make them wealthier. To increase the value of the stock they hold. They do that by being well-run companies that sell goods and services that other companies and people want. It really is not more complicated than that. They have no social duty to cure AIDS or end homelessness. Do I care that they give money to the local homeless shelter or philharmonic? Not really, but it should serve a purpose. A big factory in a factory town needs to be a good neighbor. That is just smart; the last thing you want is bad standing with local political leaders. Charitable giving is fine but certainly not necessary. Make money and preserve the investment made by stockholders. That is the mission. The only mission.


Yes, but every company invests in PR and branding as part of competition for customers and employees. The woke and BLM stuff was almost all an opportunistic PR exercise. Nobody is whiter than Tim Cook or Mark Zuckerberg. I haven’t noticed any declines in inequality in Woodside. As to being good neighbors- many of these companies have been HQ’d in “mixed” communities for many decades (Cupertino, Santa Clara, Mountain View, Redwood City). You’d think they could have invested in improving the school systems, public infrastructure, low cost apartments, etc, transforming these communities, rather than driving blacks and Latinos out and doing so little for those that remained. Shows you how much they don’t care. We shouldn’t take PR and branding too seriously.


Cupertino has the best public school system in the country. Or had, but now the residents are too old and NIMBY to let new kids use it.

Tim Cook and Deirdre, the head of Apple HR, are both gay.

> low cost apartments

Famously the whole problem is that local governments won't let you build low cost apartments even if you want to.

https://techcrunch.com/2014/04/14/sf-housing/amp/


Nobody is whiter than Tim Cook. This is easily noted by any casual observer. I didn’t mention his or Zuck's sexual preferences, favorite positions or relative dominance/submissiveness, but if you’re interested in rating these as some sort of consolation prize for not being African-American, have at it in your own post. Can't wait to see your points system! 😊

Remember that Cupertino today is not the Cupertino of the 1980s and 90s when there were few Asian residents. The school system being fairly decent today for California has to do with the massive Asian influx and pushing out of prior residents and little to do with generosity by Apple, IMO.


For Boeing specifically, it wasn’t ESG or DEI alone that caused their quality problems, it was financial people taking over executive slots and pushing out people with an engineering and quality-assurance focus in favor of profits (to raise the stock price so the new execs could boost their own compensation). Look at the whistleblower reports (and don’t be too slow to do so, since some whistleblowers seem prone to “suicide” before depositions) that show factory worker incentives favored throughput over quality. This affected both the 787 Dreamliner and the 737.

The “redesign” of the 737 MAX to deal with oversized fuel-efficient engines without changing the airframe, relying instead on a poorly designed software workaround, can be blamed on cost-cutting/profit enhancement, but only if you overlook how the two subsequent fatal crashes ruined their goodwill.

https://www.wsj.com/business/airlines/boeings-quality-complaints-mount-as-another-whistleblower-comes-forward-396db79b


I lament my stock price. As I said, it is not acceptable to leave the bolts off a door so that it falls off mid-flight.


The G in ESG stands for governance, so getting rid of ESG would naturally lead to more of these problems - it literally means bad governance!

That's why "anti ESG" businesses were a brief fad but they all failed because they were, you know, scams.


> No stockholder wants the company they own shares of to pollute rivers, kill humans ...

> I don’t want my corporation to be concerned with social problems

This appears to be a contradiction. A company makes many decisions; the government is not omnipotent, and not every negative externality is covered by law.

Companies should be profit maximizing yes, but they should also think about the consequences of their actions.

I believe most of them do (for the most part). Companies are made up of people and owned by shareholders who live in the same world we do.


Sorry I do not see the contradiction.


It looks like the cap at UCSD is related to changing one's major, not to incoming students.

"Beginning in Summer 2025, currently enrolled students who want to switch into a selective major will be able to apply to selective majors once per year (between Summer and Fall quarters). The selection criteria for entry to the major will consider academic achievement in the specified screening courses and will also be aligned with UC San Diego’s priorities of serving California residents, first-generation college students, and students from low-income families. Continuing students who apply to switch to a selective major must have completed the required screening courses for that major and be in good academic standing. They will then be considered for the major using a point system that awards one point each for having a 3.0 GPA or higher in the major screening courses; California residency; Pell Grant eligibility; and first-generation college status (as determined by information received at the time of initial admission to UC San Diego).

Students with the highest number of points will be admitted until all available spaces within the major have been filled."

While it looks like spots are capped, this really isn't what the tweet is talking about, and the tweet is, I think, somewhat misleading. I think your comments still hold true in that schools should be working to increase the sizes of those departments, but it's not like UCSD is telling the kids of college graduates that they can't be computer science majors.
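For concreteness, the quoted selection rule boils down to something like the following sketch (hypothetical code, not UCSD's; note the policy text doesn't say how ties at the capacity cutoff are broken):

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    completed_screening: bool   # completed the major's screening courses
    good_standing: bool         # in good academic standing
    screening_gpa: float
    ca_resident: bool
    pell_eligible: bool
    first_generation: bool

def points(a: Applicant) -> int:
    # One point each for the four criteria named in the policy.
    return sum([
        a.screening_gpa >= 3.0,
        a.ca_resident,
        a.pell_eligible,
        a.first_generation,
    ])

def admit(applicants: list[Applicant], capacity: int) -> list[Applicant]:
    # Only students who completed the screening courses and are in good
    # standing may apply; the highest point totals are admitted until the
    # major's available spaces are filled.
    eligible = [a for a in applicants if a.completed_screening and a.good_standing]
    ranked = sorted(eligible, key=points, reverse=True)
    return ranked[:capacity]
```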


On Noah's first thing -- the decline of DEI/ESG activism as competition increases:

This may be a stretch, but I'm struck by the parallels in biology, where we find increasing incidence of flamboyant display among species in environments where predators are few, as with the colorations and mating displays of birds of paradise. The more competition for survival, the more focus on that one thing, and the less variation in behavior and performative display.
