22 Comments
Nunzio

As someone who was deep into YouTube in the mid-2010s, there was a huge wave of anti-SJW, post-Gamergate content at the time. It was definitely a thing, and it exposed me and many other young people to a sort of proto-alt-right viewpoint. But it only lasted a few years. Between algorithm changes and hype-cycle changes, it hasn't been that way for a long time now. That stuff is still out there, but as the research you posted suggests, it doesn't have the same viral potential it did in the past.

Noah Smith

So here's the question: Was the viral potential back then really from the algorithm, or just because of where we were in American politics and culture? I think a number of young people got interested in rightist stuff in those years, and then got turned off after Charlottesville and the early Trump administration.

Nunzio

Back then, the algorithm definitely favored clicks and comments more, which led to a lot of drama- and controversy-oriented channels of all kinds. The switch to a more watch-time-oriented algorithm almost certainly has helped. But only YouTube has the information to really answer your question (which is part of the problem).
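(As an aside, here is a toy sketch of the switch Nunzio describes, with invented field names and weights rather than anything from YouTube's actual system: the same two candidate videos get ordered in opposite ways when you rank by clicks and comments versus by watch time.)

```python
# Toy illustration only: hypothetical weights and fields, not YouTube's real ranker.

def engagement_score(video):
    # Click/comment-driven ranking rewards whatever draws reactions.
    return 1.0 * video["clicks"] + 5.0 * video["comments"]

def watch_time_score(video):
    # Watch-time-driven ranking rewards videos people actually sit through.
    return video["avg_minutes_watched"] * video["views"]

candidates = [
    {"title": "calm 40-minute explainer", "clicks": 900, "comments": 40,
     "views": 800, "avg_minutes_watched": 28.0},
    {"title": "ragebait 6-minute rant", "clicks": 5000, "comments": 1200,
     "views": 4500, "avg_minutes_watched": 2.5},
]

print(max(candidates, key=engagement_score)["title"])  # the rant wins on engagement
print(max(candidates, key=watch_time_score)["title"])  # the explainer wins on watch time
```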

fredm421

Also - people grew out of alt-rightism, I think? I mean, where are Milo Yiannopoulos and Tomi Lahren nowadays? Long forgotten...

Joseph Conner Micallef

I think people pretty dramatically underrate how good the YouTube recommendation algorithm is. You'd have to watch a lot of radical content relative to other content for a day or so to even START getting recommendations, and even then you might get one video per refresh. The simple reality is that in order to get a bunch of radical videos recommended you have to 1) seek out those videos deliberately or 2) basically never use YouTube and wander into them.

The idea that you'll get recommended radical videos simply for watching innocuous videos on channels that also have radical videos is similarly wild. The YouTube algorithm is really good! When I watch compilations of Seinfeld clips, it knows EXACTLY what I am looking for: not only does it not recommend other content a clip-making channel may make, it doesn't even recommend Seinfeld content that isn't clips of the show! And the algorithm gets better the more you use YouTube. I would say at any given time that 90% of the videos in my recommended are things I would enjoy? Maybe more?
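(A hedged way to picture that point: if you pretend recommendations are drawn roughly in proportion to the topic mix of your recent watch history, which is a deliberate oversimplification and not YouTube's actual model, a handful of fringe views barely dents the slate.)

```python
from collections import Counter
import random

random.seed(0)

# Crude stand-in for a recommender: sample the next slate in proportion to the
# topic mix of recent history. Deliberately oversimplified, not YouTube's model.
history = ["seinfeld clips"] * 95 + ["radical"] * 5

def recommend(history, slate_size=20):
    counts = Counter(history)
    topics = list(counts)
    return random.choices(topics, weights=[counts[t] for t in topics], k=slate_size)

print(Counter(recommend(history)))
# Typically 19-20 slots of Seinfeld clips and at most one "radical" slot: under
# this assumption you'd have to watch a lot of fringe content before it dominates.
```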

Noah Smith

Yes, I think Chen et al. put it well when they say that what YouTube was really doing was showing lots of far-right content to a small number of people who really wanted to consume a lot of far-right content.

The ethics of that are debatable (Should YouTube try to convert people away from far-right ideas?), but it's a very different thing than the "rabbit hole" hypothesis, in which unsuspecting normies get snared and converted into being right-wingers.

Alex S

This seemed believable around 2016 (and it's one reason I never thought of working at YouTube despite their recruiters emailing me and calling me at work (not a humblebrag)), but I've heard people still use it for other subcultures on there - "vtuber rabbit hole" is a pretty common phrase.

And my own recommendations are extremely topical after only a little feedback to the algorithm; my front page is half rabbits half anime avatars. But it doesn't try to recommend me extreme polarizing rabbit content or anything.

In other news, apparently there's a left-NIMBY anime: https://www.animenewsnetwork.com/review/muteking/the-dancing-hero/12/.184624

Noah Smith

Nooooooooooo

Joseph Conner Micallef

It's genuinely impressive how you watch like four Hololive clips and the algorithm just goes "I gotchu" and starts throwing Hololive clips into your recommended. You would think watching a dozen videos titled like "Rabbit war criminal makes a bomb" would lead to some radical content, but nope just more Pekora.

Brian T

I like to think of Sakura Quest as a YIMBY anime, given its message of embracing change. :)

Xiang Shi

There might be a selection effect here. In the early 2000s, I'd guess more computer users were progressive left than conservative right. But then in the 2010s, smartphones and computers became accessible to almost everyone. So maybe it's not that social networks made more far-right/conservative people, but that far-right/conservative people got their smartphones and social network accounts. I'm not sure if there is any research about this.

At least we know there were many far-right people (Nazi Germany, etc.) before WWII, when there was no internet. TV and radio were the media that got blamed in that period.
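(A back-of-the-envelope version of that composition argument, with all numbers invented for illustration: hold everyone's views fixed, change only who is online, and the far-right share of the online population still jumps.)

```python
# Invented numbers, purely to illustrate the selection effect: nobody changes
# their views here, only internet adoption rates change.

far_right_frac = 0.10  # assumed fixed fraction of far-right people overall

def online_far_right_share(far_right_adoption, everyone_else_adoption):
    far_right_online = far_right_frac * far_right_adoption
    others_online = (1 - far_right_frac) * everyone_else_adoption
    return far_right_online / (far_right_online + others_online)

# Hypothetical early 2000s: far-right people less likely to be online.
print(round(online_far_right_share(0.20, 0.50), 3))  # ~0.043
# Hypothetical 2010s: smartphones put nearly everyone online.
print(round(online_far_right_share(0.95, 0.95), 3))  # 0.1
# The online far-right share more than doubles with zero radicalization.
```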

Matt Chester

You might not be able to prove that YouTube turns normal people into Nazis, but I don't think one can deny that the mass proliferation of smartphones and the internet made left/right polarization worse. Extremist parties and candidates grew in popularity in basically every democratic country after 2010, and even in some fairly undemocratic ones. I don't see how you could explain that phenomenon other than that those candidates were no longer suppressed and dismissed by mass media.

ANoneinNY

You assumed that Kate Starbird was using media stories as her basis. She doesn't say that. She says “family stories and self-reports.” She could be getting those directly in her own research. You've shown your own bias toward your priors as well.

Noah Smith

Where do you think those family stories and self-reports were told? They appeared in the media.

ANoneinNY

Are you sure that she does not have direct responses from individuals in her own research? I did not read her Twitter response as dispositive either way. You apparently did.

The Digital Entomologist

Anecdotal, but I still feel like it heavily promotes Jordan P****son to me even though I clicked "not interested" on a string of his videos one day. I also watched like 5 Jesus Christ Superstar clips one day and my ads suddenly turned into crazy stuff. At least I no longer get those "Ben Shapiro DESTROYS Liberal Student" videos or whatever.

DxS

Every Twitter user can now get exposure to a breadth of specialist expertise that most 1980s professional reporters would have killed for. Also, more death threats in a week than most reporters back then got in a decade.

Media is very, very different now. That really ought to be changing things. But how?

Are all the media changes buried and confounded by the rest of our modern weirdness? Has the real power of YouTube and Twitter been hidden by the even greater force of Trump and the pandemic?

I don't believe Twitter, Facebook, YouTube, etc did nothing. I expect they've changed a lot. Yet I can't convince myself to any confident idea about the change they've made.

Robert Ford

I lost my best friend to the YouTube rabbit hole. I literally said "you've gone too far down the YouTube rabbit hole" and we haven't spoken in a year.

Kent Dickey

YouTube is constantly changing its algorithm, not just in 2019, and the changes are not generally announced. Lots of changes happened around 2016-2017, too. I think the best you can say is that we cannot really know whether YouTube used to radicalize people.

I also think your mental model of radicalization means YouTube can never radicalize people: you seem to assume that if YouTube were the cause, everyone who sees enough videos would become a terrorist. So if 100 people watch videos and 2 become radicalized, that "proves" YouTube isn't the cause. Instead, a small number of people are vulnerable to radicalization, and exposure is what puts them over the edge. Those 2 wouldn't have been radicalized without the exposure. This is a very difficult area in which to make strong statements.

In terms of censorship, that's not what YouTube and other social media need. What we need is for them to stop pushing harmful content to get clicks. It's like the difference between having Mein Kampf on a library shelf and having librarians hand out copies of Mein Kampf as you enter the library, but only if you're a white male. The first is fine; the world has survived with that model for a long time. The second is new, and is a problem.
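(Kent's point about a vulnerable minority can be made with a tiny worked example; all of these rates are invented: a 2% overall radicalization rate among viewers is perfectly consistent with exposure being the decisive factor for the people it reaches.)

```python
# Invented rates, only to illustrate the base-rate argument above.

viewers = 100
susceptible_frac = 0.02   # small vulnerable minority among viewers
p_if_exposed = 1.0        # susceptible and exposed -> radicalized
p_if_not_exposed = 0.0    # susceptible but never exposed -> not radicalized

with_exposure = viewers * susceptible_frac * p_if_exposed
without_exposure = viewers * susceptible_frac * p_if_not_exposed

print(f"radicalized if the content is pushed to them: {with_exposure:.0f} of {viewers}")
print(f"radicalized if it never surfaces: {without_exposure:.0f} of {viewers}")
# "Only 2 in 100" can look like the platform isn't the cause, yet in this toy
# model those 2 are radicalized only because of the exposure.
```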

fredm421

I remember the D&D moral panic. Also the pedophilia stuff that was linked via Satanism (the pedophilia panic is a perennial winner; conservative con men know their memes!).

I really fucking wanted to gun down some conservatives back then! And we were only getting the aftershock in France, not the full blast. Thank God for our tight gun regulations... :)

Chaz

I have to maintain a heavy degree of skepticism about the assertion that YouTube did not promote questionable content. I personally received recommendations for it regularly around 2016, although I admit it happens a lot less nowadays.

I would regularly get recommendations to watch Jordan Peterson, or Ben Shapiro, or PragerU videos, despite my repeated insistence that I was not interested in that content. Also, whenever I watched any "meme culture" sort of video, there would be progressively edgier and edgier shit in the recommendations, bordering on no longer even being jokes.

I don't think we "lost" a generation of men due to this effect, and I'm not even sure how many people take YouTube videos recommended to them as a serious source of information, but it definitely seemed to exist within the teenage male "meme culture" demographic, at least amongst myself and the people I hung out with.

Admittedly, I have not yet read through your linked studies so all I have to share is anecdotal evidence. But since you *also* shared anecdotal evidence I figure I might as well throw my story into the pot.
