27 Comments
Yoshi Tryba

"But the people who run social media companies should think about tweaking their algorithms to short-circuit this process of escalating outrage and the triumph of mini-demagogues."

Facebook researchers know about this whole phenomenon and actively tweaked their algorithms to increase the visibility and reach of angry content. They do this, surprise surprise, to increase profits.

c1ue

This is old hat: anger = more engagement = more eyeballs/clicks/content

Terry P

Neither Twitter nor Facebook will willingly undermine engagement on its platform. They are optimizing for profitability, irrespective of the impact on society. Only regulation, or consumers abandoning their platforms, will cause them to change. The broader media is also to blame here, for picking up Twitter comments and presenting them as news, further increasing their reach.

Scott Monty

Twitter is great at giving ire-gasms.

David Rothberg

Thank you. To me it's the most critical issue -- fundamental to how the majority of other issues get resolved. As Spinoza said, nothing good can come from negative emotions. Humans will evolve. Pieces like this abet the process.

Richard Treitel

I'd better not be the first person to draw this connection, but tribal outrage can push people towards single-issue voting. Which would be OK, if only they were choosing a single-issue government.

I wonder. Is there some way to channel the outrage away from elections?

Sineira

So, just like we used to ignore the ranting morons at the pub, we can now ignore the ranting morons on Twitter.

nei

The problem with this line of thinking is that these companies are just giving people what they want. People love drama and conflict, and watching others get roasted in public. Why do you think pro wrestling is popular? Or People magazine? I think a version of Twitter where everyone gets along most of the time is one that people ultimately wouldn't use as much.

Robert Ford

I largely agree with all this, but I'm not sure I'm one hundred percent satisfied with the thoroughness of the "jerks are jerks" finding in the paper cited. For example, I'm a lot meaner online than I am in person. "Online" is my chance to stop being overly polite like I am in real life and have some real discussions!

Michael

People do develop an immune system to this stuff. If you look at communities of people who were very online around 2014, a lot of the outrage tactics that worked then don't work now. Twitter is still a mess, but the most egregious shouting and harassment doesn't seem to stick.

If that’s true, then maybe the apparent explosion of online anger is really just new people coming online who can still be easily manipulated. An “eternal September” sort of thing.

If true, this would at least mean the platforms don't have to solve a hard sociotechnical problem, although it also means they have a direct financial incentive (growing new users) to make the shouting worse.

Garen Checkley

Yeah the metaverse is gonna need some new architects before I move in…

Kent Dickey

An online forum which promotes or advertises alongside user content needs to be held responsible for that content. It's pretty obvious that an easy way to make money is to let people post outrageous things and then profit from people fighting about it. Let's remove the profit motivation, since Mark Zuckerberg has shown he will pick profits over everything.

This is actually easy to do: make it so that if a site promotes and profits from any content, the site owner is legally responsible for that content. You still get liability protections if you do not promote, or do not advertise. So a forum with comments ordered by date is allowed to have ads. Twitter is a problem too, but it's much smaller and so has less real-world impact.

Social media is primarily responsible for Covid vaccine refusal, several genocides (Rohingya was the first I was aware of), the Jan 6 insurrection, and general "tribalism". Facebook is a clear net negative for the world; Twitter is a little less clear. We'll have lost more Americans to Facebook than to all wars soon.

Moderation and other fixes requiring the sites to do something will never fix the problem--these are actually promoted by Facebook to avoid losing its profit. But if Facebook is legally responsible for the libel, child porn, and copyright violations on its site, it will get cleaned up overnight, and that will fix the shouting class at the same time.

Billy Easley

"We'll have lost more Americans to Facebook than to all wars soon."

I am critical of Facebook's policy choices, but this is a crazy statement to make. You can't attribute every COVID death in the US to Facebook, or even discrete events - it's not even clear how much of the organizing for January 6th happened on FB. We just don't know yet.

Also, this is a common policy assumption I see: if we just changed the content moderation laws and removed the profit motive, then all the problems on the internet would cease. But there are major costs to this approach - like, how would other new companies challenge and displace FB within this new regulatory landscape? It would be a lot more difficult.

Kent Dickey

These comments are clearly useless; Noah has left obvious spam up for 24 hours.

It's pretty clear social media is to blame for Covid vaccine hesitancy. Where else is Ivermectin being pushed? So it's pretty conservative to blame 70% of all US Covid deaths after June 1 on vaccine hesitancy. And about 70% of hesitancy is driven by social media, so ~50% of all US Covid deaths since June 1 are due to social media. I could not get this number easily, but eyeballing graphs, I'd estimate total US Covid deaths at 70,000 since June 1. So 35,000 are due to social media. If we try to count people influenced to believe "Covid is just the flu", the numbers go up. And these are not the only deaths Facebook causes, and Covid is not over.
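A quick sanity check of the back-of-envelope arithmetic above, as a sketch in Python. Every input is the commenter's own assumption (the ~70,000 death estimate and both 70% shares), not a verified figure:

```python
# Back-of-envelope check of the comment's estimate.
# All inputs are the commenter's assumptions, not verified data.
covid_deaths_since_june_1 = 70_000  # eyeballed US Covid deaths since June 1
share_due_to_hesitancy = 0.70       # assumed share of those deaths from vaccine hesitancy
share_hesitancy_from_social = 0.70  # assumed share of hesitancy driven by social media

share_due_to_social = share_due_to_hesitancy * share_hesitancy_from_social
deaths_due_to_social = covid_deaths_since_june_1 * share_due_to_social

# Prints: ~49% of Covid deaths -> ~34,300 attributed to social media
print(f"~{share_due_to_social:.0%} of Covid deaths -> "
      f"~{deaths_due_to_social:,.0f} attributed to social media")
```

The exact product is 49%, or about 34,300 deaths; the comment rounds these to ~50% and 35,000.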

After a few years, studies showed that Facebook did contribute to the Rohingya genocide: https://en.wikipedia.org/wiki/Rohingya_genocide#Facebook And since it was years ago, nothing was done. Your attitude will lead to no change ever at Facebook and the other social media evildoers, no matter how much damage they cause.

The world did just fine without Facebook, and will do so again. Facebook pushes "we just need better moderation" since that's its preferred way to fend off competitors. But moderation jobs are dehumanizing too (they cause PTSD), and I don't want to subject people to that.

Current profit-driven, engagement-based social media is like leaded gasoline. We will all be better off without it.

Newf

There's also the issue that after you've accumulated social status (followers) by shitposting in favour of one political tribe, you're disincentivized from posting content that your chosen tribe would disagree with. You essentially get trapped in a position and cannot publicly change your opinions without getting dogpiled or losing status (followers).

The social media companies will never fix this issue by themselves. The changes that would be required to eliminate the problems discussed in this article would collapse engagement and reduce monthly visitors.

Government intervention is the only reasonable solution. Removing likes/thumbs-up/etc. and making follower counts private would probably significantly reduce social media extremism. Algorithm reform would also help.

A Special Presentation

The social media platforms that came out of what was called Web 2.0 in the old days turned out to be based on very misguided theories about human interaction, exactly the ones you'd expect to be held by a bunch of naive, middle- to upper-class white engineers in California. Unfortunately, I think they're too embedded in society for most people to leave behind.

I honestly think there's a historical parallel with alcohol in the 1800s. In both cases, Americans are drinking/posting too much, and it's negatively impacting their health and that of all aspects of society around them. However, Prohibition was an over-correction, deeply unpopular and unsustainable in the long run. I think what's needed is a new movement based on social media "temperance". For example, a lot of early temperance advocates weren't prohibitionists but advocated that people give up the hard stuff and consume beer and wine in moderation instead. Social media temperance leaders could advocate for rules like reading an article before posting or resharing, and not getting into arguments on comment pages with people you don't know and whose epistemological context you don't understand, just as we have cultural rules and guidance associated with alcohol (don't drink alone, don't drink before 5 PM, etc.).

Dagon

The notion of "mob mentality" as described here is not well supported by the evidence in general.

The first two factors are much more important, because they often combine to make essentializing points the premier way to consume content on social media.

People have a plethora of content from various personalities to consume, but only a limited number of facets they can attribute to any single personality or topic.

Rafael Kaufmann

Even worse: by now pretty much everyone has had exposure to the tribalizing social networks and will carry over the nasty viral memes and subsequent constant outrage to their daily lives. Very hard to deradicalize. This is why mindfulness is an urgent public health issue.

Greg Barnhill

What if I told you it was like this even before most people not attending a university had access to the internet?

Because on IRC and Usenet, flame wars and trolls were common. Don't get me wrong, there was still a little room for rational exchanges and niche academic stuff back then, but around the time Usenet opened the unmoderated alt.* newsgroups, everything went straight to hell, and fast.

So the warning was out there before most folks even had access to the internet.

It was just a matter of time before somebody made it so easy that anybody could troll and instigate flame wars, and so on. And lowering the bar to entry, oddly enough, did not improve things at all.

KetamineCal

I've been online since the BBS days. I think moderators back then kept a tighter ship, especially with fewer users to manage. You're absolutely right that unmoderated groups have always been impossible to self-police.

A Special Presentation

Yeah, and I also think that there's something to the ethos of "if a community is too big to be effectively moderated by humans, it's too big". I'm thinking back to the Something Awful forums in the 2000s, which were toxic in many ways but had a culture that was strictly enforced by the mod team. That's impossible to do with Facebook or Twitter in a generalized way.

Alex S

SA probably didn't have enough moderation, because it ended up feeling like being trapped in a box with everyone worshipping the few mods as kings, while they just enforced whatever they wanted.

A Special Presentation

I think a lot of that has to do with the incentives associated with managing growing vs. shrinking communities. SA grew rapidly throughout the 2000s, peaked at slightly under 100,000 members before 2010, and began to shrink afterwards. I think that as a mod of a growing community, you're focused on welcoming newcomers who might add new perspectives and viewpoints, while helping and coercing them into assimilating into the existing culture. With a shrinking community, it becomes much more about managing the existing community's discomfort with anything and anybody new, maintaining whatever existing comfort there is, and slowly going from enforcing community standards to managing an editorial voice. The same thing happened to Metafilter, another great anglophone Web 1.0 community.

(It could also be that SA was too big, too.)
