This article was originally published on Medium. This copy is provided for accessibility and archival purposes. The canonical version remains on Medium.
Facebook’s Political Algorithm and Extremism Silos
May 26, 2020
The social media echo chamber problem is reinforced by a simple algorithmic mistake that Facebook, and seemingly every social media platform, has baked into its core. The Wall Street Journal recently shed light on part of the problem: Facebook chases “engagement” by feeding people ever more “sticky,” extreme, and enraging content to keep them glued to the platform, and to Facebook’s ads, longer. Facebook categorically can’t fix this problem, because it sits at the core of the entire business model, which is likely why they chose to do nothing.
But there is a second side of the problem, and I’d argue a more important one: the “silo” problem. Facebook (and other social media) ranks every post on a political spectrum from left to right on a 5-step scale. They rank each user that way too. You can see how they rank you in your profile’s “Ad Preferences,” if you’re curious. They use these values to feed us posts that fall within those parameters. Before social media, polls showed that people often had strong opinions on single issues, and that those views didn’t always align completely with one party. One might be anti-abortion yet also anti-gun, for example (even if only one issue or the other motivated them to the polls: the so-called single-issue voters).
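To make the mechanism concrete, here is a minimal sketch of the single-axis curation described above. Facebook’s actual ranking system is not public; the function names, the tolerance parameter, and the sample scores are all invented for illustration.

```python
# Illustrative sketch only: models the single left-right axis described
# in the text. Every user and every post gets one score on a 5-step
# scale (0 = far left .. 4 = far right), and the feed surfaces only
# posts whose score falls near the user's own.

def matches_user(user_score: int, post_score: int, tolerance: int = 1) -> bool:
    """True if a post's score is within `tolerance` steps of the user's
    score on the 5-step spectrum (the hypothetical silo boundary)."""
    return abs(user_score - post_score) <= tolerance

def build_feed(user_score, posts, tolerance=1):
    """Filter (post_id, post_score) pairs down to the user's silo."""
    return [pid for pid, score in posts if matches_user(user_score, score, tolerance)]

posts = [("a", 0), ("b", 2), ("c", 4)]
print(build_feed(3, posts))  # → ['b', 'c']: only posts near the user's side survive
```

The point of the sketch is that a single number per user and per post can only ever produce two broad camps: everything the user sees is pre-filtered to one side of the same axis.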
That happens less and less now, and I believe social media plays a role in this siloing effect, because these platforms are essentially sorting issue propaganda into two consistent “silos” on a simple 5-step spectrum. This drives partisan polarization on whole bundles of issues, rather than on single issues. Where we used to have a set of distinct single-issue groups that the parties gathered into coalitions, there are now two large groups of people who all think about, and understand, an entire set of issues the same way, often with their own vocabulary to describe it.
Together with the propensity to promote partisan extremism on social media platforms, we have a dangerous mix. This is society-breaking, and it is ripe for foreign and domestic actors to manipulate and exploit. Unlike the business model of promoting extremism, Facebook, Twitter, and the others CAN address this issue; they just have to be less lazy about how they categorize users and content. It’s simple: instead of forcing everything onto the largely illusory distinction between “left” and “right” (and thereby making that once-fake distinction real), rank content and users on a separate scale for each individual issue.
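The proposed fix can be sketched the same way: replace the single score with a per-issue score. Again, the issue names, scores, and function are hypothetical illustrations, not anything the platforms actually run.

```python
# Hedged sketch of the per-issue alternative: users and posts each carry
# a dict of {issue: score} on the same 0..4 scale, and a post is matched
# only on the issues it actually addresses.

def issue_match(user: dict, post: dict, tolerance: int = 1) -> bool:
    """A post matches if the user is within `tolerance` on every issue
    the post touches. A user can sit 'right' on one issue and 'left'
    on another without being siloed on everything at once."""
    shared = set(user) & set(post)
    return all(abs(user[i] - post[i]) <= tolerance for i in shared)

# The article's example voter: anti-abortion (right on that axis),
# anti-gun (left on that axis).
user = {"abortion": 4, "guns": 0}
print(issue_match(user, {"abortion": 3}))  # → True
print(issue_match(user, {"guns": 4}))      # → False
```

Under this scheme the anti-abortion, anti-gun voter still exists as a coherent audience, instead of being rounded off to one end of a single left-right axis.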
The overly simplistic political curation mistake at the heart of Facebook and other social media platforms is driving us into two increasingly extreme partisan camps. Fixing it is relatively easy, and it could help steer us back toward moderate sanity, even if it would not address the extremism problem by itself. Still, if users aren’t constantly propagandized in echo-chamber silos, it may be easier to find alternatives to promoting ever more extreme, enraging content to keep users engaged. Or maybe a turn to a different business model is called for. I’d be happy to opt out of highly targeted political propaganda for $12 a year. More importantly, I’d happily pay again to opt my grandparents out…