Is your For You page curating you? Social media, political polarization, and distrust.
By Jenalyn Dizon
If you’ve ever felt yourself getting so riled up about the barrage of upsetting political news on your social media feed that you had to put your phone down and take some deep breaths, you’re not alone. While many current events certainly warrant this reaction, it’s no coincidence that millions of people see the same posts, add the same angry comments, and go on to start the same arguments at the Thanksgiving dinner table. And if you are a self-proclaimed “free thinker” who somehow heroically emerges from a scroll session unaffected, I have bad news for you. Social media corporations, backed by political allies, control your feed to foster political polarization, because angry people 1) build a loyal voter base and 2) drive online engagement.
@mohamed_hassan on Pixabay
A lot can be predicted about you from your search, click, follow, and ‘like’ data, not to mention the personal data you voluntarily provide in your profile – age, gender, education level, class, state/city of residence – which in combination give Meta (or whomever they sell your data to) everything they need to know about your political beliefs and partisan affiliations. So if most of your mutuals follow Joe Rogan, or you’ve watched one of his videos before, the algorithm will likely put some of his content on your Explore page too, along with maybe a local Republican candidate, Fox News, or Turning Point USA (see Instagram’s recommendation algorithm).
This is perhaps an extreme example, but my instinct is that there’s nowhere else to go but up: if you’re not interacting enough (i.e., making Meta enough $$$) with your current intensity of political content – or even if you are, and they want you to interact more – you will keep being suggested more provocative, inflammatory content until you get hooked or manually mark yourself “not interested.” While this content might simply function as rage bait (Oxford’s word of the year, fun fact) for some, it can reel in those it resonates with, creating an affective feedback loop that offers esteem and belonging to people who feel slighted and unheard by their social world. In this way, your predicted beliefs become realized and intensified by that very classification.
@TheDigitalArtist on Pixabay
And this is not exclusive to extreme right- or left-wing echo chambers. Research that manipulated only the order of people’s social media feeds – not even the content they would naturally get, just the order of presentation – produced an average 2-point shift in attitudes toward the opposing political party: more negative when exposure to antidemocratic attitudes and partisan animosity was increased, more positive when it was decreased. That’s a lot – roughly the change you would see across the US as a whole over 2 years.
Tiziano Piccardi et al., “Reranking Partisan Animosity in Algorithmic Social Media Feeds Alters Affective Polarization.”
As discussed earlier, the algorithm will do as the algorithm does, and these angry, rage-baity posts are often promoted simply to maximize engagement. But beyond the perhaps unintentional ways social media algorithms foster political polarization, those with the power and access to the massive amounts of data on our digital identities have plenty of incentives to divide us as well (TIL the real meaning of divide and conquer; to quote A Bug’s Life: “You let one ant stand up to us, and they all might stand up”).
Philip II of Macedon, source unknown
They want to turn you – a complex, thoughtful, multi-issue voter – into a staunch left or right supporter for their benefit, manipulating your belief systems for social control: billionaires and massive corporations profiling you, shaping your opinions, and sending you out to vote for their candidate, thus materially shaping entire social and institutional systems.
This claim might be a reach for some, but the evidence is clear. Cambridge Analytica illegally harvested Facebook user data to profile voters and run political ads for Donald Trump and Ted Cruz in 2016. And though polarization was not so conspicuously the motive there, Trump himself famously engages in aggressive rhetoric and near-calls to violence on social media, which has fueled the same culture of animosity among his followers online (and IRL).
Josh Larios on Flickr
TikTok policy currently prohibits paid political advertising, though this has not always been fully enforced. On other platforms, even sanctioned political ads are prime opportunities for fearmongering and spreading misinformation, often intended to sow distrust in the people and groups around us. And when we’re taught to fear people who are evil and out to get us, even if our lived experience lends no credence to this fear, we look to our fearless leaders to protect us from this perceived, self-causing threat (affective facts, per Massumi).
As a result of all this polarization, we’ve grown unwilling to connect across ideological lines – and even disincentivized from trying. I’m not suggesting it’s always appropriate to just forgive and forget, but I wonder how our society might ever move forward while the puppeteers keep dragging us back. As politicians and Big Tech CEOs shape our lives through the screen, and hate crimes and political violence overwhelm our feeds, we need at least an awareness of the ways we are being monetized, manipulated, and turned against each other if we hope to rebuild our divided selves and communities.