Engineered Apathy
On Algorithms, Classifications, and Doomscrolling
By Stephanie Hasford
TLDR: Through digital classifications, algorithms predict and share what they think you want to see (I hate to use the word 'think' for a machine, but here we are). They reinforce feedback loops that alter our perception of ourselves and the world around us, leading us towards a more apathetic, docile society.
I’ve been Black for as long as I’ve been conscious. To others, at least, long before I had a word for it. That social category, born from colonialism, violence, and continuing oppression, was predefined for me before I was even a thought, and it has altered not only my understanding of self but how I navigate and present in society. To keep a long and mildly traumatic story short, predetermined classifications have interacted with lived experience for centuries, shaping how we see ourselves and how others see us (like the analogy of a two-way mirror, or the double consciousness coined by W.E.B. Du Bois).
Unknown artist, n.d.
It’s taken 25 years, but I’ve mostly made peace with the social classifications assigned to me, and I’ve learned to navigate them. But there is something perverse and ever-looming that I have not come to terms with: new classifications determined by machines, with just as much potential to cause damage. Prescriptive black-box algorithms wiggle their way into my psyche through social platforms and tech, controlling what I view and transforming me and others into apathetic, docile bodies, predictable through digital inputs.
iStock image
Spoken succinctly by Filterworld author Kyle Chayka: “What does it mean to make a choice when the options have been so carefully arranged for us?” If social classifications are two-way mirrors, then new-age screened tech are black mirrors. They are portals to a digital world when on, but when off, a haunting reflection of our attachment to the predictive technology and our detachment from the real world, one that keeps moving, keeps becoming more bleak, while our attention is elsewhere.
Illustration by Butcher Billy
If you’re still reading this, then consider the black mirror worthy of your attention for a few more minutes while I dive into one of the ways social platforms we all know blend digitized identities with perception-altering technologies to impact lived identity.
If you felt my first few lines were a bit fear-mongering, you’re right. It’s probably because, aside from having general anxiety, I’m addicted to doomscrolling. And maybe you are too. According to Medical News Today, doomscrolling “describes when someone spends an excessive amount of time consuming large quantities of news or other content, particularly negative news, on social media and other websites.” The term became popular in 2020 during the COVID-19 pandemic, when social distancing alongside large-scale deaths and social injustices birthed the urgent need to stay tapped in and informed.
iStock images
Before the emergence of powerful AI technology on social media platforms, doomscrolling was only mildly harmful, with intermittent mental health concerns. Predictive algorithms take these negative byproducts and compound them, internally and externally, like a snowball rolling toward a fiery end. Let’s use TikTok as an example.
TikTok uses a recommendation algorithm to craft its globally known For You page. I don’t know how the algorithm works, and you probably don’t either (the New York Times thinks they do). Private companies often have proprietary algorithms to keep trade secrets close and confidential. According to TikTok, the recommendations on the For You page incorporate user interaction data (and video information from those interactions, like hashtags or captions), as well as device and account settings (which they report as the least important). Duration of engagement (watching a whole video versus a second of another) has the highest weight, capitalizing on our current attention economy and the brain’s reward system, even if the reward in this case is greater access to negative, depressing news.
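Since the real algorithm is proprietary, a toy sketch can only mirror the ranking of factors TikTok itself reports (watch duration above interactions, above device and account settings). The signal names and weights below are entirely hypothetical, invented for illustration:

```python
# Toy sketch of engagement-weighted recommendation scoring.
# TikTok's actual algorithm is a trade secret; these signals and
# weights are assumptions, chosen only to reflect the reported
# ordering: watch duration > interactions > device/account settings.
WEIGHTS = {"watch_fraction": 0.6, "interactions": 0.3, "settings_match": 0.1}

def score(video_signals: dict) -> float:
    """Combine normalized (0..1) engagement signals into one ranking score."""
    return sum(WEIGHTS[k] * video_signals.get(k, 0.0) for k in WEIGHTS)

def recommend(candidates: list, top_n: int = 3) -> list:
    """Return the top-N candidate videos by predicted engagement."""
    return sorted(candidates, key=score, reverse=True)[:top_n]

videos = [
    {"id": "liked but skipped", "watch_fraction": 0.2,
     "interactions": 0.9, "settings_match": 1.0},
    {"id": "watched to the end", "watch_fraction": 1.0,
     "interactions": 0.4, "settings_match": 0.2},
]
```

Under these made-up weights, the fully watched video outranks the one the user liked but scrolled past, because duration carries the most weight, which is exactly how depressing-but-gripping content can rise to the top of a feed.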
Vecteezy image
Essentially, TikTok has categorized me using a seemingly infinite amount of data to predict what I want to see (even using data from the people around me). When they are right, and I keep engaging with the recommended content, it creates a feedback loop that reinforces to the algorithm that yes, I actually do want to see more mass shootings and global wars and dignity-crushing actions taken by my own government. Despite the negative impacts, I keep logging back on because I’m a human, and capitalists with engineer buddies have made it their mission to study psychology to manipulate the brain, so the product works as intended.
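The feedback loop above can be made concrete with a small simulation. Everything here is an illustrative assumption (the update rule, the engagement rates, the starting mix), not TikTok's actual machinery; it just shows how a model that chases engagement drifts toward whatever gets watched longest:

```python
# Toy simulation of the doomscrolling feedback loop: if negative
# content gets watched more fully, the model's estimate of the user's
# preference for it keeps rising, so each new feed contains more of it.
# Rates and update rule are hypothetical, for illustration only.

def simulate_feed(rounds: int, engage_negative: float = 0.9,
                  engage_other: float = 0.3, lr: float = 0.2) -> float:
    """Return the estimated share of negative content after `rounds` feeds."""
    negative_share = 0.1  # assume the feed starts mostly neutral
    for _ in range(rounds):
        # Total observed engagement across the current content mix.
        observed = (negative_share * engage_negative
                    + (1 - negative_share) * engage_other)
        # Fraction of that engagement attributable to negative content.
        negative_pull = negative_share * engage_negative / observed
        # Nudge the mix toward whatever got watched longest.
        negative_share += lr * (negative_pull - negative_share)
        negative_share = min(negative_share, 1.0)
    return negative_share
```

Run for a handful of rounds, the negative share creeps up; run for fifty, it dominates the feed. Neither side chose that outcome explicitly; it falls out of optimizing for engagement.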
Design by Evelyn Mousigian
Not only do I keep engaging with the negative content, but it alters how I see the world and how I see myself as a participant in that world, with things being done to me and to others, and not with us. It also overloads my brain, and the body loves nothing more than homeostasis, so my brain eventually dulls its hyperactivity when seeing negative content, to not overwhelm me (also known as desensitization).
This is the side effect of exposure to these algorithms and tech that concerns me most: social detachment and apathy. Both have been linked to social platforms’ algorithms, their overlord-esque execs, and their obsession with quantitatively learning us better than we can learn ourselves.
Made worse by our own government’s insistence on limiting regulations on AI, we are already nosediving into a polarized society that can’t even find a common ground to stand on. A people immobilized, stripped of free thinking and collective action by technology, in a time when more than ever, injustice is viewable almost the second after it happens.
On Fire by KC Green
If you think these thoughts make me a cynic or a pessimist, then you’re probably right. I was one before I got a smartphone at 16, and I’m still one now. I can’t blame that on social corporations and their tech. But I don’t think you would still be reading this if you didn’t feel the perverse ways in which these AI-powered social platforms classify us and strip more and more of what connects us to ourselves and to others. And I mean what connects us in genuine, non-capitalist-centered ways (I’ll leave what that is up to your interpretation).
I don’t have good enough solutions to share, and I’m not going to insult your intelligence by telling you to just get off the apps, or to simply practice hopescrolling instead. This is where you come in! I’m an activist misanthrope, and sometimes the knowledge of the doom empowers me more than it debilitates me. But more and more people are in the quicksand of the algorithms, and they are sinking further and further away from humanity, not even fully conscious of the ways the technology at the palm of their hands distorts reality and what they believe. I wrote this for all of us because we are so much more than what these algorithms have predicted us to be.
Imgur
------------------------------------------------------------------Additional Information------------------------------------------------------------------
Are social platform algorithms turning you into a misanthrope? Join the club.
Why are shady billionaires trying to take over TikTok’s U.S. operations?
Wondering where my connection between socially constructed classifications and algorithms came from? Explore if you’re curious.