# The Impact of YouTube Algorithms on Personal Beliefs and Friendships
## Understanding the Divide
I have a strong disagreement with a close friend regarding vaccinations; she identifies as an anti-vaxxer. Despite her intelligence and generally warm nature, she now shares unsettling conspiracy theory videos with me. These clips range from unfounded assertions that vaccines pose risks to claims about medical authorities implanting microchips. It’s evident that her intentions stem from a place of concern, particularly for those receiving the Covid-19 vaccine, which she firmly believes to be hazardous.
Engaging in dialogue with her has become increasingly difficult, as underlying tensions often lead to heated exchanges. Given the prominence of Covid-19 and vaccinations in our lives, every conversation now demands a mutual effort to avoid conflict. Interestingly, she hasn’t always held these views. The recent birth of her child, coupled with the pandemic, seems to have intensified her beliefs, with YouTube playing a significant role in this transformation.
## YouTube's Influence
The arrival of her baby triggered a deep dive into YouTube, driven by fear for her child's well-being. She sought information on the potential—albeit nonexistent—dangers associated with vaccines. Fear can be an incredibly motivating force; it can lead individuals to consume content obsessively. YouTube's algorithm capitalizes on this fear, continually suggesting videos that keep users engaged and watching advertisements for extended periods. In her case, YouTube exploited her anxieties surrounding her child's health and directed her down a "rabbit hole" of alarming content.
The platform's revenue model thrives on user engagement: the more videos someone watches, the more advertisements YouTube can display. Once a video concludes, YouTube's recommendation algorithm, powered by artificial intelligence (AI), aims to keep viewers on the site. Since Google Brain took over the recommendation system in 2015, recommended videos have come to account for roughly 70% of total watch time, and, according to the Pew Research Center, more than 80% of users say they at least occasionally watch the videos the algorithm suggests.
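To make that incentive concrete, here is a minimal, purely illustrative sketch of what an engagement-optimized ranker can look like. Everything in it is an assumption for the sake of the example: the `Video` fields, the scoring formula, and the candidate titles are hypothetical, not a description of YouTube's actual system.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # model's estimate of how long this viewer would watch
    predicted_click_prob: float     # model's estimate that the viewer clicks at all

def expected_engagement(video: Video) -> float:
    # Expected minutes of attention: click probability times predicted watch time.
    return video.predicted_click_prob * video.predicted_watch_minutes

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    # Rank purely by expected engagement -- nothing here measures accuracy or harm.
    return sorted(candidates, key=expected_engagement, reverse=True)[:k]

candidates = [
    Video("How to sterilize baby bottles", 4.0, 0.30),
    Video("Pediatrician answers vaccine questions", 6.0, 0.25),
    Video("What THEY won't tell you about vaccines", 12.0, 0.40),
]

for v in recommend(candidates):
    print(v.title)
# The alarmist video ranks first simply because it is predicted to hold attention longest.
```

The point of the toy example is that nothing in the ranking function asks whether a video is accurate, only whether it will keep someone watching.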
The speed at which these algorithms operate has surpassed early expectations. In the past, it could take days for a user's viewing habits to influence future recommendations. Now, this process occurs within minutes or hours, as noted by Todd Beaupre, a product manager for YouTube’s discovery team.
A simple search on YouTube can lead to an array of content, from mundane topics like "how to tie my shoes" to extreme discussions about vaccine dangers. After watching a cooking tutorial, you might suddenly be served recommendations linking culinary skills to conspiracy theories involving the Illuminati. The algorithm seems to have learned that fringe content can be particularly captivating.
## The Dangers of Information Overload
Information is abundant in today's digital landscape; however, it’s not the information itself that holds value, but rather the ability to navigate it effectively. In a world inundated with content, filtering out misleading or harmful material is crucial. Unfortunately, the drive to maximize engagement leads to algorithmic recommendations that can steer users toward dangerous content.
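As a contrast to the engagement-only sketch above, here is an equally hypothetical variant in which each candidate carries a credibility signal that scales its score. The 0-to-1 credibility weights, and the idea that such a clean signal exists at all, are assumptions for illustration, not a feature of any real platform.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float
    predicted_click_prob: float
    credibility: float  # hypothetical 0-1 signal, e.g. from source reputation

def credibility_weighted_score(video: Video) -> float:
    # Same engagement estimate as before, but scaled by the credibility signal,
    # so a gripping-but-misleading video no longer wins automatically.
    return video.predicted_click_prob * video.predicted_watch_minutes * video.credibility

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    return sorted(candidates, key=credibility_weighted_score, reverse=True)[:k]

candidates = [
    Video("Pediatrician answers vaccine questions", 6.0, 0.25, 0.9),
    Video("What THEY won't tell you about vaccines", 12.0, 0.40, 0.1),
]

for v in recommend(candidates):
    print(v.title)
# With the weight applied, the credible video outranks the alarmist one (1.35 vs 0.48).
```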
Consider walking into a bookstore and asking for a recommendation on car repair. An employee whose goal is to help might hand you an excellent book that answers all your questions and sends you on your way. An employee whose goal is to keep you coming back, however, might steer you toward a sensationalized title claiming the automotive industry conspires to keep cars malfunctioning. The second choice all but guarantees return visits, because you would still need more resources to address your original problem. Recommendation systems optimized for engagement behave like the second employee.
## Vulnerability and Algorithmic Recommendations
How many individuals, facing legitimate fears, have been led astray by misguided recommendations and radicalization? Groups that exploit the vulnerable—such as anti-vaxxers, extremist organizations, and other fringe movements—can thrive on YouTube searches related to identity or health, benefiting from the platform's algorithmic biases.
The gradual loss of my friend has been disheartening. What began as innocent discussions has evolved into a chasm too wide to bridge. No matter what I say, she interprets it through a lens of skepticism, attributing my views to the influence of pharmaceutical companies. To her, the solution is simple: I must open my eyes and watch her preferred videos. The pandemic has strained relationships and fractured communities. We all desire resolution, yet for individuals like my friend, this is merely the beginning of a deeper conflict.
It's essential to acknowledge that YouTube is not solely to blame. However, its role in her radicalization has been significant. I often wonder how different things might have been if balanced sources had been recommended. Perhaps she would have turned away from YouTube altogether or found reassurance in credible information that addressed her fears from the outset.