YouTube is one of the most popular online platforms for watching videos, but it can also shape your political views. A study from UC Davis suggests that YouTube’s recommendation algorithm, which selects videos to show based on your previous viewing choices, can play a role in political radicalization.
The study found that the algorithm can create a “loop effect” in which users are exposed to increasingly biased and potentially extreme political content. This can drive polarization and radicalization among both right-wing and left-wing users.
The researchers assigned each video a score from -1 (far left) to +1 (far right) based on the ratio of left-leaning to right-leaning accounts that shared the video on Twitter. They then simulated how users with different political preferences would interact with the algorithm and the videos it recommends.
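To make that scoring concrete, here is a minimal sketch of how a share-based ideology score could be computed. The study’s exact formula is not reproduced here; the signed-ratio form below, along with the function and parameter names, is an illustrative assumption.

```python
def ideology_score(left_shares: int, right_shares: int) -> float:
    """Score a video from -1 (far left) to +1 (far right) based on
    which side's accounts shared it (signed-ratio form is assumed)."""
    total = left_shares + right_shares
    if total == 0:
        return 0.0  # no partisan shares observed: treat as neutral
    return (right_shares - left_shares) / total
```

Under this sketch, a video shared by 90 left-leaning and 10 right-leaning accounts scores `ideology_score(90, 10) == -0.8`, while an even split scores 0.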
They found that the algorithm tends to reinforce the user’s existing political leaning and nudge them toward more biased content. For example, if a user watches many right-wing videos, the algorithm will recommend more right-wing videos and fewer left-wing ones. This can create a feedback loop in which the user becomes increasingly isolated from alternative viewpoints and more susceptible to radicalization.
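The feedback loop can be illustrated with a toy simulation. Nothing below comes from the study itself: the proximity-based recommender, the drift rule, and all names and parameters are assumptions chosen only to show how watching what is recommended can narrow what gets recommended next.

```python
import random

def recommend(leaning: float, catalog: list[float], k: int = 10) -> list[float]:
    """Toy recommender: return the k videos whose ideology scores are
    closest to the user's current leaning ("more of what you watched")."""
    return sorted(catalog, key=lambda score: abs(score - leaning))[:k]

def simulate(steps: int = 100, start: float = 0.3, drift: float = 0.1) -> None:
    """Run the loop: the user watches one recommended video per step,
    and their leaning shifts slightly toward what they watched."""
    random.seed(42)
    catalog = [random.uniform(-1.0, 1.0) for _ in range(1000)]
    leaning = start
    for _ in range(steps):
        watched = random.choice(recommend(leaning, catalog))
        leaning += drift * (watched - leaning)
    # Count how many final recommendations come from the opposite side
    # of the spectrum relative to where the user started.
    final_slate = recommend(leaning, catalog)
    opposite = sum(1 for s in final_slate if s * start < 0)
    print(f"final leaning: {leaning:+.2f}, "
          f"opposite-side recs: {opposite}/{len(final_slate)}")

simulate()
```

Even in this crude model, the recommendation slate quickly collapses onto the user’s side of the spectrum, with opposite-side content vanishing entirely, which is the isolation from alternative viewpoints the researchers describe.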
“The system is doing what it’s supposed to be doing, but it has issues that are external to the system. Unless you willingly choose to break out of that loop, all the recommendations on that system will be zeroing on that one particular niche interest that they’ve identified. This can lead to partisanship and increases the divide that is facing American society,” said Muhammad Haroon, a computer science Ph.D. student at UC Davis, who led the study.
However, the study also found that the algorithm has some mechanisms that pull users away from political extremes. For example, the algorithm sometimes recommends popular videos that are not related to politics, such as music, sports, or entertainment. These videos can act as a “bridge” between different political groups and reduce polarization.
Another study published in PNAS Nexus found that the algorithm’s pull away from political extremes is asymmetric, meaning that it is stronger for users who watch far-right content than for users who watch far-left content. The study also found that the algorithm’s recommendations skew left even when the user does not have a watch history, suggesting that the algorithm has an inherent political bias.
These findings raise questions about whether recommendation algorithms should exhibit political biases and about the societal implications such biases could entail. The researchers suggest that more research and potential regulation are needed to prevent political radicalization and polarization on online platforms like YouTube.
Relevant articles:
– YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest
– Do YouTube Recommendations Foster Political Radicalization?
– YouTube’s recommendation algorithm is left-leaning in the United States …
– Algorithmic Extremism: Examining YouTube’s Rabbit Hole of Radicalization