YouTube Recommendations Reinforce Negative Emotions: Auditing Algorithmic Bias with Emotionally-Agentic Sock Puppets
Abstract: Personalized recommendation algorithms, like those on YouTube, significantly shape online content consumption. These systems aim to maximize engagement by learning users' preferences and aligning content accordingly, but they may unintentionally reinforce impulsive and emotional biases. Using a sock-puppet audit methodology, this study examines YouTube's capacity to recognize and reinforce emotional preferences. Simulated user accounts with assigned emotional preferences navigate the platform, selecting videos that align with those preferences and recording the subsequent recommendations. Our findings reveal that YouTube amplifies negative emotions, such as anger and grievance, by increasing their prevalence and prominence in recommendations. This reinforcement intensifies over time and persists across contexts. Surprisingly, contextual recommendations often exceed personalized ones in reinforcing emotional alignment. These findings suggest that the algorithm amplifies user biases, contributing to emotional filter bubbles and raising concerns about user well-being and societal impacts. The study emphasizes the need to balance personalization with content diversity and user agency.
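The abstract describes the audit protocol only at a high level. The following is a minimal sketch of what a sock-puppet audit loop of this kind could look like: a simulated account is assigned an emotional preference, repeatedly "clicks" recommendations that match it, and the share of preference-aligned recommendations is tracked over time. The `get_recommendations` stub, the emotion labels, and all function names are hypothetical illustrations, not the paper's actual crawler, classifier, or measurement code.

```python
"""Hedged sketch of a sock-puppet audit loop (not the authors' implementation)."""
import random
from dataclasses import dataclass, field

# Assumed emotion vocabulary; the paper's label set may differ.
EMOTIONS = ["anger", "grievance", "joy", "sadness", "neutral"]


@dataclass
class Video:
    video_id: str
    emotion: str  # emotion label, assumed to come from some classifier


@dataclass
class SockPuppet:
    preference: str                      # emotion this puppet is primed to prefer
    history: list = field(default_factory=list)


def get_recommendations(puppet: SockPuppet, n: int = 20) -> list[Video]:
    """Placeholder for the platform interaction layer.

    In a real audit this would load the watch page for the puppet's account
    and parse the recommendation panel; here it returns random-emotion videos
    so the sketch is self-contained and runnable.
    """
    return [Video(f"vid{random.randrange(10**6)}", random.choice(EMOTIONS))
            for _ in range(n)]


def choose_video(puppet: SockPuppet, recs: list[Video]) -> Video:
    """Puppet 'clicks' a recommendation matching its assigned emotion,
    falling back to a random pick when no match is shown."""
    matches = [v for v in recs if v.emotion == puppet.preference]
    return random.choice(matches or recs)


def audit(preference: str, steps: int = 50) -> list[float]:
    """Run one puppet for `steps` watch cycles and record, at each step,
    the share of recommendations aligned with its assigned emotion."""
    puppet = SockPuppet(preference)
    prevalence = []
    for _ in range(steps):
        recs = get_recommendations(puppet)
        share = sum(v.emotion == preference for v in recs) / len(recs)
        prevalence.append(share)
        puppet.history.append(choose_video(puppet, recs))  # simulated "watch"
    return prevalence


if __name__ == "__main__":
    trace = audit("anger")
    print(f"mean share of anger-aligned recommendations: {sum(trace) / len(trace):.2f}")
```

In the study's framing, reinforcement would show up as the prevalence trace rising over successive watch cycles for puppets assigned negative emotions; the stub above produces a flat baseline by construction.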