The “thumbs down” button and other tools for influencing the YouTube algorithm are ineffective, according to a Mozilla study. Mozilla is demanding more control for users.
YouTube users have too little ability to influence the recommendation algorithm. That is the conclusion of a study published by the Mozilla Foundation on Tuesday. According to the study, the tools YouTube provides eliminate fewer than half of unwanted recommendations.
Mozilla examined four options YouTube offers for steering the algorithm manually: users can downvote videos with the “thumbs down” button, mark a video as “Not interested,” or block all recommendations from a particular channel. Finally, YouTube lets users remove a video from their watch history.
All of these tools influence future recommendations, but not enough, Mozilla concludes in its study. The data comes from RegretsReporter, a browser extension Mozilla developed itself. Volunteers install the extension to report unwanted YouTube videos to Mozilla and thereby support the investigation. According to the study (PDF), 22,722 users contributed data through the tool.
“Thumbs down” with low impact
By comparing the results with a control group, Mozilla determined how effective each tool is at influencing the algorithm. The “thumbs down” button reduced unwanted video recommendations by just 12 percent compared to control-group users who did not use any of the tools. The “Not interested” option performed about the same, at eleven percent.
In the test, for which more than 500 million videos were analyzed, the “Don’t recommend channel” option was more effective, filtering out 43 percent of unwanted videos. When users manually removed a video from their watch history, they saw 29 percent fewer unwanted recommendations.
Criticism from YouTube
Mozilla defined an inappropriate recommendation as a recommendation for a video similar to one the user had already rejected. To identify similar videos, Mozilla used human reviewers in addition to AI. The results are presented on a dedicated study website.
YouTube criticized Mozilla’s study in a statement to the US technology magazine The Verge: the tools are deliberately designed not to block all content related to a topic, in order to prevent echo chambers from forming. Mozilla, it said, did not take this into account in its investigation.
Mozilla calls for more tools
In addition to the quantitative study, Mozilla surveyed 2,700 people directly about their experiences with YouTube’s algorithms. Many respondents complained about the lack of ways to influence the algorithm. Among other things, users reported that the algorithm apparently did not remember their input for long, because unwanted recommendations reappeared after a certain time. Others described considerable effort to avoid inappropriate recommendations: at least one person reported logging out of YouTube entirely to watch certain videos.
The Mozilla Foundation is calling on YouTube’s parent company, Google, to give users stronger tools to influence the algorithm. In particular, users should be able to proactively shape their experience, Mozilla writes. YouTube should also make it easier for research teams to study the algorithm.