Mozilla concludes that YouTube’s “dislike” button barely influences suggestions


An investigation by Mozilla shows that YouTube’s “Dislike” button and its other feedback controls are largely ineffective when a user wants the platform’s algorithm to stop suggesting certain types of videos. The foundation behind Firefox concluded that such controls “prevent less than half of unwanted algorithmic recommendations.”

To carry out the research, in which 22,722 volunteers participated, Mozilla used data collected through RegretsReporter, its browser extension that lets users voluntarily “donate” their recommendation data for studies like this one. With so many participants, the foundation was able to base its report on millions of recommended videos, along with anecdotal reports from thousands of people.

Mozilla tested the effectiveness of four different controls on YouTube: the “Dislike” button, the classic thumbs-down whose public counter has long been hidden by default; “Not interested”; “Don’t recommend channel”; and deleting the watch history. The researchers found that these had varying degrees of effectiveness in stopping suggestions of certain content, but that overall their impact was “small and inadequate.”

Delving into the data, the most effective was “Don’t recommend channel,” which prevented 43% of unwanted recommendations. At the opposite extreme is “Not interested,” with a meager effectiveness of 11%. The classic “Dislike” button fares little better at 12%, while removing watch history raises effectiveness to 29%.

Besides demonstrating how ineffective YouTube’s controls are, Mozilla’s report also shows that users are willing to go to great lengths to avoid unwanted recommendations, employing methods such as logging out and viewing content through a VPN. The foundation emphasizes that the video platform should explain its controls better and provide more proactive ways for users to define what they want to see.


Mozilla says that YouTube and similar platforms rely on large amounts of passively collected data to infer user preferences. The institution behind Firefox considers this approach somewhat paternalistic, because it leaves the platforms making the final decisions about content, when they should instead be asking users what they want from the platform.

Mozilla’s report on YouTube is not an isolated outburst, but part of a broader wave of concern about the algorithms used by large platforms. In the European Union, the Digital Services Act requires platforms to explain how their recommendation algorithms work and to open them up to external researchers, while similar, though not necessarily as ambitious, initiatives are being promoted in the United States.

In short, a user’s activity, at least when they choose to reject content, has little impact on the recommendations YouTube ultimately makes. What Mozilla has done is put concrete numbers on something that most YouTube users may have already perceived.