YouTube: The algorithm recommends unwatchable videos


Artificial intelligence delivers excellent results in many sectors, but Mozilla found that the one powering YouTube works terribly. The algorithm used by Google suggests to users numerous videos they would never want to see and that should not even be on the platform, because they clearly violate the terms of service. And the data published by YouTube at the beginning of April seem very far from reality.

YouTube has an intelligence problem

YouTube has received repeated criticism because its artificial intelligence suggests videos containing fake news, extremist content, and disinformation in general. Mozilla therefore decided to develop the RegretsReporter extension, which lets users evaluate the “damage” caused by the algorithm. After a 10-month study (July 2020 to May 2021), Mozilla released a report with shocking results.

Over 37,000 volunteer users in 91 countries used the extension to report 3,362 “regrettable” videos. 71% of the reported videos had been suggested by YouTube's own algorithm, and many of them did not comply with the service's guidelines. The problem is most evident in countries where English is not the primary language. Nearly 200 of the reported videos, which had accumulated 160 million views in total, were later removed.

The report mentions several videos featuring violence and dangerous content, spam, sexual content, and fake news of all kinds, much of it related to the COVID-19 pandemic. Brandi Geurkink, Mozilla's Senior Manager of Advocacy, said:

YouTube has to admit that its algorithm is designed to harm and misinform people. Our research confirms that YouTube not only hosts, but actively recommends videos that violate its own policies. We now also know that people in non-English-speaking countries are more likely to suffer the brunt of YouTube’s suggestion algorithm. Mozilla hopes these findings, which are just the tip of the iceberg, will convince the public and lawmakers of the urgent need for greater transparency in YouTube’s AI.

The Mozilla report also includes some recommendations for YouTube, among them having the algorithm audited by independent researchers, disclosing details about how it works, and adding an option to turn off personalized suggestions. According to the data published by YouTube, only 16-18 out of every 10,000 views are of videos that violate the guidelines.

A YouTube spokesperson stated:

Over 80 billion pieces of information are used to improve our systems, including responses to user surveys about what they want to watch. We are constantly working to improve the YouTube experience, and in the last year alone we have introduced over 30 different changes to reduce suggestions of harmful content. Thanks to these changes, consumption of borderline content resulting from our recommendations is now considerably less than 1%.
