Mozilla researchers analyzed seven months of YouTube activity from more than 20,000 participants to assess four ways YouTube says people can “adjust their recommendations”: Dislike, “Not interested,” “Remove from history,” and “Don’t recommend this channel.” They wanted to know how effective these controls really are.
Each participant installed a browser extension that added a “stop recommending” button on top of every YouTube video they watched, as well as those in the sidebar. Pressing it triggered one of the algorithm’s four tuning responses each time.
Dozens of research assistants then reviewed the rejected videos to see how closely they resembled the tens of thousands of recommendations YouTube subsequently served to the same users. They found that YouTube’s controls had a “negligible” effect on the recommendations participants received. Over the seven months, a single rejected video generated, on average, about 115 bad recommendations: videos closely resembling the ones participants had told YouTube they didn’t want to see.
Previous research suggests that YouTube’s practice of recommending videos you are likely to approve of, and rewarding controversial content, can sharpen people’s opinions and lead them toward political radicalization. The platform has also been repeatedly criticized for recommending sexually explicit or suggestive videos to children, pushing content that violates its own policies into wide circulation. Following scrutiny, YouTube pledged to crack down on hate speech, better enforce its guidelines, and stop using its recommendation algorithm to promote “borderline” content.
However, the study found that content that appeared to violate YouTube’s own policies was still actively recommended to users even after they had submitted negative feedback.
Pressing Dislike, the most visible way to give negative feedback, stopped only 12 percent of bad recommendations; “Not interested” stopped just 11 percent. YouTube advertises both options as ways to fine-tune its algorithm.
“Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” says YouTube spokesperson Elena Hernandez. Hernandez also says that Mozilla’s report does not take into account how YouTube’s algorithm actually works. But that is something no one outside YouTube really knows, given the billions of inputs that feed the algorithm and the company’s limited transparency. Mozilla’s study attempts to peer into that black box to better understand its outputs.