YouTube’s tweaks to recommend fewer conspiracy videos seem to be working

Marc DeAngelis

March 03, 2020
 

One of the most important aspects of YouTube is its recommendation engine, as the vast majority of views and watch time come from suggested content rather than direct traffic. The platform does a good job of surfacing videos relevant to a given user, but when it comes to news and fact-based content, conspiracy theory videos can find their way in. In January 2019, after facing public backlash, YouTube promised to curb the number of conspiracy videos it pushes to users. A study by researchers at the University of California, Berkeley finds that these efforts do seem to be working: their analysis shows a 40% reduction in the likelihood of YouTube suggesting conspiracy-based content.

The team trained its classifiers by having them analyze hundreds of hand-labeled conspiracy and non-conspiracy videos, helping them “learn” how to differentiate between the two. Once trained, the classifiers analyzed eight million recommended videos over the course of a year. They were unable to classify the videos by actually watching them; instead, they relied on transcripts, metadata (titles, descriptions and tags), the top 200 comments on each video and the perceived intent of those comments. Based on the contents of these sources, each video was classified as either conspiratorial or not.
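For readers curious what that kind of classifier looks like in practice, here is a minimal sketch in Python using scikit-learn. It is not the Berkeley team’s actual code, and the field names and toy training examples are invented for illustration; it only demonstrates the general approach the study describes, which is to flatten each video’s transcript, metadata and top comments into a single text document and train a supervised model on hand-labeled examples.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def video_to_text(video):
    # Flatten the signals the study describes into one document:
    # transcript, title, description, tags and top comments.
    return " ".join([
        video["transcript"],
        video["title"],
        video["description"],
        " ".join(video["tags"]),
        " ".join(video["top_comments"]),
    ])

# Toy hand-labeled examples standing in for the hundreds of videos the
# researchers labeled by hand (1 = conspiratorial, 0 = not).
labeled_videos = [
    {"transcript": "the government is hiding the truth about the moon landing",
     "title": "What NASA won't tell you", "description": "secret plot exposed",
     "tags": ["coverup", "secret"], "top_comments": ["they are lying to us"]},
    {"transcript": "today we review the latest mirrorless camera",
     "title": "Camera review", "description": "hands-on impressions",
     "tags": ["camera", "review"], "top_comments": ["great video, thanks"]},
    {"transcript": "crisis actors staged the entire event, wake up",
     "title": "The truth they buried", "description": "what really happened",
     "tags": ["exposed"], "top_comments": ["share before it gets deleted"]},
    {"transcript": "how to bake sourdough bread at home step by step",
     "title": "Sourdough basics", "description": "a beginner's guide",
     "tags": ["baking"], "top_comments": ["my loaf turned out great"]},
]
labels = [1, 0, 1, 0]

# TF-IDF features plus a simple linear model, trained on the labeled set.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit([video_to_text(v) for v in labeled_videos], labels)

# Once trained, the same pipeline can score new recommended videos at scale.
new_video = {"transcript": "the bridge collapse was no accident, powerful people knew",
             "title": "What really happened", "description": "the hidden story",
             "tags": ["conspiracy"], "top_comments": ["I knew it"]}
print(model.predict([video_to_text(new_video)])[0])  # label: 1 = conspiratorial, 0 = not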

The researchers defined conspiracy theory videos as those that cover secret plots by those in power, ideas that run contrary to scientific consensus, or claims that are not backed by evidence or are unfalsifiable. These include harmful conspiracies, like those claiming that mourning Sandy Hook family members are actually “crisis actors,” as well as relatively innocuous ones, such as the assertion that the Mothman caused the collapse of the Silver Bridge in Point Pleasant, West Virginia.

The results show a dramatic reduction in the number of conspiracy theory videos that YouTube serves to its users. However, the study did not use an actual user account during the analysis; all content was viewed in a “logged out” state. This may have had a major impact on the results, since watch history is an important factor in how YouTube decides which videos to suggest to a logged-in user.

Diving into the content of the videos shows that YouTube may be selective about which conspiratorial subjects it suppresses. The rate at which climate change denial videos are suggested hasn’t changed much, while the aforementioned Sandy Hook videos have been noticeably reduced. Leaving the judgment of what constitutes a harmful conspiracy to YouTube’s employees could be a problem, but one could argue that suppressing conspiracy theory videos across the board would be an issue too. Non-harmful conspiracies can be fun (the strange antics of Andrew W.K. come to mind), and, after all, several conspiracy theories, such as MKUltra and Project Sunshine, turned out to be true.

On a broader level, these conspiracy theory videos are a problem because YouTube, with two billion active users every month, is an increasingly popular source for news and facts, and because of its monopoly on the longform social video market. On such an active platform, bad actors are increasingly leveraging YouTube as a forum for social engineering, disinformation and misinformation.
