Corpus ID: 226237670

"It Is Just a Flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations

Authors: Kostantinos Papadamou, Savvas Zannettou, Jeremy Blackburn, Emiliano De Cristofaro, Gianluca Stringhini, Michael Sirivianos
YouTube has revolutionized the way people discover and consume videos, becoming one of the primary news sources for Internet users. Since content on YouTube is generated by its users, the platform is particularly vulnerable to misinformative and conspiratorial videos. Worse, the role played by YouTube's recommendation algorithm in unwittingly promoting questionable content is not well understood and may exacerbate the problem. This can have dire real-world consequences…


An Audit of Misinformation Filter Bubbles on YouTube: Bubble Bursting and Recent Behavior Changes
The key finding is that bursting a filter bubble is possible, although it manifests differently from topic to topic; moreover, filter bubbles do not truly appear in some situations.
VoterFraud2020: a Multi-modal Dataset of Election Fraud Claims on Twitter
Preliminary analyses of the data show that Twitter's ban actions mostly affected a specific community of voter fraud claim promoters, and expose the most common URLs, images, and YouTube videos shared in the data.
Towards Continuous Automatic Audits of Social Media Adaptive Behavior and its Role in Misinformation Spreading
The authors argue for continuous and automatic auditing of social media adaptive behavior, and for applying weak supervision, semi-supervised learning, and human-in-the-loop techniques to automated data annotation.
Auditing Source Diversity Bias in Video Search Results Using Virtual Agents
It is found that source diversity varies substantially depending on the language, with English queries returning more diverse outputs, and that a single platform, YouTube, is disproportionately present in the top search outputs of all Western search engines except Google.
Characterizing YouTube and BitChute Content and Mobilizers During U.S. Election Fraud Discussions on Twitter
While BitChute videos promoting election fraud claims were linked to and engaged with in the Twitter discussion, they played a relatively small role compared to YouTube videos promoting fraud claims. This core finding points to the continued need for proactive, consistent, and collaborative content moderation solutions rather than the reactive and inconsistent solutions currently in use.
Exposure to Alternative & Extremist Content on YouTube
The Belfer Fellowship was established by the Robert Belfer Family to support innovative research and thought-leadership on combating online hate and harassment for all. Fellows are drawn from the…
Analyzing Disinformation and Crowd Manipulation Tactics on YouTube
YouTube, since its inception in 2005, has grown to become the largest online video-sharing website. Its massive user base uploads videos and generates discussion by commenting on them. Lately,…
Understanding the Incel Community on YouTube
It is found that the Incel community on YouTube is gaining traction and that, over the last decade, the number of Incel-related videos and comments rose substantially; this is alarming, as such content is likely to espouse toxic and misogynistic views.
"You Know What to Do": Proactive Detection of YouTube Videos Targeted by Coordinated Hate Attacks
This paper proposes an automated solution to identify YouTube videos that are likely to be targeted by coordinated harassers from fringe communities like 4chan, using an ensemble of classifiers to estimate the likelihood that a video will be raided, with strong results.
Measuring Misinformation in Video Search Platforms: An Audit Study on YouTube
YouTube still has a long way to go to mitigate misinformation on its platform: a filter-bubble effect appears in both the Top 5 and Up-Next recommendations for all topics except vaccine controversies, and for the affected topics, watching videos that promote misinformation leads to more misinformative video recommendations.
A Longitudinal Analysis of YouTube's Promotion of Conspiracy Videos
A classifier for automatically determining whether a video is conspiratorial is developed, and a year-long picture of the videos actively promoted by YouTube is assembled, revealing trends in the so-called filter-bubble effect for conspiracy theories.
Disturbed YouTube for Kids: Characterizing and Detecting Inappropriate Videos Targeting Young Children
This work builds a classifier able to discern inappropriate content that targets toddlers on YouTube with 84.3% accuracy, and uses it to perform a first-of-its-kind, large-scale, quantitative characterization that reveals some of the risks of YouTube media consumption by young children.
Auditing radicalization pathways on YouTube
A large-scale audit of user radicalization on YouTube shows that the three channel types indeed increasingly share the same user base; that users consistently migrate from milder to more extreme content; and that a large percentage of users who now consume Alt-right content consumed Alt-lite and I.D.W. content in the past.
The Good, the Bad and the Bait: Detecting and Characterizing Clickbait on YouTube
A deep learning model based on variational autoencoders, which supports the diverse modalities of data that videos include, is devised and offers improved performance compared to conventional models.
Bias Misperceived: The Role of Partisanship and Misinformation in YouTube Comment Moderation
Investigating how channel partisanship and video misinformation affect the likelihood of comment moderation on YouTube finds that although comments on right-leaning videos are more heavily moderated from a correlational perspective, there is no evidence to support claims of political bias when using a causal model.
YouTube as a source of information on COVID-19: a pandemic of misinformation?
Over one-quarter of the most viewed YouTube videos on COVID-19 contained misleading information, reaching millions of viewers worldwide, highlighting the need to better use YouTube to deliver timely and accurate information and to minimise the spread of misinformation.