Auditing radicalization pathways on YouTube

@inproceedings{Ribeiro2020AuditingRP,
  title={Auditing radicalization pathways on YouTube},
  author={Manoel Horta Ribeiro and Raphael Ottoni and Robert West and Virg{\'i}lio A. F. Almeida and Wagner Meira Jr},
  booktitle={Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency},
  year={2020}
}
Non-profits, as well as the media, have hypothesized the existence of a radicalization pipeline on YouTube, claiming that users systematically progress towards more extreme content on the platform. […] Overall, we paint a comprehensive picture of user radicalization on YouTube.
YouTube, The Great Radicalizer? Auditing and Mitigating Ideological Biases in YouTube Recommendations
TLDR
A systematic audit of YouTube's recommendation system finds that YouTube's recommendations do direct users – especially right-leaning users – to ideologically biased and increasingly radical content in both homepage and up-next recommendations, but this bias can be mitigated through an intervention.
Algorithmic Extremism: Examining YouTube's Rabbit Hole of Radicalization
TLDR
YouTube's recommendation algorithm is shown to favor mainstream media and cable news content over independent YouTube channels, with a slant towards left-leaning or politically neutral channels, suggesting that it does not promote inflammatory or radicalized content.
Evaluating Platform Accountability: Terrorist Content on YouTube
YouTube has traditionally been singled out as particularly influential in the spread of ISIS content. However, the platform, along with Facebook, Twitter, and Microsoft, jointly created the Global Internet Forum to Counter Terrorism…
Are Anti-Feminist Communities Gateways to the Far Right? Evidence from Reddit and YouTube
TLDR
This paper quantitatively studies the migratory patterns between a variety of groups within the Manosphere and the Alt-right, a loosely connected far-right movement that has been particularly active in mainstream social networks.
What is BitChute? Characterizing the "Free Speech" Alternative to YouTube
TLDR
The results suggest that BitChute has a higher rate of hate speech than Gab but less than 4chan, and that while some BitChute content producers have been banned from other platforms, many maintain profiles on mainstream social media platforms, particularly YouTube.
A Longitudinal Analysis of YouTube's Promotion of Conspiracy Videos
TLDR
A classifier for automatically determining whether a video is conspiratorial is developed, and a year-long picture of the videos actively promoted by YouTube is obtained, revealing trends in the so-called filter-bubble effect for conspiracy theories.
COVID-19 Induced Misinformation on YouTube: An Analysis of User Commentary
Several scholars have demonstrated a positive link between political polarization and resistance to COVID-19 prevention measures. At the same time, political polarization has also been associated…
Infodemics on Youtube: Reliability of Content and Echo Chambers on COVID-19
TLDR
A massive data analysis of YouTube provides evidence for the existence of echo chambers along two dimensions, the political bias and the trustworthiness of information channels, and observes that the echo-chamber structure cannot be reproduced after properly randomizing users' interaction patterns.
Subscriptions and external links help drive resentful users to alternative and extremist YouTube videos
Do online platforms facilitate the consumption of potentially harmful content? Despite widespread concerns that YouTube's algorithms send people down "rabbit holes" with recommendations to extremist…
Auditing the Biases Enacted by YouTube for Political Topics in Germany
TLDR
This research presents a meta-politics of information and media policy that aims to explain the current state of public opinion and formulate a strategy to address this state of affairs.
…

References

Analyzing Right-wing YouTube Channels: Hate, Violence and Discrimination
TLDR
Analysis of issues related to hate, violence and discriminatory bias in a dataset containing more than 7,000 videos and 17 million comments shows that right-wing channels tend to contain a higher degree of words from "negative" semantic fields.
Disturbed YouTube for Kids: Characterizing and Detecting Disturbing Content on YouTube
TLDR
This work develops a classifier able to detect toddler-oriented inappropriate content on YouTube with 82.8% accuracy, and uses it to perform a first-of-its-kind, large-scale, quantitative characterization that reveals some of the risks of YouTube media consumption by young children.
Mining YouTube to Discover Extremist Videos, Users and Hidden Communities
TLDR
A semi-automated system to assist law enforcement and intelligence agencies dealing with cyber-crime related to the promotion of hate and radicalization on the Internet is described, along with a systematic study of the features and properties of the data and hidden social networks, which has implications for understanding extremism on the Internet.
The web centipede: understanding how web communities influence each other through the lens of mainstream and alternative news sources
TLDR
The results indicate that alt-right communities within 4chan and Reddit can have a surprising level of influence on Twitter, providing evidence that "fringe" communities often succeed in spreading alternative news to mainstream social networks and the greater Web.
Disturbed YouTube for Kids: Characterizing and Detecting Inappropriate Videos Targeting Young Children
TLDR
This work builds a classifier able to discern inappropriate content that targets toddlers on YouTube with 84.3% accuracy, and uses it to perform a first-of-its-kind, large-scale, quantitative characterization that reveals some of the risks of YouTube media consumption by young children.
Down the (White) Rabbit Hole: The Extreme Right and Online Recommender Systems
TLDR
The evidence presented in this article supports a shift away from the almost exclusive focus on users as content creators and protagonists in extremist cyberspaces, toward also considering online platform providers as important actors in these same spaces.
A focused crawler for mining hate and extremism promoting videos on YouTube.
TLDR
A focused-crawler-based approach is presented, consisting of various components that perform several tasks: search strategy or algorithm, node-similarity computation metric, learning from exemplary profiles serving as training data, stopping criterion, node classifier, and queue manager.
On the Origins of Memes by Means of Fringe Web Communities
TLDR
This paper detects and measure the propagation of memes across multiple Web communities, using a processing pipeline based on perceptual hashing and clustering techniques, and a dataset of 160M images from 2.6B posts gathered from Twitter, Reddit, 4chan's Politically Incorrect board, and Gab, over the course of 13 months.
Digital Discrimination: The Case of Airbnb.com
Online marketplaces often contain information not only about products, but also about the people selling the products. In an effort to facilitate trust, many platforms encourage sellers to provide…
…