A Longitudinal Analysis of YouTube's Promotion of Conspiracy Videos

March 06, 2020 · Declared Dead · 🏛 arXiv.org

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Marc Faddoul, Guillaume Chaslot, Hany Farid
arXiv ID: 2003.03318
Category: cs.CY: Computers & Society
Cross-listed: cs.HC, cs.IR, cs.SI
Citations: 88
Venue: arXiv.org
Last Checked: 2 months ago
Abstract
Conspiracy theories have flourished on social media, raising concerns that such content is fueling the spread of disinformation, supporting extremist ideologies, and in some cases, leading to violence. Under increased scrutiny and pressure from legislators and the public, YouTube announced efforts to change their recommendation algorithms so that the most egregious conspiracy videos are demoted and demonetized. To verify this claim, we have developed a classifier for automatically determining if a video is conspiratorial (e.g., the moon landing was faked, the pyramids of Giza were built by aliens, end of the world prophecies, etc.). We coupled this classifier with an emulation of YouTube's watch-next algorithm on more than a thousand popular informational channels to obtain a year-long picture of the videos actively promoted by YouTube. We also obtained trends of the so-called filter-bubble effect for conspiracy theories.
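The abstract's pipeline, classifying each recommended video and then aggregating over emulated watch-next lists, can be sketched roughly as follows. This is a hypothetical illustration only: the keyword matcher stands in for the paper's trained classifier, and all function and field names are invented, not the authors' implementation.

```python
# Sketch of the measurement loop described in the abstract: given the
# watch-next recommendations emulated for a channel, flag conspiratorial
# videos and report the fraction flagged. The keyword "classifier" below
# is a toy stand-in; the paper trains a real classifier for this step.

def classify_conspiratorial(video: dict) -> bool:
    """Toy stand-in classifier: flag a video whose title contains one of
    the conspiracy topics named in the abstract."""
    keywords = ("moon landing", "aliens", "end of the world")
    title = video["title"].lower()
    return any(k in title for k in keywords)

def conspiracy_rate(watch_next: list[dict]) -> float:
    """Fraction of watch-next recommendations flagged as conspiratorial."""
    if not watch_next:
        return 0.0
    flagged = sum(classify_conspiratorial(v) for v in watch_next)
    return flagged / len(watch_next)

# Example: four emulated watch-next recommendations, two flagged.
recs = [
    {"title": "Was the Moon Landing Faked? The Truth"},
    {"title": "How Rockets Work"},
    {"title": "Pyramids: Aliens Built Them?"},
    {"title": "Daily News Roundup"},
]
print(conspiracy_rate(recs))  # 0.5
```

Repeating this daily across the thousand-plus channels, as the paper does over a year, yields the longitudinal promotion trend the abstract describes.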

📜 Similar Papers

In the same crypt — Computers & Society

R.I.P. 👻 Ghosted

Green AI

Roy Schwartz, Jesse Dodge, ... (+2 more)

cs.CY ๐Ÿ› arXiv ๐Ÿ“š 1.5K cites 6 years ago

Died the same way — 👻 Ghosted