'Fiction is outperforming reality': how YouTube's algorithm distorts truth
An ex-YouTube insider reveals how its recommendation algorithm promotes divisive clips and conspiracy videos. Did they harm Hillary Clinton’s bid for the presidency?
by Paul Lewis in San Francisco
It was one of January’s most viral videos. Logan Paul, a YouTube celebrity, stumbles across a dead man hanging from a tree. The 22-year-old, who is in a Japanese forest famous as a suicide spot, is visibly shocked, then amused. “Dude, his hands are purple,” he says, before turning to his friends and giggling. “You never stand next to a dead guy?”
Paul, who has 16 million mostly teenage subscribers to his YouTube channel, removed the video 24 hours later amid a furious backlash. That was still long enough for the footage to receive 6m views and a spot on YouTube's coveted list of trending videos.
The next day, I watched a copy of the video on YouTube. Then I clicked on the "Up next" thumbnails of recommended videos that YouTube showcases on the right-hand side of the video player. This conveyor belt of clips, which auto-plays by default, is designed to entice us into spending more time on Google's video broadcasting platform. I was curious where it might lead.
The answer was a slew of videos of men mocking distraught teenage fans of Logan Paul, followed by CCTV footage of children stealing things and, a few clicks later, a video of children having their teeth pulled out with bizarre, homemade contraptions.
https://www.theguardian.com/technology/ ... orts-truth
Video sums it up.