Interesting People mailing list archives
YouTube, the Great Radicalizer
From: "Dave Farber" <farber () gmail com>
Date: Sun, 11 Mar 2018 16:18:43 -0400
Begin forwarded message:
From: Dewayne Hendricks <dewayne () warpspeed com>
Date: March 11, 2018 at 11:49:50 AM EDT
To: Multiple recipients of Dewayne-Net <dewayne-net () warpspeed com>
Subject: [Dewayne-Net] YouTube, the Great Radicalizer
Reply-To: dewayne-net () warpspeed com

[Note: This item comes from friend David Rosenthal. DLH]

YouTube, the Great Radicalizer
By Zeynep Tufekci
Mar 10 2018
<https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html>

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11. As with the Trump videos, YouTube was recommending content ever more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons. It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes.
Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.

What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with, or to incendiary content in general.

Is this suspicion correct? Good data is hard to come by; Google is loath to share information with independent researchers. But we now have the first inklings of confirmation, thanks in part to a former Google engineer named Guillaume Chaslot.

Mr. Chaslot worked on the recommender algorithm while at YouTube. He grew alarmed at the tactics used to increase the time people spent on the site. Google fired him in 2013, citing his job performance. He maintains the real reason was that he pushed too hard for changes in how the company handles such issues.

The Wall Street Journal conducted an investigation of YouTube content with the help of Mr. Chaslot. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos.

It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content. In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations.
He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.

[snip]

Dewayne-Net RSS Feed: http://dewaynenet.wordpress.com/feed/
Twitter: https://twitter.com/wa8dzp