Interesting People mailing list archives

The Poison on Facebook and Twitter Is Still Spreading


From: "Dave Farber" <farber () gmail com>
Date: Sat, 20 Oct 2018 19:49:23 +0900




Begin forwarded message:

From: Dewayne Hendricks <dewayne () warpspeed com>
Date: October 20, 2018 at 7:32:57 PM GMT+9
To: Multiple recipients of Dewayne-Net <dewayne-net () warpspeed com>
Subject: [Dewayne-Net] The Poison on Facebook and Twitter Is Still Spreading
Reply-To: dewayne-net () warpspeed com

The Poison on Facebook and Twitter Is Still Spreading
Social platforms have a responsibility to address misinformation as a systemic problem, instead of reacting to case 
after case.
By NYT Editorial Board
Oct 19 2018
<https://www.nytimes.com/2018/10/19/opinion/facebook-twitter-journalism-misinformation.html>

A network of Facebook troll accounts operated by the Myanmar military parrots hateful rhetoric against Rohingya 
Muslims. Viral misinformation runs rampant on WhatsApp in Brazil, even as marketing firms there buy databases of 
phone numbers in order to spam voters with right-wing messaging. Homegrown campaigns spread partisan lies in the 
United States.

The public knows about each of these incitements because of reporting by news organizations. Social media 
misinformation is becoming a newsroom beat in and of itself, as journalists find themselves acting as unpaid content 
moderators for these platforms.

It’s not just reporters, either. Academic researchers and self-taught vigilantes alike comb through networks of 
misinformation on social media platforms, their findings prompting — or sometimes failing to prompt — the takedown 
of propaganda.

It’s the latest iteration of a journalistic cottage industry that started out by simply comparing and contrasting 
questionable moderation decisions — the censorship of a legitimate news article, perhaps, or an example of terrorist 
propaganda left untouched. Over time, the stakes have become greater and greater. Once upon a time, the big Facebook 
censorship controversy was the banning of female nipples in photos. That feels like an idyllic bygone era never to 
return.

The internet platforms will always make some mistakes, and it’s not fair to expect otherwise. And the task before 
Facebook, YouTube, Twitter, Instagram and others is admittedly herculean. No one can screen everything in the fire 
hose of content produced by users. Even if a platform makes the right call on 99 percent of its content, the 
remaining 1 percent can still be millions upon millions of postings. The platforms are due some forgiveness in this 
respect. 

It’s increasingly clear, however, that at this stage of the internet’s evolution, content moderation can no longer be 
reduced to individual postings viewed in isolation and out of context. The problem is systemic, currently manifested 
in the form of coordinated campaigns both foreign and homegrown. While Facebook and Twitter have been making strides 
toward proactively staving off dubious influence campaigns, a tired old pattern is re-emerging — journalists and 
researchers find a problem, the platform reacts, and the whole cycle begins anew.

This week, a question from The New York Times prompted Facebook to take down a network of accounts linked to the 
Myanmar military. Although Facebook was already aware of the problem in general, the request for comment from The 
Times flagged specific instances of “seemingly independent entertainment, beauty and informational pages” that were 
tied to a military operation that sowed the internet with anti-Rohingya sentiment.

The week before, The Times found a number of suspicious pages spreading viral misinformation about Christine Blasey 
Ford, the woman who has accused Brett Kavanaugh of assault. After The Times showed Facebook some of those pages, the 
company said it had already been looking into the issue. Facebook took down the pages flagged by The Times, but 
similar pages that hadn’t yet been shown to the company stayed up.

It’s not just The Times, and it’s not just Facebook. Again and again, the act of reporting out a story gets reduced 
to outsourced content moderation.

[snip]

Dewayne-Net RSS Feed: http://dewaynenet.wordpress.com/feed/
Twitter: https://twitter.com/wa8dzp




