Interesting People mailing list archives

The Internet's "Nazi Purge" Shows Who Really Controls Our Online Speech


From: "Dave Farber" <farber () gmail com>
Date: Tue, 22 Aug 2017 08:45:52 -0400




Begin forwarded message:

From: Dewayne Hendricks <dewayne () warpspeed com>
Date: August 22, 2017 at 8:35:03 AM EDT
To: Multiple recipients of Dewayne-Net <dewayne-net () warpspeed com>
Subject: [Dewayne-Net] The Internet's "Nazi Purge" Shows Who Really Controls Our Online Speech
Reply-To: dewayne-net () warpspeed com

The Internet's "Nazi Purge" Shows Who Really Controls Our Online Speech
You might not worry about companies censoring Nazis. But you should be worried about the unelected bros of Silicon 
Valley being the judge and jury.
By Jillian York
Aug 21 2017
<https://www.buzzfeed.com/jillianyork/silicon-valleys-nazi-purge>

The Daily Stormer’s unceremonious booting from large swathes of the internet has made plenty of headlines; tech 
companies, the story goes, are “joining the resistance.” Silicon Valley is conducting a “Nazi purge,” and 
Charlottesville is “reshaping the fight against online hate.”

But the demise of this hateful website has also raised a new debate about an old problem: Silicon Valley’s control of 
our online speech.

Companies like Facebook and Twitter have been making hard decisions about hate speech for a long time. These 
platforms, as well as web-hosting companies and other intermediaries, are not governed by the First Amendment. 
Instead, they are governed by 47 U.S.C. § 230, known colloquially as “CDA 230,” which gives them immunity from 
liability for most of the content they host and leaves them free to host (or not host) whatever they want.

Those rights are important, but they also come with great responsibility. And I believe these companies are failing 
to live up to that responsibility.

The truth is, companies get these decisions wrong a lot of the time. And because they’re not transparent about how 
their rules are enforced or about how much content is taken down, we only hear about the bad decisions when they make 
headlines. That is happening increasingly often these days, as those in media circles take more interest in the issue.

Just this summer, Facebook used its hate speech policies to censor queer artists and activists for using words like 
“dyke” and “fag”; Twitter booted several leftist activists, apparently for engaging in uncivil counterspeech; and 
YouTube’s algorithms deleted masses of videos from the Syrian civil war that activists had archived for use in war 
crimes investigations.

This is nothing new. Over the years, I’ve watched as Silicon Valley companies have made globally important decisions 
that have stirred less debate than this week’s Daily Stormer episode. Last year, when Twitter boasted that it deleted 
235,000 “terrorism-related” accounts from its service, hardly anyone blinked. But in that case, as in this one, we 
need to ensure that these companies are accountable to their users, and that people have a path of recourse when they 
are wronged.

I’m not so worried about companies censoring Nazis, but I am worried about the implications for everyone else. 
I’m worried about the unelected bros of Silicon Valley being the judge and jury, and thinking that mere censorship 
solves the problem. I’m worried that, just like Cloudflare CEO Matthew Prince woke up one morning and decided he’d 
had enough of the Daily Stormer, some other CEO might wake up and do the same for Black Lives Matter or antifa. I’m 
worried that we’re not thinking about this problem holistically.

In the case of the Daily Stormer, companies were undoubtedly very aware of the site’s presence on their platforms and 
made not just a moral decision, but a business one as well. But that’s not how content moderation typically works: In 
most instances, companies rely on their users to report one another. The reports enter a queue that is then moderated 
either by humans — often low-wage workers abroad whose job requires them to look at horrible images so you don’t have 
to — or by algorithms. A decision is made and the content is either left up or removed.

Some platforms, like Facebook, mete out punishment to their users, temporarily suspending them for up to 30 days; 
others may boot users for their first or second infraction. Users can appeal these corporate decisions only in 
certain circumstances.

How comfortable you are with this kind of setup depends on your view of speech and who should police it. There are 
different kinds of free speech advocates — some believe that a pluralistic, democratic society is nothing without 
freedom of expression, and that we must protect the rights of all if we want to protect the most vulnerable. The 
“slippery slope” argument is popular, although it’s not always convincing.

[snip]

Dewayne-Net RSS Feed: http://dewaynenet.wordpress.com/feed/
Twitter: https://twitter.com/wa8dzp




