Interesting People mailing list archives

This Is So Much Bigger Than Facebook: Data misuse is a feature, not a bug—and it’s plaguing our entire culture.


From: "Dave Farber" <farber () gmail com>
Date: Mon, 2 Apr 2018 17:55:45 -0400




Begin forwarded message:

From: the keyboard of geoff goodfellow <geoff () iconia com>
Date: April 2, 2018 at 5:48:30 PM EDT
To: "E-mail Pamphleteer Dave Farber's Interesting People list" <ip () listbox com>
Subject: This Is So Much Bigger Than Facebook: Data misuse is a feature, not a bug—and it’s plaguing our entire 
culture.

This Is So Much Bigger Than Facebook
Data misuse is a feature, not a bug—and it’s plaguing our entire culture.
By ETHAN ZUCKERMAN
Mar 23 2018
<https://www-theatlantic-com.cdn.ampproject.org/c/s/www.theatlantic.com/amp/article/556310/>

After five days of silence, Mark Zuckerberg finally acknowledged the massive data compromise that allowed Cambridge 
Analytica to obtain extensive psychographic information about 50 million Facebook users. His statement, which 
acknowledged that Facebook had made mistakes in responding to the situation, wasn’t much of an apology—Zuckerberg and 
Facebook have repeatedly demonstrated that they have a hard time saying they’re sorry.

For me, Zuckerberg’s statement fell short in a very specific way: He’s treating the Cambridge Analytica breach as a 
bad-actor problem when it’s actually a known bug.

In the 17-month-long conversation Americans have been having about social media’s effects on democracy, two distinct 
sets of problems have emerged. The ones getting the most attention are bad-actor problems—where someone breaks the 
rules and manipulates a social-media system for their own nefarious ends. Macedonian teenagers create sensational and 
false content to profit from online ad sales. Disinformation experts plan rallies and counterrallies, calling 
Americans into the streets to scream at each other. Botnets amplify posts and hashtags, building the appearance of 
momentum behind online campaigns like #releasethememo. Such problems are the charismatic megafauna of social-media 
dysfunction. They’re fascinating to watch and fun to study—who wouldn’t be intrigued by the team of Russians in St. 
Petersburg who pretended to be Black Lives Matter activists and anti-Clinton fanatics in order to add chaos to the 
presidential election in the United States? Charismatic megafauna may be the things that attract all the 
attention—when really there are smaller organisms, some invisible to the naked eye, that can dramatically shift the 
health of an entire ecosystem.

Known bugs are the set of problems with social media that aren’t the result of Russian agents, enterprising 
Macedonians, or even Steve Bannon, but seem to simply come with the territory of building a social network. People 
are mean online, and bullying, harassment, and mob behavior make online spaces unusable for many people. People tend 
to get stuck in cocoons of unchallenging, ideologically compatible information online, whether these are “filter 
bubbles” created by algorithms, or simply echo chambers built through homophily and people’s friendships with “birds 
of a feather.” Conspiracy theories thrive online, and searching for information can quickly lead to extreme and 
disturbing content.

The Cambridge Analytica breach is a known bug in two senses. Aleksandr Kogan, the Cambridge University researcher who 
built a quiz to collect data on tens of millions of people, didn’t break into Facebook’s servers and steal data. He 
used the Facebook Graph API, which until April 2015 allowed people to build apps that harvested data both from people 
who chose to use the app, and from their Facebook friends. As the media scholar Jonathan Albright put it, “The 
ability to obtain unusually rich info about users’ friends—is due to the design and functionality of Facebook’s Graph 
API. Importantly, the vast majority of problems that have arisen as a result of this integration were meant to be 
‘features, not bugs.’”
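
[To make the mechanics concrete: below is a rough, hypothetical Python sketch of what a pre-April-2015 Graph API v1.0 
"friends data" request could have looked like from an app developer's side. The endpoint paths, field names, the 
friends_* permission, and the harvest() helper are assumptions for illustration only; this is not Kogan's actual 
code, and none of it works against today's API.]

# Illustrative sketch only: approximates how a pre-2015 app, after a user's
# login dialog granted friends_* permissions, could pull data about the app
# user *and* basic fields about their friends. Details are hypothetical.
import requests

GRAPH = "https://graph.facebook.com/v1.0"

def harvest(user_access_token: str) -> dict:
    """Collect the consenting user's profile plus basic data on their friends."""
    # Data the user explicitly granted when installing the quiz app.
    me = requests.get(
        f"{GRAPH}/me",
        params={"fields": "id,name,likes", "access_token": user_access_token},
    ).json()

    # The part the article describes: the same token could also expose fields
    # about friends who never installed or consented to the app.
    friends = requests.get(
        f"{GRAPH}/me/friends",
        params={"fields": "id,name,likes", "access_token": user_access_token},
    ).json()

    return {"user": me, "friends": friends.get("data", [])}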

In his non-apology, Zuckerberg claimed Facebook had already taken the most “important steps a few years ago in 2014 
to prevent bad actors from accessing people’s information.” But changing the API Kogan used to collect this data is 
only a small part of a much bigger story.

To be clear, I believe Kogan acted unethically in allegedly collecting this data in the first place, and that giving 
this data to Cambridge Analytica was an unforgivable breach of research ethics. But Kogan was able to do this because 
Facebook made it possible, not just for him, but for anyone building apps using the Graph API. When Kogan claims he’s 
being made a scapegoat by both Cambridge Analytica and Facebook, he has a strong case: Selling data to Cambridge 
Analytica is wrong, sure, but Facebook knew that people like Kogan could access the data of millions of users. That’s 
precisely the functionality Facebook advertised to app developers.

Speaking with Laurie Segall on CNN this week, Zuckerberg emphasized that Facebook would investigate other app makers 
to see if anyone else was selling psychographic data they’ve collected through the Graph API. But Zuck didn’t mention 
that Facebook’s business model is based on collecting this demographic and psychographic information and selling the 
ability to target ads to people using this data about them.

[snip]

Dewayne-Net RSS Feed: http://dewaynenet.wordpress.com/feed/
Twitter: https://twitter.com/wa8dzp

-- 
Geoff.Goodfellow () iconia com
living as The Truth is True
http://geoff.livejournal.com  




