Interesting People mailing list archives

Do Facebook and Google have control of their algorithms anymore? A sobering assessment and a warning


From: "Dave Farber" <farber () gmail com>
Date: Wed, 15 Nov 2017 17:50:14 -0500




Begin forwarded message:

From: Dewayne Hendricks <dewayne () warpspeed com>
Date: November 15, 2017 at 11:00:04 AM EST
To: Multiple recipients of Dewayne-Net <dewayne-net () warpspeed com>
Subject: [Dewayne-Net] Do Facebook and Google have control of their algorithms anymore? A sobering assessment and a warning
Reply-To: dewayne-net () warpspeed com

[Note:  This item comes from friend David Rosenthal.  DLH]

Do Facebook and Google have control of their algorithms anymore? A sobering assessment and a warning 
By MELODY KRAMER
Nov 14 2017
<http://amp.poynter.org/news/conversation-about-machine-learning-and-journalism-maciej-ceglowski-pinboard>

If you had searched Google for information on the gunman immediately after the recent mass shooting in Texas, you 
would have seen what Justin Hendrix, the head of the NYC Media Lab, called a “misinformation gutter.” A spokesperson for 
Google later gave a statement to Gizmodo that placed blame squarely on an algorithm:

"The search results appearing from Twitter, which surface based on our ranking algorithms, are changing second by 
second and represent a dynamic conversation that is going on in near real-time. For the queries in question, they are 
not the first results we show on the page. Instead, they appear after news sources, including our Top Stories 
carousel which we have been constantly updating. We’ll continue to look at ways to improve how we rank tweets that 
appear in search."

In other words, it was an algorithm — not a human making editorial decisions — that was responsible for this gaffe. 
But as Gizmodo’s Tom McKay pointed out, this kind of framing is intentional and used frequently by Twitter and other 
social networks when problems arise.

He writes: “Google, Twitter, and Facebook have all regularly shifted the blame to algorithms when this happens, but 
the issue is that said companies write the algorithms, making them responsible for what they churn out.”

Algorithms can be gamed, algorithms can be trained on biased information, and algorithms can shield platforms from 
blame. Mike Ananny puts it this way:

By continually claiming that it is a technology company — not a media company — Facebook can claim that any perceived 
errors in Trending Topics or News Feed products are the result of algorithms that need tweaking, artificial 
intelligence that needs more training data, or reflections of users. It claims that it is not taking any editorial 
position.

Platforms rely on these algorithms to perform actions at scale, but algorithms at scale also become increasingly 
inscrutable, even to the people who wrote the code. In her recent TED Talk about the complexity of AI, Zeynep Tufekci 
points out that not even the people behind Facebook’s algorithms truly understand them:

We no longer really understand how these complex algorithms work. We don't understand how they're doing this 
categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the 
programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's 
operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's 
like we're not programming anymore, we're growing intelligence that we don't truly understand.

This is problematic for journalism. We cannot write about what we cannot see, but we increasingly write about what we 
think will be surfaced by these algorithms, which generate eyeballs, which then generate clicks, which then generate 
an ever smaller pool of digital ad dollars (the majority of which are now going to Facebook and Google).

And despite our (perhaps) growing unease with these platforms, we still rely on them for distribution. In their 
excellent report on the convergence between publishers and platforms, Emily Bell and Taylor Owen write that “A 
growing number of news organizations see investing in social platforms as the only prospect for a sustainable future, 
whether for traffic or for reach,” echoing what Franklin Foer recently wrote in The Atlantic about The New Republic’s 
increasing dependency on these platforms — and what their algorithms might surface: “Dependence generates desperation 
— a mad, shameless chase to gain clicks through Facebook, a relentless effort to game Google’s algorithms. It leads 
media outlets to sign terrible deals that look like self-preserving necessities: granting Facebook the right to sell 
their advertising, or giving Google permission to publish articles directly on its fast-loading server. In the end, 
such arrangements simply allow Facebook and Google to hold these companies ever tighter.”  

[snip]

Dewayne-Net RSS Feed: http://dewaynenet.wordpress.com/feed/
Twitter: https://twitter.com/wa8dzp




