Interesting People mailing list archives
Beyond the Rhetoric of Algorithmic Solutionism
From: "Dave Farber" <farber () gmail com>
Date: Thu, 11 Jan 2018 17:00:39 -0500
Begin forwarded message:

From: Dewayne Hendricks <dewayne () warpspeed com>
Subject: [Dewayne-Net] Beyond the Rhetoric of Algorithmic Solutionism
Date: January 11, 2018 at 4:53:51 PM EST
To: Multiple recipients of Dewayne-Net <dewayne-net () warpspeed com>
Reply-To: dewayne-net () warpspeed com

Beyond the Rhetoric of Algorithmic Solutionism
By danah boyd
Jan 11 2018
<https://points.datasociety.net/beyond-the-rhetoric-of-algorithmic-solutionism-8e0f9cdada53>

If you ever hear that implementing algorithmic decision-making tools to enable social services or other high-stakes government decision-making will increase efficiency or reduce the cost to taxpayers, know that you're being lied to. When implemented ethically, these systems cost more. And they should.

Whether we're talking about judicial decision-making (e.g., "risk assessment scoring") or modeling who is at risk for homelessness, algorithmic systems don't simply cost money to implement. They cost money to maintain. They cost money to audit. They cost money to evolve with the domain that they're designed to serve. They cost money to train their users to use the data responsibly. Above all, they make visible the brutal pain points and root causes in existing systems that require an increase of services. Otherwise, all that these systems are doing is helping divert taxpayer money from direct services to lining the pockets of for-profit entities under the illusion of helping people. Worse, they're helping usher in a diversion of liability, because time and time again, those in powerful positions blame the algorithms.

This doesn't mean that these tools can't be used responsibly. They can. And they should. The insights that large-scale data analysis can offer are inspiring. The opportunity to help people by understanding the complex interplay of contextual information is invigorating.
Any social scientist with a heart desperately wants to understand how to relieve inequality and create a more fair and equitable system. So of course there's a desire to jump in and try to make sense of the data out there to make a difference in people's lives. But to treat data analysis as a savior to a broken system is woefully naive. Doing so obfuscates the financial incentives of those who are building these services, the deterministic rhetoric that they use to justify their implementation, the opacity that results from having non-technical actors try to understand technical jiu-jitsu, and the stark reality of how technology is used as a political bludgeoning tool.

Even more frustratingly, what data analysis does well is open up opportunities for experimentation and deeper exploration. But in a zero-sum context, that means that the resources to do something about the information that is learned are siphoned off to the technology. And, worse, because the technology is supposed to save money, there is no budget for using that data to actually help people. Instead, technology becomes a mirage. Not because the technology is inherently bad, but because of how it is deployed and used.

Next week, a new book that shows the true cost of these systems is being published. Virginia Eubanks' book "Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor" is a deeply researched accounting of how algorithmic tools are integrated into services for welfare, homelessness, and child protection. Eubanks goes deep with the people and families who are targets of these systems, telling their stories and experiences in rich detail. Further, drawing on interviews with social services clients and service providers alongside the information provided by technology vendors and government officials, Eubanks offers a clear portrait of just how algorithmic systems actually play out on the ground, despite all of the hope that goes into their implementation.
Eubanks eschews the term "ethnography" because she argues that this book is immersive journalism, not ethnography. Yet, from my perspective as a scholar and a reader, this is the best ethnography I've read in years. "Automating Inequality" does exactly what a good ethnography should do — it offers a compelling account of the cultural logics surrounding a particular dynamic, and invites the reader to truly grok what's at stake through the eyes of a diverse array of relevant people.

Eubanks brings you into the world of technologically mediated social services and helps you see what this really looks like on the ground. She showcases the frustration and anxiety that these implementations produce; the ways in which both social services recipients and taxpayers are screwed by the false promises of these technologies. She makes visible the politics and the stakes, the costs and the hope. Above all, she brings the reader into the stark and troubling reality of what it really means to be poor in America today.

[snip]

Dewayne-Net RSS Feed: http://dewaynenet.wordpress.com/feed/
Twitter: https://twitter.com/wa8dzp