Interesting People mailing list archives
Re: Facial Recognition Is Accurate, if You're a White Guy
From: "Dave Farber" <dave () farber net>
Date: Mon, 12 Feb 2018 23:54:59 +0000
---------- Forwarded message ---------
From: Jonathan M. Smith <jms () cis upenn edu>
Date: Mon, Feb 12, 2018 at 6:32 PM
Subject: Re: [IP] Facial Recognition Is Accurate, if You're a White Guy
To: Dave Farber <dave () farber net>, <dewayne () warpspeed com>

Dave, Dewayne:

Please see: http://openaccess.thecvf.com/content_cvpr_workshops_2015/W02/papers/Gibson_The_Emperors_New_2015_CVPR_paper.pdf

-JMS

Jonathan M. Smith
Professor of CIS and Pompa Chair
T: 215.898.9509; E: jms () cis upenn edu
On Feb 12, 2018, at 8:03 AM, Dave Farber <dave () farber net> wrote:

This has been understood for a long, long time. djf

---------- Forwarded message ---------
From: Dewayne Hendricks <dewayne () warpspeed com>
Date: Sun, Feb 11, 2018 at 11:35 AM
Subject: [Dewayne-Net] Facial Recognition Is Accurate, if You're a White Guy
To: Multiple recipients of Dewayne-Net <dewayne-net () warpspeed com>

[Note: This item comes from friend Judi Clark. DLH]

Facial Recognition Is Accurate, if You’re a White Guy
By STEVE LOHR
Feb 9 2018
<https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html>
Facial recognition technology is improving by leaps and bounds. Some
commercial software can now tell the gender of a person in a photograph.
When the person in the photo is a white man, the software is right 99
percent of the time.
But the darker the skin, the more errors arise — up to nearly 35 percent
for images of darker-skinned women, according to a new study that breaks fresh ground by measuring how the technology works on people of different races and gender.
These disparate results, calculated by Joy Buolamwini, a researcher at
the M.I.T. Media Lab, show how some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition.
In modern artificial intelligence, data rules. A.I. software is only as
smart as the data used to train it. If there are many more white men than black women in the system, it will be worse at identifying the black women.
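The imbalance mechanism described above can be illustrated with a minimal synthetic sketch (this is an illustration of the general principle, not the study's actual method or data): a single classifier fit to minimize overall error on a training set dominated by one group ends up tuned to that majority group, so its error rate on the underrepresented group is higher, even though both groups are individually equally easy to classify.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Half the samples are labeled 0, half labeled 1; the feature of
    # label-1 samples sits 2.0 units above label-0 samples, and the whole
    # group's feature distribution is offset by `shift`.
    y = np.repeat([0, 1], n // 2)
    x = rng.normal(shift + 2.0 * y, 1.0)
    return x, y

# Majority group: 900 samples. Minority group: 100 samples, with its
# feature distribution shifted by +1 relative to the majority.
xa, ya = make_group(900, shift=0.0)
xb, yb = make_group(100, shift=1.0)
x, y = np.concatenate([xa, xb]), np.concatenate([ya, yb])

# Fit one shared decision threshold by minimizing error over ALL samples.
# Because 90% of the samples come from the majority group, the chosen
# threshold sits near that group's optimum.
candidates = np.linspace(x.min(), x.max(), 500)
errors = [np.mean((x > t).astype(int) != y) for t in candidates]
t_best = candidates[int(np.argmin(errors))]

# Per-group error rates under the single shared threshold.
err_a = np.mean((xa > t_best).astype(int) != ya)
err_b = np.mean((xb > t_best).astype(int) != yb)
print(f"threshold={t_best:.2f}  majority error={err_a:.1%}  minority error={err_b:.1%}")
```

Run as written, the minority group's error rate comes out well above the majority's, even though a threshold fit to the minority group alone would classify it just as accurately. The group sizes, shift, and seed here are all arbitrary choices for illustration.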
One widely used facial-recognition data set was estimated to be more than
75 percent male and more than 80 percent white, according to another research study.
The new study also raises broader questions of fairness and
accountability in artificial intelligence at a time when investment in and adoption of the technology are racing ahead.
Today, facial recognition software is being deployed by companies in
various ways, including to help target product pitches based on social media profile pictures. But companies are also experimenting with face identification and other A.I. technology as an ingredient in automated decisions with higher stakes like hiring and lending.
Researchers at the Georgetown Law School estimated that 117 million
American adults are in face recognition networks used by law enforcement — and that African Americans were most likely to be singled out, because they were disproportionately represented in mug-shot databases.
Facial recognition technology is lightly regulated so far.

“This is the right time to be addressing how these A.I. systems work and
where they fail — to make them socially accountable,” said Suresh Venkatasubramanian, a professor of computer science at the University of Utah.
Until now, there was anecdotal evidence of computer vision miscues, and
occasionally in ways that suggested discrimination. In 2015, for example, Google had to apologize after its image-recognition photo app initially labeled African Americans as “gorillas.”
Sorelle Friedler, a computer scientist at Haverford College and a
reviewing editor on Ms. Buolamwini’s research paper, said experts had long suspected that facial recognition software performed differently on different populations.
“But this is the first work I’m aware of that shows that empirically,”
Ms. Friedler said.
Ms. Buolamwini, a young African-American computer scientist, experienced
the bias of facial recognition firsthand. When she was an undergraduate at the Georgia Institute of Technology, programs would work well on her white friends, she said, but not recognize her face at all. She figured it was a flaw that would surely be fixed before long.
[snip]

Dewayne-Net RSS Feed: http://dewaynenet.wordpress.com/feed/
Twitter: https://twitter.com/wa8dzp
Current thread:
- Facial Recognition Is Accurate, if You're a White Guy Dave Farber (Feb 12)
- Re: Facial Recognition Is Accurate, if You're a White Guy Dave Farber (Feb 12)