Interesting People mailing list archives

Facial Recognition Is Accurate, if You're a White Guy


From: "Dave Farber" <dave () farber net>
Date: Mon, 12 Feb 2018 16:03:16 +0000

This has been understood for a long, long time. djf

---------- Forwarded message ---------
From: Dewayne Hendricks <dewayne () warpspeed com>
Date: Sun, Feb 11, 2018 at 11:35 AM
Subject: [Dewayne-Net] Facial Recognition Is Accurate, if You're a White Guy
To: Multiple recipients of Dewayne-Net <dewayne-net () warpspeed com>


[Note:  This item comes from friend Judi Clark.  DLH]

Facial Recognition Is Accurate, if You’re a White Guy
By STEVE LOHR
Feb 9 2018
<https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html>

Facial recognition technology is improving by leaps and bounds. Some
commercial software can now tell the gender of a person in a photograph.

When the person in the photo is a white man, the software is right 99
percent of the time.

But the darker the skin, the more errors arise — up to nearly 35 percent
for images of darker-skinned women, according to a new study that breaks
fresh ground by measuring how the technology works on people of different
races and genders.

These disparate results, calculated by Joy Buolamwini, a researcher at the
M.I.T. Media Lab, show how some of the biases in the real world can seep
into artificial intelligence, the computer systems that inform facial
recognition.

In modern artificial intelligence, data rules. A.I. software is only as
smart as the data used to train it. If there are many more white men than
black women in the system, it will be worse at identifying the black women.
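The mechanism described above can be sketched in a toy simulation. The snippet below is not the study's method or any real recognizer; it is a minimal, hypothetical one-feature classifier whose decision threshold is chosen to minimize *overall* training error. Because the majority group dominates that total, the learned boundary drifts toward the underrepresented group, which then suffers a much higher error rate. All names, group sizes, and distributions are illustrative assumptions.

```python
import random

random.seed(0)

# Assumed imbalance: many more group-A than group-B training examples.
N_A, N_B = 1000, 50
MEAN_A, MEAN_B, SPREAD = 0.0, 3.0, 1.0  # each group clusters around a mean

def sample(mean, n):
    """Draw n one-dimensional 'face features' for a group."""
    return [random.gauss(mean, SPREAD) for _ in range(n)]

train = [(x, "A") for x in sample(MEAN_A, N_A)] + \
        [(x, "B") for x in sample(MEAN_B, N_B)]

def total_errors(t):
    """Training errors if we predict 'B' whenever the feature >= t."""
    return sum((x >= t) != (g == "B") for x, g in train)

# Pick the threshold that minimizes overall training error.
# The 1000 group-A points outweigh the 50 group-B points, so the
# boundary shifts toward group B's cluster.
threshold = min((x for x, _ in train), key=total_errors)

def error_rate(group, mean, n_test=20000):
    wrong = sum((x >= threshold) != (group == "B")
                for x in sample(mean, n_test))
    return wrong / n_test

err_a = error_rate("A", MEAN_A)
err_b = error_rate("B", MEAN_B)
print(f"group A error rate: {err_a:.3f}")
print(f"group B error rate: {err_b:.3f}")
```

Run as-is, the underrepresented group's error rate comes out many times higher than the majority group's, even though both groups' features are equally "learnable" — the disparity is produced entirely by the composition of the training data.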

One widely used facial-recognition data set was estimated to be more than
75 percent male and more than 80 percent white, according to another
research study.

The new study also raises broader questions of fairness and accountability
in artificial intelligence at a time when investment in and adoption of the
technology is racing ahead.

Today, facial recognition software is being deployed by companies in
various ways, including to help target product pitches based on social
media profile pictures. But companies are also experimenting with face
identification and other A.I. technology as an ingredient in automated
decisions with higher stakes like hiring and lending.

Researchers at the Georgetown Law School estimated that 117 million
American adults are in face recognition networks used by law enforcement —
and that African Americans were most likely to be singled out, because they
were disproportionately represented in mug-shot databases.

Facial recognition technology is lightly regulated so far.

“This is the right time to be addressing how these A.I. systems work and
where they fail — to make them socially accountable,” said Suresh
Venkatasubramanian, a professor of computer science at the University of
Utah.

Until now, there was anecdotal evidence of computer vision miscues, and
occasionally in ways that suggested discrimination. In 2015, for example,
Google had to apologize after its image-recognition photo app initially
labeled African Americans as “gorillas.”

Sorelle Friedler, a computer scientist at Haverford College and a reviewing
editor on Ms. Buolamwini’s research paper, said experts had long suspected
that facial recognition software performed differently on different
populations.

“But this is the first work I’m aware of that shows that empirically,” Ms.
Friedler said.

Ms. Buolamwini, a young African-American computer scientist, experienced
the bias of facial recognition firsthand. When she was an undergraduate at
the Georgia Institute of Technology, programs would work well on her white
friends, she said, but not recognize her face at all. She figured it was a
flaw that would surely be fixed before long.

[snip]

Dewayne-Net RSS Feed: http://dewaynenet.wordpress.com/feed/
Twitter: https://twitter.com/wa8dzp


