Interesting People mailing list archives

The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel


From: "Dave Farber" <dave () farber net>
Date: Mon, 27 Feb 2017 07:41:55 +0000

---------- Forwarded message ---------
From: Dewayne Hendricks <dewayne () warpspeed com>
Date: Mon, Feb 27, 2017 at 2:03 AM
Subject: [Dewayne-Net] The rise of artificial intelligence is creating new
variety in the chip market, and trouble for Intel
To: Multiple recipients of Dewayne-Net <dewayne-net () warpspeed com>


The rise of artificial intelligence is creating new variety in the chip
market, and trouble for Intel
The success of Nvidia and its new computing chip signals rapid change in IT
architecture
Feb 25 2017
<http://www.economist.com/news/business/21717430-success-nvidia-and-its-new-computing-chip-signals-rapid-change-it-architecture>

“WE ALMOST went out of business several times.” Usually founders don’t talk
about their company’s near-death experiences. But Jen-Hsun Huang, the boss
of Nvidia, has no reason to be coy. His firm, which develops
microprocessors and related software, is on a winning streak. In the past
quarter its revenues increased by 55%, reaching $2.2bn, and in the past 12
months its share price has almost quadrupled.

A big part of Nvidia’s success stems from fast-growing demand for its
chips, called graphics processing units (GPUs), which turn personal
computers into fast gaming devices. But the GPUs also have new
destinations: notably data centres, where artificial-intelligence (AI)
programmes gobble up the vast quantities of computing power that these
chips provide.

Soaring sales of these chips are the clearest sign yet of a
secular shift in information technology. The architecture of computing is
fragmenting because of the slowing of Moore’s law, which until recently
guaranteed that the power of computing would double roughly every two
years, and because of the rapid rise of cloud computing and AI. The
implications for the semiconductor industry and for Intel, its dominant
company, are profound.

Things were straightforward when Moore’s law, named after Gordon Moore, a
founder of Intel, was still in full swing. Whether in PCs or in servers
(souped-up computers in data centres), one kind of microprocessor, known as
a “central processing unit” (CPU), could deal with most “workloads”, as
classes of computing tasks are called. Because Intel made the most powerful
CPUs, it came to rule not only the market for PC processors (it has a
market share of about 80%) but the one for servers, where it has an almost
complete monopoly. In 2016 it had revenues of nearly $60bn.

This unipolar world is starting to crumble. Processors are no longer
improving quickly enough to be able to handle, for instance, machine
learning and other AI applications, which require huge amounts of data and
hence consume more number-crunching power than entire data centres did just
a few years ago. Intel’s customers, such as Google and Microsoft, together
with other operators of big data centres, are opting for more and more
specialised processors from other companies and are designing their own to
boot.

Nvidia’s GPUs are one example. They were created to carry out the massive,
complex computations required by interactive video games. GPUs have
hundreds or even thousands of specialised “cores” (the “brains” of a
processor), all working
in parallel, whereas CPUs have only a few powerful ones that tackle
computing tasks sequentially. Nvidia’s latest processors boast 3,584 cores;
Intel’s server CPUs have a maximum of 28.
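
To make the contrast concrete, here is a minimal sketch in CUDA
(illustrative only, not Nvidia’s code): the CPU version scales an array
one element at a time in a loop, while the GPU version assigns one
lightweight thread to each element, so thousands of elements are
processed at once.

    // Sequential CPU version: one core visits each element in turn.
    void scale_cpu(const float *in, float *out, int n, float factor) {
        for (int i = 0; i < n; ++i)
            out[i] = in[i] * factor;
    }

    // Parallel GPU version: each CUDA thread handles exactly one element,
    // and thousands of threads run at once across the GPU's cores.
    __global__ void scale_gpu(const float *in, float *out, int n, float factor) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's index
        if (i < n)                                      // guard the array bound
            out[i] = in[i] * factor;
    }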

The company’s lucky break came in the midst of one of its near-death
experiences during the 2008-09 global financial crisis. It discovered that
hedge funds and research institutes were using its chips for new purposes,
such as calculating complex investment and climate models. It developed a
programming platform, called CUDA, that helps its customers program its
processors for different tasks. When cloud computing, big data and AI
gathered momentum a few years ago, Nvidia’s chips were just what was needed.
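
What programming the processor for a different task involves, roughly, is
shown in the sketch below (assumptions: the standard CUDA runtime API and
the illustrative scale_gpu kernel from the sketch above; none of this is
taken from the article). The host program copies data to the GPU, launches
the kernel across thousands of threads, and copies the results back; a
risk simulation or a climate model would follow the same pattern with a
different kernel body.

    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    // Illustrative kernel: each GPU thread scales one array element.
    __global__ void scale_gpu(const float *in, float *out, int n, float factor) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = in[i] * factor;
    }

    int main(void) {
        const int n = 1 << 20;                    // one million elements
        const size_t bytes = n * sizeof(float);

        float *h = (float *)malloc(bytes);        // buffer in CPU memory
        for (int i = 0; i < n; ++i) h[i] = (float)i;

        float *d_in, *d_out;                      // buffers in GPU memory
        cudaMalloc(&d_in, bytes);
        cudaMalloc(&d_out, bytes);
        cudaMemcpy(d_in, h, bytes, cudaMemcpyHostToDevice);

        // Cover all n elements with blocks of 256 threads each.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        scale_gpu<<<blocks, threads>>>(d_in, d_out, n, 2.0f);

        cudaMemcpy(h, d_out, bytes, cudaMemcpyDeviceToHost);
        printf("h[3] = %f\n", h[3]);              // expect 6.0, i.e. 3 * 2

        cudaFree(d_in);
        cudaFree(d_out);
        free(h);
        return 0;
    }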

Every online giant uses Nvidia GPUs to give its AI services the
capability to ingest reams of data from material ranging from medical
images to human speech. The firm’s revenues from selling chips to
data-centre operators trebled in the past financial year, to $296m.

And GPUs are only one sort of “accelerator”, as such specialised processors
are known. The range is expanding as cloud-computing firms mix and match
chips to make their operations more efficient and stay ahead of the
competition. “Finding the right tool for the right job” is how Urs Hölzle,
in charge of technical infrastructure at Google, describes balancing the
factors of flexibility, speed and cost.

[snip]

Dewayne-Net RSS Feed: <http://dewaynenet.wordpress.com/feed/>



