Interesting People mailing list archives

IP: New version of HTTP is expected to cut Web delays


From: David Farber <farber () cis upenn edu>
Date: Mon, 17 Feb 1997 04:39:38 -0500

Unfortunately there is nothing said about what the witch doctors have
brewed, but it sounds like they fixed a sloppy design  -- djf


from http://www.nytimes.com/library/cyber/week/021797web.html


February 17, 1997


New Software Expected to Cut Web Delays


By JOHN MARKOFF


SAN FRANCISCO -- The World Wide Wait may be coming to an end. 


The explosive growth of the World Wide Web in the last five years has
created increasing computer traffic jams as the number of users has
continued to outstrip the hardware and data-network resources on which the
Internet is based. 


A World Wide Web Consortium study has shown that redesigning the HTTP
standard would improve basic performance on the Web. 


But now a group of researchers has demonstrated that not all of the
congestion results from the sheer weight of the millions of new users
trying to squeeze onto the Internet. They suggest that a significant part
of the delay has been created by the design of the software underlying the
Web. A study published by the group, based at the World Wide Web Consortium
in Cambridge, Mass., an industry-sponsored group that sets standards, also
shows that a redesign of that software would improve basic performance on
the Web. 


The authors of the report were able to demonstrate data retrieval speeds
two to eight times as fast as those achieved with current World Wide Web
software. 


Individual users are expected to see improvements such as faster download
times, and the collective benefit could be still greater because the basic
set of conventions, or protocol, for Internet operation would be used more
efficiently. 


Browsers that support the new protocol are expected to be available later
this year, though current browsers will continue to work with the new
software. 


"This will be good for the whole Internet," John Klensin, a network
designer at MCI Communications, said. 


The World Wide Web software works in conjunction with the basic software of
the Internet, known as TCP/IP, to permit users to retrieve data without
worrying about where it is on the global Internet. The Internet consists of
a growing collection of software protocols, and the Hypertext Transfer
Protocol -- the "http" at the beginning of many electronic addresses -- has
been the basis of the World Wide Web since 1990. 
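
As a rough illustration (not drawn from the article), an HTTP exchange is
nothing more than lines of text carried over a TCP connection. The Python
sketch below fetches a page in the HTTP/1.0 style the Web has used so far;
the host name is the standard example.org placeholder, not a site named in
the article.

    # Minimal sketch: HTTP is plain text sent over a TCP connection.
    # example.org is a placeholder host, not one from the article.
    import socket

    HOST = "example.org"

    sock = socket.create_connection((HOST, 80))
    request = (
        "GET / HTTP/1.0\r\n"
        f"Host: {HOST}\r\n"
        "\r\n"
    )
    sock.sendall(request.encode("ascii"))

    # Under HTTP/1.0 the server closes the connection after one response,
    # so reading until end-of-stream captures the whole reply.
    response = b""
    while chunk := sock.recv(4096):
        response += chunk
    sock.close()

    print(response.split(b"\r\n", 1)[0].decode())  # e.g. "HTTP/1.1 200 OK"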


"Everyone has known about the problems involving congestion on the
Internet," said Jim Gettys, a Digital Equipment Corporation software
designer who is a visiting scientist at the consortium and is one of the
authors of the study. "What is less well known is that the World Wide Web
protocol has been defeating the congestion control mechanisms in the
Internet's underlying protocols." 
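
To make that point concrete: under the current HTTP style, a browser opens
a fresh TCP connection for each object on a page, so every inline image
restarts TCP's slow-start congestion control from scratch. The hedged
Python sketch below (the host and paths are placeholders, not taken from
the study) contrasts that pattern with the persistent connections of the
new protocol, which let many requests share one connection that has
already ramped up.

    import http.client

    HOST = "example.org"        # placeholder host, not from the article
    PATHS = ["/", "/", "/"]     # stand-ins for a page and its inline images

    # Old style: one TCP connection, and one slow start, per object.
    for path in PATHS:
        conn = http.client.HTTPConnection(HOST)
        conn.request("GET", path)
        conn.getresponse().read()
        conn.close()

    # HTTP/1.1 style: one persistent connection reused for every object.
    conn = http.client.HTTPConnection(HOST)  # http.client speaks HTTP/1.1
    for path in PATHS:
        conn.request("GET", path)
        conn.getresponse().read()            # drain fully before reusing
    conn.close()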


Gettys said that the interaction between Web software and the basic
Internet routing software became an issue as Web use proliferated in
recent years. 


"It didn't matter when HTTP was a tiny fraction of the Internet," he said.
"This problem has only become an issue as HTTP has become a dominant force
in the Internet." 


Although precise measurements are difficult to obtain because there is no
single control point for the entire Internet, most researchers say that
HTTP data now make up the largest category of information flowing through
the Internet. 


Companies like Netscape Communications and Microsoft are readying versions
of their software that are based on the new version of the protocol,
HTTP/1.1. And one of the most common server programs used on the
Internet, the Apache server, has recently added the capability. 
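
The article does not say how Apache exposes the new behavior, but in the
Apache releases of this period (1.2 and later), the persistent connections
at the heart of HTTP/1.1 are governed by a few httpd.conf directives; a
minimal sketch, with illustrative values:

    # Hedged sketch of an httpd.conf fragment; values are illustrative.
    KeepAlive On                # reuse one TCP connection for many requests
    MaxKeepAliveRequests 100    # cap on requests served per connection
    KeepAliveTimeout 15         # seconds an idle connection is held open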


As more computer users begin to convert to Web browsers that support the
new HTTP, they will in many cases see significant decreases in downloading
time as they retrieve information from servers that also have the new
software. 


"You are going to get a lot of improved performance," said John Dawes,
Netscape's group product manager for the servers that are sold to
businesses. The company, based in Mountain View, Calif., is the largest
maker of commercial versions of Internet server programs and browsers. It is
testing a version of its server with the new system, and plans to release
the product early in May. 


The consortium's study shows that the most notable improvements are for
Internet users who have high-speed data network connections. In many cases
downloading times were cut in half, and in some instances the improvement
was as much as eightfold. 


For users with slower telephone dial-up connections, the new version of the
HTTP software will offer smaller direct gains; users will typically see
about a 20 percent improvement. 


The researchers acknowledged that despite the performance improvement with
the new software, congestion is not likely to be eliminated for good,
especially as the Internet continues to be put to new uses. 

