The Rise of the Internet Part 1: Exponential Growth








In 1990, John Quarterman, a networking consultant and UNIX expert, published a comprehensive overview of the state of computer networking at the time. In a short section on the future of computing, he predicted the emergence of a single global network for "e-mail, conferencing, file transfer, remote login - just as there is the worldwide telephone network and worldwide mail today." However, he gave no special role to the Internet. He suggested that this worldwide network "is likely to be operated by government communications services" everywhere except the United States, "where it will be operated by the regional divisions of Bell Operating Companies and long distance operators."



The purpose of this article is to explain how the Internet, with its sudden, explosive exponential growth, so thoroughly upended these perfectly natural assumptions.



Passing the baton



The first critical event on the road to the modern Internet came in the early 1980s, when the Defense Communications Agency (DCA) [now DISA] decided to split the ARPANET in two. DCA had taken control of the network in 1975. By then it was clear that there was no reason for ARPA's Information Processing Techniques Office (IPTO), an organization devoted to blue-sky research, to be involved in running a network that was used not for communications research but for everyday communication. ARPA had unsuccessfully tried to offload control of the network onto the private company AT&T. DCA, responsible for military communications systems, seemed like the best second option.



For the first few years of the new arrangement, ARPANET flourished in a state of blissful neglect. By the early 1980s, however, the Department of Defense's aging communications infrastructure was in desperate need of an upgrade. The proposed replacement project, AUTODIN II, which DCA had contracted to Western Union, appeared to be failing. DCA's leadership then put Colonel Heidi Heiden in charge of choosing an alternative. He proposed using the packet-switching technology DCA already had on hand, in the form of the ARPANET, as the basis for a new defense data network.



However, there was an obvious problem with transmitting military data over the ARPANET - the network was crawling with long-haired computer scientists, some of whom actively opposed computer security or secrecy - for example, Richard Stallman and his fellow hackers from the MIT Artificial Intelligence Lab. Heiden proposed splitting the network in two: keeping the ARPA-funded research scientists on the ARPANET, and moving the computers serving the defense establishment onto a new network called MILNET. This mitosis had two important consequences. First, the separation of the military and non-military parts of the network was the first step towards transferring the Internet to civilian, and later to private, management. Second, it was proof of the viability of the Internet's seminal technology, the TCP/IP protocols, first devised about five years earlier. DCA required all ARPANET nodes to migrate from the legacy protocols to TCP/IP by early 1983. Few networks used TCP/IP at the time, but afterwards the two networks of the proto-Internet were connected, allowing message traffic to flow between research and military enterprises as needed. To ensure the longevity of TCP/IP on military networks, Heiden established a $20 million fund to support computer manufacturers who would write software to implement TCP/IP on their systems.



This first step in the gradual transfer of the Internet from military to private control also gives us a good opportunity to say goodbye to ARPA and IPTO. Its funding and influence, under the leadership of Joseph Carl Robnett Licklider, Ivan Sutherland, and Robert Taylor, led directly and indirectly to all the early developments in interactive computing and computer networking. With the creation of the TCP/IP standard in the mid-1970s, however, it played its last key role in the history of computing.



The next major computing project sponsored by DARPA would be the autonomous vehicle competitions of 2004-2005. The most famous project before that would be the billion-dollar Strategic Computing Initiative of the 1980s, an AI effort that spawned several useful military applications but had little or no impact on civil society.



The Vietnam War was a decisive catalyst in the organization's loss of influence. Most academic researchers had believed they were fighting the good fight and defending democracy when taking military funding during the Cold War. But those who came of age in the 1950s and 1960s lost faith in the military and its aims after it became mired in the swamp of the Vietnam War. Among the first was Taylor himself, who left IPTO in 1969, taking his ideas and connections to Xerox PARC. The Democratic-controlled Congress, worried about the corrosive effect of military money on basic scientific research, passed amendments requiring that defense money be spent exclusively on military research. ARPA reflected this change in funding culture in 1972 by renaming itself DARPA, the Defense Advanced Research Projects Agency.



The baton therefore passed to the civilian National Science Foundation (NSF). By 1980, with a budget of $20 million, NSF was responsible for funding about half of the federal computer research programs in the United States. And most of those funds would soon be directed to a new national computer network, NSFNET.



NSFNET



In the early 1980s, Larry Smarr, a physicist at the University of Illinois, visited the Max Planck Institute in Munich, which operated a Cray supercomputer to which European researchers were allowed access. Frustrated by the lack of similar resources for American scientists, he proposed that NSF fund the creation of several supercomputing centers across the country. NSF responded to Smarr and other researchers with similar complaints by creating the Office of Advanced Scientific Computing in 1984, which went on to fund five such centers with a five-year budget of $42 million. They stretched from Cornell University in the Northeast to San Diego in the Southwest. In between, the University of Illinois, where Smarr worked, received its own center, the National Center for Supercomputing Applications (NCSA).



However, the centers' capacity to improve access to computing power was limited. Using their computers would be difficult for anyone not located at one of the five sites, and would have required funding semester- or summer-long research trips. So NSF decided to build a computer network as well. History repeated itself - Taylor had pushed the ARPANET in the late 1960s precisely to give the research community access to powerful computing resources. NSF would provide a backbone linking the key supercomputing centers across the continent, then connect regional networks to give other universities and research laboratories access to those centers. NSF would take advantage of the Internet protocols that Heiden had promoted, handing responsibility for creating the local networks to the local scientific communities.



NSF initially delegated the tasks of building and operating the network to NCSA at the University of Illinois, as the source of the original proposal for a national supercomputing program. NCSA in turn leased the same 56 kbps lines that the ARPANET had used since 1969, and launched the network in 1986. These lines, however, quickly became clogged with traffic (details of this process can be found in David Mills' paper "The NSFNET Backbone Network"). And once again the history of the ARPANET repeated itself - it quickly became obvious that the network's main job was not to give scientists access to computing power, but to carry messages between the people who had access to it. The authors of the ARPANET can be forgiven for not knowing this could happen - but how could the same mistake be repeated almost twenty years later? One possible explanation is that it is much easier to justify a seven-figure grant for access to computing power worth that kind of money than for something as seemingly frivolous as the ability to exchange emails. This is not to say that NSF deliberately misled anyone. But just as the anthropic principle holds that the physical constants of the universe are what they are because otherwise we would not exist to observe them, I would not be writing about a government-funded computer network had there not been similar, somewhat fictitious, justifications for its existence.



Having convinced itself that the network itself was at least as valuable as the supercomputers that justified its existence, NSF turned to outside help to upgrade the network's backbone with T1 lines (1.5 Mbps). The T1 standard had been established by AT&T in the 1960s to carry up to 24 telephone calls, each encoded as a 64 kbps digital stream.
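As a quick sanity check on those figures, here is the arithmetic behind the familiar T1 line rate (a minimal sketch; the 8 kbps framing overhead is a detail the paragraph above does not mention):

```python
# Back-of-the-envelope arithmetic for T1 capacity (illustrative sketch).
CHANNELS = 24          # voice channels multiplexed onto one T1
CHANNEL_RATE = 64_000  # bits per second per channel (8-bit samples at 8 kHz)
FRAMING_RATE = 8_000   # one framing bit per frame, 8,000 frames per second

payload = CHANNELS * CHANNEL_RATE   # 1,536,000 bps of voice/data
total = payload + FRAMING_RATE      # 1,544,000 bps, the familiar "1.5 Mbps"
print(f"T1 payload: {payload:,} bps, line rate: {total:,} bps")
```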



The contract was won by Merit Network, Inc., in partnership with MCI and IBM; it received a $58 million grant from NSF for its first five years of building and maintaining the network. MCI provided the communications infrastructure; IBM provided the computing hardware and software for the routers. Merit, a non-profit that ran the computer network connecting the University of Michigan's campuses, brought experience in operating a scientific computer network and gave the whole partnership a university flavor that made it easier for NSF and the scientists who used NSFNET to accept. Nonetheless, the handover from NCSA to Merit was a clear first step towards privatization.



MERIT originally stood for Michigan Educational Research Information Triad. The state of Michigan contributed $5 million of its own to help its home-grown T1 network develop.







More than a dozen regional networks fed into the Merit backbone, from NYSERNet, a New York research and education network that connected at Cornell University in Ithaca, to CERFNet, a California federated research and education network that connected in San Diego. Each of these regional networks in turn connected to countless local campus networks, as hundreds of Unix machines hummed away in college labs and faculty offices. This federated network of networks became the seed crystal of the modern Internet. The ARPANET had linked only well-funded computer science researchers at elite scientific institutions; by 1990, almost any university student or teacher could get online. Passing packets from node to node - over the local Ethernet, then on into the regional network, then over long distances at the speed of light along the NSFNET backbone - they could exchange emails or carry on casual Usenet conversations with colleagues in other parts of the country.
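A toy sketch of that hop-by-hop journey is below; every name in it is invented for illustration, and real routers of the era forwarded on IP addresses via routing tables rather than symbolic labels:

```python
# Toy model of three-tier forwarding: each node knows only the next hop
# toward the destination, mirroring campus -> regional -> backbone -> campus.
routes = {
    "campus-ethernet": "regional-net",
    "regional-net": "nsfnet-backbone",
    "nsfnet-backbone": "remote-regional",
    "remote-regional": "destination-host",
}

hop = "campus-ethernet"
path = [hop]
while hop != "destination-host":
    hop = routes[hop]       # ask the current node for the next hop
    path.append(hop)
print(" -> ".join(path))
```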



Since far more scientific organizations could now be reached through NSFNET than through ARPANET, DCA decommissioned the legacy network in 1990 and removed the Department of Defense from the development of civilian networks entirely.



Takeoff



Over this entire period, the number of computers connected to NSFNET and its associated networks - all of which we can now call the Internet - roughly doubled every year: 28,000 in December 1987, 56,000 in October 1988, 159,000 in October 1989, and so on. This trend continued into the mid-1990s, after which growth slowed somewhat. Given this trend, one wonders, how could Quarterman have failed to see that the Internet was destined to rule the world? If the recent pandemic has taught us anything, it is that exponential growth is very hard for humans to imagine, since it corresponds to nothing we encounter in everyday life.
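To see how quickly annual doubling compounds, here is a minimal sketch seeded with the December 1987 figure quoted above (a pure extrapolation; as the 1989 figure shows, the real counts sometimes grew even faster):

```python
# Illustrative sketch: project host counts under a "doubles every year" model.
# This is not the real data series, only a demonstration of compounding.
hosts = 28_000  # December 1987
for year in range(1988, 1996):
    hosts *= 2
    print(f"{year}: ~{hosts:,} hosts")
```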



Of course, the name and the concept of the Internet predate NSFNET. The Internet Protocol was invented in 1974, and even before NSFNET there were networks communicating over IP. We have already mentioned ARPANET and MILNET. However, I have been unable to find any mention of "the Internet" - a single, worldwide network of networks - before the advent of the three-tiered NSFNET.



The number of networks within the Internet grew at a similar rate, from 170 in July 1988 to 3,500 in the fall of 1991. Since the scientific community knows no borders, many of these were located abroad, starting with connections to France and Canada established in 1988. By 1995, the Internet could be reached from nearly 100 countries, from Algeria to Vietnam. And while the number of machines and networks is much easier to count than the number of actual users, by reasonable estimates there were 10-20 million of the latter by the end of 1994. In the absence of detailed data on who used the Internet, why, and when, it is rather difficult to establish this or any other historical explanation for such incredible growth. A small collection of stories and anecdotes can hardly explain how 350,000 computers joined the Internet between January 1991 and January 1992, then 600,000 the next year, and another 1.1 million the year after that.



Nevertheless, I will venture onto this epistemically shaky ground and claim that the three overlapping waves of users responsible for the Internet's explosive growth, each with its own reasons for connecting, were all driven by the inexorable logic of Metcalfe's Law, which says that the value (and therefore the attractive force) of a network grows as the square of the number of its participants.
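Metcalfe's reasoning is easy to make concrete: n participants can form n(n-1)/2 distinct pairwise connections, a quantity that grows roughly as n squared. A minimal sketch:

```python
# Metcalfe's Law, illustrated: potential pairwise connections among n users.
def potential_links(n: int) -> int:
    return n * (n - 1) // 2  # each unordered pair of users is one link

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} users -> {potential_links(n):>12,} possible connections")
```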



The scientists came first. NSF deliberately spread the network to as many universities as possible. After that, every scientist wanted to join, because everyone else was already there. If emails could not reach you, if you could not see or take part in the latest discussions on Usenet, you risked missing an important conference announcement, a chance to find a mentor, or cutting-edge research before it was published, and so on. Feeling the pressure to join the scholarly conversations happening online, universities rushed to connect to the regional networks that could link them to the NSFNET backbone. NEARNET, for example, which covered the six states of the New England region, had more than 200 members by the early 1990s.



At the same time, access began to trickle down from faculty and graduate students to the much larger student body. By 1993, roughly 70% of Harvard freshmen had an email address. By then the Internet had physically reached every corner of Harvard and its associated institutions. The university went to considerable expense to bring Ethernet not just to every building of the school, but to every student dormitory. Surely it was not long before some student became the first to stumble into his room after a wild night out, slump into a chair, and laboriously tap out an email message he would regret sending the next morning - whether a declaration of love or a furious rebuke to an enemy.



The next wave, around 1990, brought commercial users. That year, 1,151 .com domains were registered. The first commercial participants were the research departments of technology companies (Bell Labs, Xerox, IBM, and so on). They were essentially using the network for scientific purposes; business communication among their executives went over other networks. By 1994, however, there were more than 60,000 .com domain names, and making money on the Internet had begun in earnest.



By the late 1980s, computers were becoming part of the daily work and home life of US citizens, and the importance of a digital presence for any serious business was becoming evident. Email offered a way to exchange messages with colleagues, customers, and suppliers easily and extremely quickly. Mailing lists and Usenet offered both new ways to keep abreast of developments in the professional community and new forms of very cheap advertising to a wide range of users. A huge variety of free databases - legal, medical, financial, and political - could be accessed over the Internet. Yesterday's students, now employed and living in connected dormitories no longer, had fallen in love with the Internet as much as their employers had. It offered access to a far larger pool of users than any individual commercial service (Metcalfe's Law again). After paying for a month of Internet access, almost everything else came free, in contrast to the hefty hourly or per-message fees that CompuServe and similar services charged. Early entrants to the Internet market included mail-order companies such as The Corner Store of Litchfield, Connecticut, which advertised in Usenet groups, and The Online Bookstore, an e-book store founded by a former editor at Little, Brown and Company, more than a decade ahead of the Kindle.



And then came the third growth wave, bringing ordinary consumers, who began going online in large numbers in the mid-1990s. By this time Metcalfe's Law was in top gear. Increasingly, "being online" meant "being on the Internet." Consumers could not afford to run T1-class leased lines to their homes, so they almost always connected via a dial-up modem. We have already seen part of this story, when commercial BBSs gradually evolved into Internet providers. This change benefited both the users (whose digital pool had suddenly grown into an ocean) and the BBSs themselves, which moved into the much simpler business of mediating between the telephone system and the T1 Internet backbone, without having to maintain their own services.



The larger online services evolved along the same lines. By 1993, all of the nationwide services in the United States - Prodigy, CompuServe, GEnie, and the fledgling America Online (AOL) - offered their combined 3.5 million users the ability to send email to Internet addresses. Only the trailing Delphi (with 100,000 subscribers) offered full Internet access. Over the next few years, however, the value of Internet access, which kept growing at an exponential rate, quickly outweighed access to the commercial services' own forums, games, shops, and other content. 1996 was the watershed year - by October, 73% of users going online used the WWW, up from 21% the year before. The new term "portal" was coined to describe the vestigial remains of the services offered by AOL, Prodigy, and the rest of the companies people paid money to just to get onto the Internet.



Secret ingredient



So we now have a rough picture of how the Internet grew at such an explosive rate, but not quite of why it happened. Why did it become so dominant when so many other services were vying to grow into the same role in the preceding era of fragmentation?



Government subsidies played a role, of course. Beyond funding the backbone, when NSF decided to invest seriously in network development independently of its supercomputing program, it did not deal in trifles. The conceptual leaders of the NSFNET program, Steve Wolff and Jane Caviness, decided to build not just a network of supercomputers but a new information infrastructure for American colleges and universities. So they established the Connections program, which covered part of the cost of connecting universities to the network in exchange for their giving as many people as possible access to the network on their campuses. This accelerated the spread of the Internet both directly and indirectly - indirectly because many of the regional networks spawned commercial enterprises that used the same subsidized infrastructure to sell Internet access to commercial organizations.



But Minitel had subsidies too. What distinguished the Internet most of all was its multilayered, decentralized structure and its inherent flexibility. IP allowed networks with vastly different physical properties to work with the same addressing system, and TCP ensured that packets reached their destination. And that was all. The simplicity of the network's basic scheme made it possible to build almost any application on top of it. Importantly, any user could contribute new functionality if they could convince others to use their program. For example, FTP file transfer was one of the most popular uses of the Internet in its early years, but servers offering files of interest could only be found by word of mouth. So enterprising users built various protocols for cataloging and maintaining lists of FTP servers - Gopher, Archie, and Veronica, for example.
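That division of labor is easy to illustrate. In the minimal sketch below, an application invents its own one-line protocol and simply hands bytes to TCP, which in turn rides on IP; the host, port, and request format are hypothetical placeholders, not any real service:

```python
# Minimal sketch of the layering described above: the application defines
# its own protocol and delegates everything else to the stack.
import socket

def fetch_line(host: str, port: int, request: str) -> str:
    # TCP provides ordered, reliable delivery; IP handles addressing/routing.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(request.encode("ascii") + b"\r\n")
        return sock.recv(4096).decode("ascii")

# e.g. fetch_line("example.org", 7000, "LIST")  # hypothetical catalog service
```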



In theory, the OSI network model had the same flexibility, along with the official blessing of international organizations and telecommunications giants as the standard for internetworking. In practice, however, the field was left to TCP/IP, whose decisive advantage was working code, running first on thousands and then on millions of machines.



Pushing control of the application layer to the very edges of the network had other important consequences. It meant that large organizations, accustomed to managing their own domains, could feel comfortable. Organizations could set up their own email servers and send and receive mail without all of its contents being stored on someone else's computer. They could register their own domain names and put up their own websites, accessible to everyone online yet entirely under their own control.



Naturally, the most prominent example of multilayered structure and decentralization is the World Wide Web. For two decades, systems from the time-sharing computers of the 1960s to services like CompuServe and Minitel had revolved around a small set of basic communication services - email, forums, and chat rooms. The web was something fundamentally new. Its early days, when it consisted entirely of unique, hand-crafted pages, bore no resemblance to its present state. But jumping from link to link had a strange appeal even then, and it gave businesses a way to provide extremely cheap advertising and customer support. None of the Internet's architects had planned for the web. It was the fruit of the work of Tim Berners-Lee, a British engineer at the European Organization for Nuclear Research (CERN), who created it in 1990 to conveniently share information among the laboratory's researchers. Yet it lived easily on top of TCP/IP, and for its ubiquitous URLs it reused a domain name system that had been created for other purposes. Anyone with Internet access could put up a website, and by the mid-90s it seemed everyone had - city halls, local newspapers, small businesses, and hobbyists of every stripe.
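That reuse is still visible in any URL today: the host part is an ordinary DNS name. A minimal sketch (using the standard example.org placeholder domain; the lookup requires network access):

```python
# A URL's host component is resolved through the pre-existing DNS,
# exactly the reuse of infrastructure described above.
from urllib.parse import urlparse
import socket

url = urlparse("http://example.org/index.html")
print(url.scheme, url.hostname, url.path)   # http example.org /index.html
print(socket.gethostbyname(url.hostname))   # DNS maps the host to an IP address
```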



Privatization



In this story of the Internet's rise I have omitted several important events, and you may be left with some questions. For example, how exactly did businesses and consumers gain access to the Internet, which was originally centered around NSFNET, a network funded by the US government and supposedly intended to serve the research community? To answer this, the next article will return to some important events that I have so far left unmentioned - events that gradually but inevitably turned the state-run scientific Internet into a private and commercial one.



Further reading



  • Janet Abbate, Inventing the Internet (1999)
  • Karen D. Frazer, "NSFNET: A Partnership for High-Speed Networking, Final Report" (1996)
  • John S. Quarterman, The Matrix (1990)
  • Peter H. Salus, Casting the Net (1995)


