Inet 101 - A Brief History of the Internet Part 1

From EDM2

Written by Marco J. Shmerykowsky

Introduction

This is the first installment of a short series of basic articles on the Internet. It is not meant to be a technical discussion of the Internet's inner workings; it will not cover topics such as the differences between Class A, B, and C addresses. Rather, the articles will provide an overview of how to use basic Internet tools such as e-mail. The target audience is the "new" user who, for example, understands how to use an e-mail client but doesn't realize that mailing 5 MB attachments is a bad idea. If you're not this type of user, then I'm sure you know someone who is. With that said, let's start by taking a brief look at where the Internet came from.

The Internet is one of the key technological developments of the 20th century. It has allowed an unprecedented amount of information sharing. As with most new technologies, businesses have been creating new methods and systems to leverage these developments for strategic advantage. Some of the most useful systems have centered on computerized information networks known as "intranets" and "extranets." These networking concepts trace their roots to the global information infrastructure known as the Internet.

The origins of the Internet can be traced back to the mid-1960s, when a group of scientists working for the United States Defense Department's Advanced Research Projects Agency (ARPA) began to develop a "fault-tolerant" computer network which could be used to share information effectively between geographically dispersed research institutions. The first step in developing this wide area network was to create a method of electronic communication more efficient than "circuit switching." In circuit switching, a single route, or circuit, is established between the source and the destination. The main weakness of this direct point-to-point method was its lack of redundancy: if the circuit broke during data transmission, the information would arrive incomplete and perhaps unusable.

In July 1961, Leonard Kleinrock of the Massachusetts Institute of Technology published a paper which addressed this need for fault tolerance by proposing a "packet switching" theory. According to this theory, information would be transmitted from the source to the destination in a series of discrete "packets." Each packet would contain, at a minimum, a small portion of the data being transmitted, the source address, and the destination address. Because every packet carries its own source and destination addresses, each one can be routed independently along whatever path is available. A useful analogy is the postal system: a letter travelling from New York to Los Angeles can pass through several different cities, and if a certain city is not receiving mail, the letter can be rerouted through an alternate facility en route to its final destination.
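The packet idea above can be sketched in a few lines of Python. This is a toy illustration, not real protocol code: the field names, the packet size, and the helper functions are all invented here for clarity.

```python
import random
from dataclasses import dataclass

@dataclass
class Packet:
    source: str       # address of the sending host
    destination: str  # address of the receiving host
    sequence: int     # position of this fragment in the original message
    payload: str      # small portion of the data being transmitted

def split_into_packets(message: str, source: str,
                       destination: str, size: int = 8) -> list:
    """Break a message into fixed-size packets, each carrying its own addresses."""
    count = (len(message) + size - 1) // size
    return [Packet(source, destination, i, message[i * size:(i + 1) * size])
            for i in range(count)]

def reassemble(packets: list) -> str:
    """Rebuild the message regardless of the order in which packets arrived."""
    return "".join(p.payload for p in sorted(packets, key=lambda p: p.sequence))

packets = split_into_packets("Hello from New York!", "NY", "LA")
random.shuffle(packets)  # packets may take different routes and arrive out of order
print(reassemble(packets))  # -> Hello from New York!
```

Because each packet is self-describing, the network can send them along different paths, and the destination can still put the message back together, which is exactly the redundancy that circuit switching lacked.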

In 1967, Lawrence G. Roberts published a plan for a packet-switched network which would be known as ARPANET, and by the end of 1969 four host computers had been connected together to form a functioning network. This early network marks the first stage in the creation of the modern Internet.

The second stage began around 1972, when Robert Kahn introduced the idea of "open-architecture" networking. This idea was founded on the concepts of reliability, simplicity, and independence. Reliability was addressed by allowing communication to take place on a "best-effort" basis: if a packet failed to reach its proper destination, it would promptly be retransmitted from the source. Simplicity followed from basic engineering logic: a simple device has a better chance of functioning without error because there are fewer things which can go wrong. Finally, independence meant that distinct networks should be able to connect to the larger network without needing "internal" modifications. The idea of "open-architecture" networking led to the development of a common protocol that eventually came to be known as the Transmission Control Protocol / Internet Protocol (TCP/IP). By 1980 the military had adopted this protocol as a "defense standard," and on January 1, 1983 the ARPANET was converted from the original Network Control Protocol (NCP) to TCP/IP.
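The "best-effort" idea can be sketched as a toy simulation. This is a hypothetical illustration only; real TCP uses acknowledgements, timers, and sequence numbers, none of which are modeled here, and both function names are invented.

```python
import random

def unreliable_send(packet: str, loss_rate: float) -> bool:
    """Simulate a best-effort network: delivery sometimes fails."""
    return random.random() >= loss_rate

def send_with_retransmission(packet: str, loss_rate: float,
                             max_tries: int = 10) -> int:
    """Keep retransmitting from the source until the packet gets through.

    Returns the number of attempts that were needed.
    """
    for attempt in range(1, max_tries + 1):
        if unreliable_send(packet, loss_rate):
            return attempt  # delivery succeeded; stop retransmitting
    raise TimeoutError("packet could not be delivered")

# Even over a link that drops half its packets, the message usually gets
# through after a few tries:
attempts = send_with_retransmission("some data", loss_rate=0.5)
```

The point of the sketch is the division of labor: the network itself stays simple (it may silently drop a packet), while the endpoints supply the reliability by retransmitting.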

The third stage of the Internet's development came in the mid-1980s, when the U.S. National Science Foundation (NSF) began to develop a computer network which would serve the general academic and research community through the application of "open-architecture" policies centered on the TCP/IP protocol. This network was officially called NSFNET. As the network expanded, the NSF began to encourage regional providers of network services to expand beyond academic and research institutions to the commercial sector. The intent was to draw private funds into the expansion and support of the networking facilities; as the number of private companies investing in the network increased, the cost of connecting to it would decrease. By April of 1995 the NSF's privatization policy allowed the government to stop funding the NSFNET backbone. At this point the "modern" Internet was born.

The computer network which started as a Defense Department project has grown explosively in the few years since the NSF's privatization policy was completed. Companies such as Netscape and Yahoo!, which didn't exist a few years ago, are now among the leading technology companies traded on the stock market. Corporations such as IBM and Microsoft, which have dominated the computer industry for the past ten years, have been forced to completely realign their product lines.

The changes which have reshaped the computer industry in recent years revolve around one concept: simpler access to information. Just as the development of radio and television allowed the public to be instantly informed about events on the other side of the planet, the Internet allows people to instantly uncover information pertinent to a particular topic. For example, instead of "comparison shopping" by physically visiting a number of stores, a consumer can now perform a price-based search on the Internet from the comfort of home. This translates into significant time savings and convenience for the consumer. The same technological developments also give corporations the ability to leverage their information stores for competitive advantage.

Now that we've completed our history lesson, it's time to look at some of the basic Internet tools that computer users have access to. Next time I'll tackle the basics of e-mail.