A New Communication Paradigm
by Ronda Hauben
"...the systems being built must remain flexible and open-ended throughout the process of development, which is evolutionary."
J.C.R. Licklider and Robert Taylor
The Computer as a Communication Device
"Computers need a language of their own to communicate with each other and with their users."
Proceedings of the IEEE
Special Issue on Packet Communication Networks
"Experience has shown the importance of making the response time short and the conversation free and easy."
J.C.R. Licklider and Robert Taylor
The Computer as a Communication Device
I. Can the human-computer partnership improve communication?
"In a few years, men will be able to communicate more effectively through a machine than face to face," write J.C.R. Licklider and Robert Taylor in their 1968 article, "The Computer as a Communication Device." (1)
In a memo written several years earlier, Licklider raises a related question: How do you state the fundamental problem concerning communication? "At the extreme," he writes, "the problem is essentially the one discussed by science fiction writers: how do you get communication started among totally uncorrelated sapient beings?" (2)
In the same memo, Licklider also poses the question: If you are gathering together different groups of people using different computers and different programming languages, isn't it necessary to find the primary question that has to be asked? All the different computer systems have to either agree to speak the same language, or at least agree to some convention for asking this fundamental question: "What language do you speak?"
Licklider was writing in the 1960s during the earliest days of the efforts to link together computers to facilitate resource sharing. The questions he raises are questions about the fundamental nature of communication. Has the development of computer networking in the past 30 years shed any light on the fundamental nature of communication? This is the question that this paper will endeavor to answer.
II. Different Networks, Diverse Views, Broad-Ranging Discussion
A discussion carried out in several Usenet newsgroups before Thanksgiving of 1998 is a helpful example of the new kind of online discussion that the wide-ranging reach of the Internet, as a network of networks, makes possible. A number of people from the U.S. and Europe participated in the discussion. I hope an examination of this discussion will shed light on two questions: How does the Internet affect human-to-human communication? And how can the communication made possible by the Internet help with particular problems that arise in the continued development of the Internet? (3)
The discussion began on November 18 with a comment that I made in a thread on several newsgroups about "Realizing the promise of computers". I responded to a post by John Adams who had been reading Bell Labs publications from the mid 1960s and was struck by the "failure of current business information systems to realize many of the envisioned goals."
My response supported Adams' statement that we haven't met the goals of the 1960s:
"And we have lost Bell Labs as well."
In response came a post from Dennis Ritchie, co-inventor of Unix, which was created in 1969 at Bell Labs. (4) Ritchie responded:
"Ronda Hauben wrote:
> And we have lost Bell Labs as well.
(Pinches oneself). No, still alive."
A few other commentators supported what Ritchie had said.
Then Arthur T. Murray responded to those who supported Ritchie's view that Bell Labs still existed:
"No, Dave Farber isn't worried; Esther Dyson isn't worried.
The drunks on the barstools are not worried, nor are the
computer complacent who poke fun at Ronda Hauben's minor gaffes.
But a hundred and some nations around the world who are about
to get disenfranchised from the once free Internet must be a
little worried by now, judging from the recent telecommunications
meeting at which they tried to resist the US govt privatization.
Oh, excuse me (Arthur T. Murray/Mentifex), I used a Ronda-ism
in the form of "govt" for government! In her noble fight on
behalf of "liberte' egalite' fraternite'" and all those other
trifles which probably nauseate you and move you to deride her,
Ronda Hauben mangles the English language and lets you have your fun.
But Ronda Hauben is not a gutless, spineless, complacent wimp."
Dennis Ritchie responded that he had just heard Nobel prize winners from Lucent giving talks there. Another poster wrote, "You do know that Dennis Ritchie invented C, don't you? Oh good."
A subsequent post complained that Arthur Murray was flaming and remarked that Ronda Hauben should choose her allies more carefully.
One of the responses was: "I am quite impressed with Mr. Ritchie's accomplishments, but science doesn't accept arguments of authority for good reason."
The person continued: "Everybody makes mistakes sometimes.
Einstein arguably did with the Cosmological Constant, Pauling did with both vitamin C as well as publishing a proposed structure for DNA which met all available X-ray crystallographic requirements but wasn't an acid."
"In my view, Ritchie is being absurdly complacent, even arrogant in proposing that all is well because he is comfortable."
"Nobel prizes are like 'Man of the Year.' They are awarded anyway and the fact that a bunch of Lucent employees may have won them if anything indicates that there isn't as much competition as there should be."
"Lucent doesn't have anywhere near the funding or commitment that its predecessors had in the 1950s and 1960s, and to claim otherwise is absurd. The very fact that it was spun off should serve as evidence of that."
"The fact is that basic, fundamental research in America is in the doldrums, and the ignorant, opportunistic attitudes of most top managers (such as Bill Gates) will keep it there for the foreseeable future unless people bring pressure to change those attitudes."
Several other posts continued the discussion, and Ritchie explained the current situation at Lucent, ending his post, "I won't dispute a general argument that the 'average' research here is somewhat less fundamental than in the past nor that the emphasis has shifted somewhat away from physics and toward software, but the population count and the budget have been remarkably stable." (5)
The discussion moved on to the subject of how "fundamental research in America is in the doldrums."
Another post asked:
"*ahem* However, where the lack of research fits in with the
decision to privatize the Internet naming authority in the US is a
different issue entirely. As I understand it, the issue is
whether or not you can afford to have something as important
and central as that working in commercial conditions."
In response to the question of what to do about the lack of basic research in the U.S., another poster commented "And where should the people's pressure be directed? Toward influencing senior, executive management in private industry or toward espousal of more government funding?"
Continuing the discussion of the value of basic research, a post explained: "Nearly all research funding is now coupled tightly to patents and short term profits, while visionary products without immediate applicability go begging. The inherent value of understanding and human knowledge is less and less appreciated. We have all but forgotten Franklin's reply to a question about the utility of some new invention: 'What good is a newborn baby?'"
The discussion then turned to whether or not Microsoft spent money on basic research. And whether a company could afford to spend money on basic research if they didn't get any gain as a result.
In response, Tom Harrington wrote:
"Let me adapt a quote from Benjamin Franklin that John Adams
quoted elsewhere in this thread: Why do we bother paying for
elementary school? Think about it. There's no payoff for
literally years after the money is spent. And a good chunk
of it is likely wasted on children who will grow up never to
contribute to society anyway. And those who do grow up and
help to improve the world do so in unpredictable ways;
there's no way of knowing what problems will be solved, or by
who, when you're looking at the elementary school level. So,
we could classify elementary school spending as going toward
unpredictable, distant goals, and being spent in some
unknown percentage on children who will never help anyway.
Yet we continue to spend money educating small children.
When you understand why we spend money on elementary
schools, you may begin to understand why spending money on
fundamental research is a good idea."
Another poster replied "Very well said!"
Another responded to the comment that a company didn't benefit from basic research by noting that "Plus, repeated studies have shown an average X35 fold return in 'worthless' research."
Another explained that Microsoft's "research" was "on par with 'buying patents' not to implement, but to prevent implementation."
Another added that "Massive economic development tends to help everyone in cases like that. I would bet the benefits over time to AT&T from the development of the transistor far outweigh the research costs. So what if Intel gets some too."
The thread went on to consider the short term outlook of a business plan, and other connected issues. Another post noted that since AT&T was regulated during the period when the transistor was invented, "it was sort of like doing it with tax dollars." Still another poster had in his signature "Behind every successful organization stands one person who knows the secret of how to keep the managers away from anything truly important."
The people posting were from several countries including Canada, Austria, Britain, the U.S., Norway, and Australia. They included people from different backgrounds and positions, including a government site, university sites, corporate sites, etc. I have referred to this discussion because it shows the broad ranging set of views that the Internet makes possible as all these people can communicate as part of one Internet. And it shows the open forum that Usenet provides for such a discussion.
The discussion, through its broad-ranging set of posts, clarified a fundamental question in the battle over the U.S. government decision to privatize the central functions of the Internet. That question was identified as: "As I understand it, the issue is whether or not you can afford to have something as important and central as that working in commercial conditions." The conclusion of those who were part of the discussion was that commercial conditions are very shortsighted and thus not able to provide for the long-term technological development that benefits a society, in the same way that providing elementary schooling for all its citizens benefits the society. Furthermore, the point was made that when someone understands why elementary schooling for all citizens is an important public policy provision, they will then understand the need to provide for basic research funding.
In this context, the issue of whether one can trust something as important as control over the Internet to something that is functioning under commercial conditions and business plans is answered in the negative.
The interconnection of networks from around the world welcomes diverse viewpoints by removing the constraints on communication. People from many different networks can communicate with each other and contribute. In this discussion, there were people from 6 different countries, and multiple networks within a few of the countries represented. The Internet provides the environment and varied viewpoints that not only help to frame the real question in a problem like the battle over the U.S. privatization of essential functions of the Internet, but which also provides the means to examine the issues so as to determine a conclusion or to come to a decision about what will be in the best interests of the Internet.
How has such an environment been created? What are the elements of the Internet that contributed to making this environment possible?
III. How has the Internet developed?
From the publication of Licklider and Taylor's article in 1968 to the present, there have been significant changes in the nature and potential of packet switching networks.
These changes make possible a new kind of cooperative communication among individuals and groups of individuals. This is a communication process which involves users and their computers. It is also facilitated by the internetwork system that those online are part of. However, this internetwork system is transparent. It is essentially hidden from the view of the user.
Thus it is harder to understand how it helps to make communication possible. However, reviewing the history of the development of the Internet can provide an understanding of how it helps to make an important new form of communication possible.
First, it will be useful to review a bit of the history of how the Internet has developed from the events that followed the publication of the paper by Licklider and Taylor in 1968. Then I will explore how the unique characteristics of the Internet make possible a new kind of communication paradigm. This paradigm, it can be argued, is crucial for solving the modern problems of scaling the Internet and managing its essential functions.
Secondly, understanding this paradigm can help governments with the decisions that need to be made in problems like the management of the central functions of the Internet.
On December 23, 1968, a Cambridge-based contractor, Bolt Beranek and Newman, Inc. (BBN), was notified that it was to be awarded the contract to build the Interface Message Processor (IMP) minicomputer subnetwork that would provide the packet switching backbone for a prototype packet switching network. The public funding for this research was provided through the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense. This packet switching network came to be known as the ARPANET. Those working at BBN as part of the ARPANET systems team included Frank Heart, Severo Ornstein, Robert Kahn, Will Crowther, Dave Walden, Ben Barker, Jim Geisman, Martin Thrope, Truett Thach and Bernie Cosell. They worked to design, plan, build, install, operate and test the IMP subnetwork. Time-sharing host computers at selected university or computer research contractor sites were to connect to it and thereby be able to communicate with each other.
Along with the need for the programmers (many of whom were graduate students) to connect the time-sharing systems at their sites to the IMP subnetwork was the need for the host sites to develop a means of communication among the different hosts. This required that the programmers create a set of conventions that they agreed upon and could add to the operating systems of the computers at their sites. Such a set of conventions for interconnecting is called a communications protocol. With support from Larry Roberts, who directed the ARPANET project at ARPA during this early period, and others working with ARPA or BBN, the programmers created the Network Control Protocol, also known as NCP. They met together at different sites, as the network in these early days did not yet provide the means for the communication to occur online. These programmers from the diverse early sites called themselves the Network Working Group (NWG). The protocol they created was developed through a process of open discussion, where contributions were encouraged from all. As part of this process, they produced written notes documenting their activity. They called these notes Requests for Comment, or RFCs. During the earliest period of the activity of the NWG, RFCs were circulated by mail. However, once it was possible to circulate and even maintain them online, the functioning network itself helped to create an open process for conducting the discussion. One of the earliest RFCs, RFC 3, dated April 1969, explains this open process for discussion and problem solving that grew up with the ARPANET. RFC 3 was by Steve Crocker, who was from the UCLA ARPANET site. Crocker writes (6):
Documentation of the NWG's effort is through notes such as
this. Notes may be produced at any site by anybody and
included in this series....The content of a NWG note may be
any thought, suggestion, etc., related to the HOST software
or other aspect of the network. Notes are encouraged to be
timely rather than polished. Philosophical positions without
examples or other specifics, specific suggestions or
implementations techniques without introductory or
background explication, and explicit questions without any
attempted answers are all acceptable. The minimum length for
a NWG note is one sentence.
These standards (or lack of them) are stated explicitly
for two reasons. First, there is a tendency to view a written
statement as 'ipso facto' authoritative, and we hope to
promote the exchange and discussion of considerably less
than authoritative ideas. Second, there is a natural
hesitancy to publish something unpolished and we hope to
ease this inhibition.
This kind of open set of notes helped to create a process of exploring a problem and welcoming contributions toward solving it.
The ARPANET successfully demonstrated the benefits of using packet switching to transport messages among incompatible computers and incompatible operating systems. In the introduction to the special issue of the "Proceedings of the IEEE" on packet communication networks that he edited, Robert Kahn writes (7):
Packet switching is a particular form of digital
telecommunications that is well suited to the unique nature
of computer-based communications....Computer traffic occurs
sporadically; it is often described as being 'bursty,' of low
duty cycle, since the intervals between short segments of
transmitted data are relatively long. A packet-communication
network designed to be quite efficient in transmitting
bursty traffic, can provide other functions that are
critical for computer communications, such as error-free
delivery, and code and speed conversion to facilitate
communication between otherwise incompatible terminals....In
summary, packet-switched networks are extensions of the very
nature of computers and computing, offering the same precise
effective means of transporting information that computers
offer in the processing of information.
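Kahn's point about "bursty" traffic can be made concrete with a small simulation. The sketch below uses hypothetical numbers (not drawn from any actual network measurements) to compare giving each terminal its own dedicated circuit against letting all of them share one packet-switched line:

```python
import random

random.seed(42)

SOURCES = 50        # hypothetical number of bursty terminals
P_ACTIVE = 0.05     # each source sends a packet in a slot with 5% probability
SLOTS = 10_000      # time slots to simulate
LINE_CAPACITY = 8   # packets the one shared line can carry per slot

carried = dropped = 0
for _ in range(SLOTS):
    # Number of sources that happen to transmit in this slot
    offered = sum(random.random() < P_ACTIVE for _ in range(SOURCES))
    carried += min(offered, LINE_CAPACITY)
    dropped += max(0, offered - LINE_CAPACITY)

print(f"circuit design: {SOURCES} dedicated lines, each idle ~95% of the time")
print(f"packet design:  1 shared line of capacity {LINE_CAPACITY}, "
      f"loss rate {dropped / (carried + dropped):.2%}")
```

Because each source is idle most of the time, a shared line with far less capacity than the sum of the sources carries nearly all of the offered traffic. This statistical multiplexing gain is what makes packet switching "quite efficient in transmitting bursty traffic."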
Kahn, working with Severo Ornstein and others at BBN, wrote the "Initial Design for Interface Message Processors for the ARPA Computer Network" as the proposal that BBN submitted to ARPA.
Kahn also prepared BBN Report 1822, "Specifications for the Interconnection of a Host and an IMP." He participated in some of the Network Working Group meetings, and was on the distribution list of the earliest RFCs. Another important contribution Kahn made to these early developments was the demonstration he and Al Vezza of MIT organized showing the utility of packet switching networks. This was held at the International Computer Communication Conference in October, 1972 in Washington, D. C. Leonard Kleinrock, another of the important networking pioneers, describes that demonstration (8):
"DARPA installed an IMP in a hotel in Washington, D.C. and
ran in some lines. Everybody was encouraged to create some
demonstration packages, and we did as well. That caused lots
of good things to happen in the ARPA network. It
generated lots of new uses of the ARPA network just for that
demo. One of the things that was demonstrated there was a
distributed air traffic control system. The idea was there
would be a bunch of computers in the network that would be
simulating air traffic control operation in their physical
region. For example, MIT would be doing Boston, and some
Washington machines would do Washington, and so on...."
Kleinrock describes other uses of the network that were demonstrated at that event (9):
"I remember one of the demos was really interesting. In this
demo, you could sit down in Washington at a teletype, log on to
a machine at BBN, pull up some source code, ship it over to
a machine at UCLA across the country, compile and execute,
and bring back the results to be printed on the teletype
right next to you in Washington."
He remembers the important impact that the demonstration had (10):
"But the point is it was a great demo. People were pulled out
of the hallway, handed a handbook, and told, 'Sit down,
we'll help you use the ARPANET,' and they could....The main
purpose was to prove networking."
Kahn left BBN in November 1972, just after the successful demonstration, and went to work at DARPA. There he took over the satellite packet network project that Larry Roberts had started and began a packet radio network project. (11)
Kahn wanted to find a way to create a ground based packet radio network and he realized that it would have to have access to resources that would make it of interest to use. He also planned to create a satellite packet network, which would also need to be able to access resources to make it worth using. If these could be connected to the ARPANET, they would be able to access the growing number of interesting resources available on the ARPANET. Thinking through the problems represented by these different kinds of packet networks, and particularly recognizing the differences between the assumptions of the ARPANET packet network and the requirements of a packet radio network, he realized that there was a need for a more general protocol than the one being used on the ARPANET. The new protocol would have to accommodate different kinds of networks rather than accepting the particular assumptions that guided the creation of the ARPANET protocol.
In spring of 1973, Kahn invited one of the members of the NWG, Vint Cerf, to work with him on the creation of a new protocol to make possible the interconnection of networks.
"Around this time," Vint Cerf notes, "Bob started saying, 'Look, my problem is how can I get a computer that's on a satellite net and a computer on a radio net and a computer on the ARPANET to communicate uniformly with each other without realizing what's going on in between?'" (12)
Recognizing that a computer that would serve as a gateway to the diverse networks would solve their problem, Cerf explains: "We knew we couldn't change any of the packet nets themselves....
They did whatever they did because they were optimized for that environment." (13)
"Our thought," he continues, "was that, clearly, each gateway had to know how to talk to each network that it was connected to...Say you're connecting the packet-radio net with the ARPANET. The gateway machine has software in it that makes it look like a host to the ARPANET IMPs. But it also looks like a host on the packet-radio-networks." (14)
Since all the networks had different characteristics, how could messages be transported across them despite these differences?
Other important issues, such as how the reliability of the transmission of messages would be established, also had to be resolved. These issues were likewise being considered by others who were part of the International Network Working Group, which had formed at the ICCC meeting in Washington in October 1972. Work on these related questions was being done by others in the group, like Louis Pouzin in France, who was developing a packet switching network called Cyclades.
By September 1973, Kahn and Cerf had worked out the design for the new protocol that solved the problems they had identified.
They presented it to the International Network Working Group (INWG), which was meeting in Sussex, England. By November 1973, they had written the paper, "A Protocol for Packet Network Intercommunication", and submitted it for publication. The paper was published in the May 1974 issue of the "IEEE Transactions on Communications". (15)
The paper describes the principles for the new protocol and presents its essential aspects. Cerf and Kahn write (16):
"A typical packet switching network includes a transportation
mechanism for delivering data between computers or between
computers and terminals. To make data meaningful, computers
and terminals share a common protocol (i.e., a set of agreed
upon conventions). However these protocols have addressed
only the problem of communication on the same network."
The paper describes the new set of issues that had to be considered when creating a protocol that would make it possible to have communication across diverse packet switching networks.
"Even though many different and complex problems must be solved in the design of an individual packet network," Cerf and Kahn write, "these problems are manifestly compounded when dissimilar networks are interconnected." (17)
"Issues arise," they continue, "which have no direct counterpart in an individual network and which strongly influence the way in which internetwork communication can take place." (18) They introduce the concept "internetwork" and they recognize the difficulties that have to be resolved to accomplish not only connection between two networks, but an internetwork communication that would allow any and all diverse packet switching networks to be connected and to provide the ability to have computers and users communicate with each other.
Their seminal paper outlines a number of ways that packet switching networks may differ, and considers alternative ways to interface the diverse networks. (19)
Most importantly, the authors propose that the preferable solution will be to develop a common protocol that can be used in the different networks that agree to communicate. (20) They minimize the role that will be played by the gateway computer which will provide the interface between two different networks.
Recognizing that they will be providing a means for networks "under different ownership to interconnect," they emphasize that "the interconnection must preserve intact the internal operation of each individual network." (21)
The link that will connect two different networks they call a black box or a gateway. "We give a special name to this interface that performs these functions and call it a gateway," they explain. They assign to the gateway the function of properly routing data.
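The division of labor Cerf and Kahn describe, a common internetwork packet that every host understands, carried by gateways that re-wrap it in each network's local format without touching its contents, can be sketched in miniature. The packet and frame formats below are invented purely for illustration; they are not the actual 1974 TCP design:

```python
# Toy sketch of the gateway idea: each network keeps its own local frame
# layout, while every host agrees on one common internetwork packet that
# the gateway routes on. All names and formats here are hypothetical.

def arpanet_frame(local_dest, payload):
    # ARPANET-style local wrapping (invented format for illustration)
    return {"net": "ARPANET", "imp_dest": local_dest, "payload": payload}

def radio_frame(local_dest, payload):
    # Packet-radio-style local wrapping (also invented)
    return {"net": "PRNET", "station": local_dest, "payload": payload}

def gateway(frame, routing_table):
    """Looks like a host to each network it joins: unwrap the local frame,
    read the internetwork destination, and re-wrap for the next network,
    leaving the inner packet untouched."""
    packet = frame["payload"]                       # common internetwork packet
    next_net, local_dest = routing_table[packet["dst"]]
    wrap = {"ARPANET": arpanet_frame, "PRNET": radio_frame}[next_net]
    return wrap(local_dest, packet)

# A host on the ARPANET sends to a host known only by internetwork address.
packet = {"src": "ucla", "dst": "mobile-van", "data": "hello"}
frame = arpanet_frame(local_dest="gateway-imp", payload=packet)

routes = {"mobile-van": ("PRNET", "station-7")}
out = gateway(frame, routes)
print(out["net"], out["station"])    # the same packet, re-wrapped for the radio net
assert out["payload"] == packet      # inner packet preserved intact
```

The gateway routes only on the internetwork address and passes the inner packet through unchanged, which is how the interconnection can preserve intact the internal operation of each individual network.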
They call their new protocol the Transmission Control Protocol (TCP). With the architecture and protocol design described in their paper, they solve the design and related problems of building an Internet of different packet switching networks.
The significance of their achievement is that they created a means for communication across diverse and different packet switching networks. They thereby increased the number of different computers, different operating systems, and, most importantly, different people who could communicate. Thus they identified the principle which would make possible communication across the boundaries of different packet switching networks: to provide for the autonomy of the networks that joined together.
The important aspect of TCP was to remove constraints to communication among diverse and different networks, and it has succeeded in a fundamental and important way.
Kahn and Cerf, with the help of a number of others, went on to develop implementations of TCP for a packet radio network and a packet satellite network, and to connect them all up with the ARPANET, demonstrating that they worked. (22) The networks were hooked up in 1975. Kahn says that he cannot remember the exact day, nor whether the packet radio network or the packet satellite network was first hooked up to the ARPANET. "It should be like V-Day," he notes, but recalls, "When I was doing this, no one else cared. It wasn't viewed as that big a deal." A demonstration of the three-network Internet occurred on November 2, 1977, connecting a moving van with a packet radio terminal sending packets into the ARPANET's land lines and then via satellite to Norway and to University College London. The packets returned via the Atlantic packet satellite network to West Virginia, then to the ARPANET, and then to a machine at the University of Southern California's Information Sciences Institute (ISI). "The packets took a 150,000-km round trip to go 650 km down the coast from San Francisco to Los Angeles," Cerf recalls. "We didn't lose a bit." (23)
Then on January 1, 1983, there was a cutover on the ARPANET from the earlier protocol, NCP, to TCP/IP. (24) And by Fall 1983, the ARPANET was split into two different networks connected by TCP/IP: MILNET, an operational network for use by the Department of Defense, and the ARPANET, a research and scientifically oriented network functioning in an open environment. This was an operational Internet: networks distinct in their own right, yet interconnected.
I have provided this brief account of the earliest development of the Internet protocol TCP because it is the development and implementation of TCP (now called TCP/IP) which, as Dave Clark, another Internet pioneer, noted, is the glue connecting diverse networks and diverse technologies. I would add that it is also the glue connecting diverse computers, diverse operating systems, diverse programs, and diverse people into a functioning and unprecedented human-computer communications system that spans the globe. (25) This is the Internet which makes possible the diversity of people from a diversity of networks who contribute to the broad-ranging discussion that occurs in Usenet newsgroups and on Internet mailing lists. And it is the principle of recognizing and providing for the autonomy of networks, and subsequently the autonomy of peoples, that makes a new form of communication possible among the people on the Internet. (26)
IV. The Challenge of Internetting
In a mere 30 years we have come a long way from the important discussion of communication and decision-making made possible by human-computer time-sharing systems in the 1968 article by Licklider and Taylor. It has taken an army of people, along with their generals, to design, develop, implement, and then test and spread the concepts and implementation of internetting. This is, in an important way, a process of removing the constraints to communication between diverse and different networks and, therefore, between diverse and different people from around the world. Robert Kahn explains that his view of the net is almost equivalent to the ether for speech: it shouldn't impose any constraints, so that one could do with the Internet, in essence, what one could do with voice. (27) The Internet, however, is far more powerful.
This leads to the question of how to protect and preserve the Internet. The promise of the Internet is that it makes it possible for people to participate in interactive online communities where one can learn fundamental lessons about human-to-human communication in the process of communicating online. More importantly, the Internet can point the way toward solving the problems that are encountered as it continues to evolve and scale.
In their 1968 article, Licklider and Taylor write that they are deliberately putting their emphasis on people and on how people communicate and how they have observed that computer and time-sharing systems make possible human to human communication.
They point out that they are not interested in just passing information from person to person via a computer. They don't consider that communication, but merely the passive transport of data. For them, communication has to do with the creative process by which something new and nontrivial emerges from the exchange of ideas.
"We believe," they write, "that communicators have to do something nontrivial with the information they send and receive." (28)
The importance of their premise is that it provides a yardstick by which to measure whether indeed the human-computer internetworking system that has grown and developed over the past thirty years is making it possible for a more effective form of human communication to occur. (29)
Licklider and Taylor next examine how the process of two people communicating takes place. They propose that people hold different mental models, and that communication actually takes place only when they are willing to let the dialogue lead them to reexamine those models.
My study of online communication, and my experience online and in meeting in person people I first knew online, leads me to propose a different paradigm for identifying whether communication has taken place. My research has included a number of different forms of early online communication. (30) I have also discussed problems and concerns, online and in person, with people I have met online. What I have found is that it is through the freewheeling and rambling discussion that the online medium makes possible that one can more thoughtfully consider diverse views. The Internet helps to remove the constraints to communication, to make it possible to explore what the underlying dispute or agreement is, and then to determine the new view that will resolve the issue in contention.
I am proposing that the broad-ranging discussion made possible by the Internet provides an environment where such considerations can occur. This is possible between two people, as in email. Other formats, such as Usenet newsgroups or Internet mailing lists, can also provide an environment where people with a common interest, from a diverse collection of networks around the world, can participate in a discussion. This helps to generate the variety of viewpoints that one has to consider to analyze a question or problem. In this process the wide-ranging discussion made possible by the Internet is not limited to two communicators, but can include a large and almost unlimited number. (31)
Licklider and Taylor's article not only proposes how two individuals or groups of individuals communicate; it also raises the issue of how communication affects government decision making. They explain that modern governments are often confronted with a large amount of data to study and analyze. They propose that the modeling process they have outlined as the way communication functions is too expensive a task for governments to undertake, and thus government decisions are often made prematurely. They describe how governments often make policy decisions without adequate study of important data (32):
"It is frightening to realize how early and drastically one
does simplify, how prematurely one does conclude, even when
the stakes are high and when the transmission facilities and
information resources are extraordinary."
They then propose that in the future the opposite may also be true. They predict: "But someday governments may not be able not to afford it." (33)
They explain that not only is a communication process a cooperative modeling effort in a mutual environment, there is also an aspect of necessary communication with or about an uncooperative opponent. (34) They write (35):
"As nearly as we can judge from reports of recent
international crises, out of the hundreds of alternatives
that confronted the decision makers at each decision point
or ply in the "game," on the average only a few, and never
more than a few dozen could be considered, and only a few
branches of the game could be explored deeper than two or
three such plies before action had to be taken. Each side
was busy trying to model what the other side might be up to
-- but modeling takes time, and the pressure of events
forces simplifications even when it is dangerous."
Again, the study I have done of the kinds of broad-ranging discussion the Internet makes possible leads me to propose that the problem is not modeling toward decision-making. Rather, it is to find a way to have the sufficiently broad-ranging, and often seemingly irrelevant, discussion that will make possible a broadening of the question being discussed, so that it becomes possible to clearly identify the problem and then to determine the principles for the decision.
Part V - Two Examples of the Old Communication Paradigm and the New Competing
An example of an important government decision that was made based on too limited communication and discussion of the implications will perhaps help to clarify how the old and new paradigms differ.
In the early 1990s, a decision was made by the U.S. government to privatize the NSF backbone of the U.S. section of the Internet. The decision, according to the multiple authors of "A Brief History of the Internet," was made in a series of NSF-initiated conferences at Harvard's Kennedy School of Government on "The Commercialization and Privatization of the Internet" and on the "com-priv" mailing list on the net itself. (36) However, once the question was framed in this way, the decision was already made, and the discussions at the Kennedy School and on the com-priv list were restricted to how to carry out the privatization. Thus they provided no helpful perspective from which to either make or evaluate the decision. The problem with this process is that there was no welcoming of diverse views on the basic problem and no discussion allowed with those who disagreed.
In contrast to the narrowly focused discussion on the com-priv mailing list and at the Kennedy School meetings, the National Telecommunications and Information Administration (NTIA) sponsored an online discussion of the broader issues of how to achieve the public policy goals of universal service, access for all, and similar topics. (37) This online conference, which was accessible via the Internet as a mailing list and as newsgroups available on the Cleveland Freenet and other such community networks, led participants to identify the question of whether the privatization would facilitate access for all to at least email, Usenet newsgroups, and a text-based browser. Out of the debate came the concern that it was incumbent on the U.S. government to settle that issue before carrying out an action that could deter ubiquitous access for a long period of time, or which could cost far more public funding than if government ownership of the NSF backbone were retained as a less expensive way to achieve this important public policy goal.
The NTIA online discussion did not have any obvious effect on the U.S. government decision to carry out the privatization, which was completed by April 30, 1995. However, the result that those at the NTIA online conference predicted, that privatization would put off achieving the goal of universal access for all to the Internet, has come to pass. Three years later only a small percentage of households in the U.S., and mainly high-income households, have access to the Internet, despite the fact that almost 50% of U.S. households have computers. There have been estimates that it will cost billions of dollars to connect certain public sector sections of the population to the Internet, without even considering whether this will further deter the universal service and access for all goals that are crucial public policy objectives. (38) Also, those with access to the Internet are plagued by a slew of unwanted junk mail and junk posts on Usenet newsgroups, a byproduct of the privatization. This presents further obstacles to connecting for purposes of communicating via the Internet, and it was one of the harmful effects of privatization predicted by those at the NTIA online conference. (39)
A similar situation is occurring again in 1998. The U.S. government is claiming that the longer term management of certain essential functions of the Internet is a problem that has to be solved. In a closed process, without the broad-ranging kinds of discussion that the Internet makes possible, the U.S. government decided to create a private corporation to which it would give important and invaluable public assets. These assets effectively give control over the Internet to whoever controls this private corporate entity.
The U.S. Executive Branch has been encountering opposition to its Internet privatization plan from some sectors of the Internet community. It has received very little support for its plan except from a very small sector: some corporate entities in the U.S. and a few other organizations in the U.S. and abroad, including the Internet Society. The U.S. Congress has held hearings on the privatization process. The Chairman of the House Commerce Committee sent a letter to the U.S. Secretary of Commerce and to the policy advisor to the President of the United States asking for a number of documents toward beginning an investigation into the process. (40) In November 1998, the National Telecommunications and Information Administration (NTIA) signed a Memorandum of Understanding with the private sector corporation it had helped create, ICANN (the Internet Corporation for Assigned Names and Numbers), providing for a process to design and test a proposal for the new organization.
However, most of the activities of the new organization are being carried out in secret, and the decisions being made reflect only a very narrow consideration of options. (41)
Though the NTIA has invited public comments on several issues in this process, it has structured the questions and the process for commenting in a way that severely limits the range of discussion. For example, the discussion invited in March of 1998 about the Green Paper plan for carrying out the privatization of key Internet functions limited the focus of the discussion and thus also the range of opinions gathered. (42)
A mailing list called IFWP (International Forum on the White Paper) has been set up to discuss how to carry out the privatization, much like the com-priv mailing list that helped to carry out the privatization of the NSF backbone to the Internet in the early 1990s. This mailing list, like the former one, encourages discussion and support for the privatization, and in this way limits the range of discussion that is needed to determine how to even identify the problems that need to be solved.
The Usenet discussion described in Part II of this paper likewise shows how the broad-ranging kind of discussion that the Internet makes possible can clarify the essential question in a public policy issue.
Part VI - The Challenge for Internetting and the Informational Public Utility
By solving the problem of how to make it possible for dissimilar networks to communicate, the Internet pioneers removed the constraints to communication between diverse networks in a way that is both significant and surprising. This created a new ability to communicate for those who gain online access to the Internet and Usenet. Such a communications advance was accomplished in part by identifying the requirement that had to be met: to not interfere with the autonomy of the different networks, and yet to make it possible for all those who wanted to connect to be part of the Internet. The design of the protocol TCP by Robert Kahn and Vint Cerf, and the work they did to implement it, along with the contributions of many others from around the world, is a very important and stupendous achievement. But the obligation to safeguard the autonomy of the networks that make up the Internet continues. The creation by the U.S. government of ICANN, and its proposed role as a decision maker setting policy for the networks that make up the Internet, is a very serious departure from the fundamental principle that makes the Internet possible.
The cooperative forms that have grown up as part of the development of the Internet, like the RFC process or the Internet Engineering Task Force and its cooperative procedures, make it possible to protect the autonomy of the diverse networks of the Internet.
Therefore, the same kind of cooperative online processes that have evolved to support the autonomy of the participating networks of the Internet are still needed to continue the growth and development of the Internet today. Since the Internet makes a new form of communication possible, this communication can help to clarify problems when they develop. Similarly, the Internet can be helpful in the search for solutions. What is needed for problems like the one the U.S. government has supposedly created ICANN to solve is to create or utilize forms that facilitate communication. However, instead of recognizing that the task is to improve communication between different networks and different people, a structure is being created to block communication and to mandate decisions. Instead of providing for the communication that would make it possible to solve problems, a private corporate structure is being created to constrain communication between the networks and people on the Internet, so as to be able to impose decisions that have been arrived at by unknown individuals through unknown processes, and which are in the interests of a very small set of people.
However, there is a need to determine how to remove the constraints to communication between those administering the essential functions of the Internet and the people from the diverse communities and diverse networks who are part of the Internet community. How this is to be done needs to be studied and determined, but it involves study both of those administering the essential functions of the Internet and of the people and networks that make up the Internet community today. To create a better interface between these two entities, one must identify what the problem is and formulate it in the way that networking pioneers were able to clarify the problem of interconnecting diverse networks to create the Internet. There is also a need to examine whether to create a Usenet newsgroup or newsgroup hierarchy, and to determine how it might help carry out the functions that are needed in assigning names and numbers for the Internet. In general and where possible, decisions should be made at a grassroots level by an open process involving those administering the technical function in discussion with the Internet community. This can only function if there is a structure that is open to all who want to participate from the myriad of networks in the Internet community, and which can hear from others with concerns or problems about the decisions that are to be made or have been made. An online forum on Usenet in which those administering the functions participate would make it possible to bring up any problems, get help clarifying them, and have the discussion analyze the problems that have to be solved. However, for this to function, there must be a way to protect the process from those who are trying to gain commercial advantage from the decisions at the expense of what is in the best interests of the whole Internet community.
Recognizing the social problems that would arise and need to be solved when the network of networks they were planning would be built, farsighted computer pioneers like J.C.R. Licklider and H. Sackman proposed the need to study and give proper attention to public policy issues for the developing computer utility.
At a conference on the Informational Public Utility, held in Chicago in 1970, Harold Sackman explained why the concept of a public utility was an appropriate one for administering the network of networks that they foresaw would develop. He explains (43):
"The concept of public utilities is as old as urban
civilization. Recall the great irrigation works of Egypt and
Mesopotamia, and the renowned highways and aqueducts of the
Roman Empire. The concept of public utilities as we know
them today emerged with the advent of the industrial
revolution and western democracy. Many vital services were
at first privately owned, such as transportation,
communication, water supply, sanitation, power and light.
The pressure of growing urbanization created greater needs
for adequate utilities. Widespread abuses were uncovered
under the protective umbrella of unrestricted monopolies for
private owners, with cutthroat competition among such owners
to obtain exclusive monopolies. At about 1840, a concerted
revolt occurred at local governmental levels against the
prevailing laissez faire doctrine in England and America.
After many fits and starts, the modern concept of the public
utility emerged -- as a public service typically (but not
always) managed by private enterprise under a franchise, and
under explicit public regulation by duly constituted ..."
Sackman goes on to describe the way that sound regulation needs to be developed: through a process of exploring what will be functional by setting up a prototype and examining how it meets the required needs. He proposed having some finite process of exploring whether a regulation would be helpful or would need to be revised to meet the problems encountered during a test case using it. Sackman noted that there wasn't at the time the necessary experience in developing such regulations, but he proposed the process needed to determine them (44):
These considerations converge into a single, fundamental
recommendation -- the need for cooperative, experimental
computer utility prototypes to formulate the problems,
develop the techniques, and gain the experience necessary
for intelligent regulation and growth of this new social ...
Sackman raises the question of whether the public interest had to be considered -- "not merely the interest of the computer industry, nor that of the communications carriers, nor that of governmental agencies, but the interest of all the people?" (45)
"It might be argued," he continues, "that dedication of computer utilities to free and enlightened knowledge in the public domain could lead to a wiser and more enlightened citizenry, and to a higher standard of living for all through the release of latent effective intelligence. It might further be argued that such universalization of information services might lead to greater individual fulfillment in a more humane world." (46)
He proposes the need for creative approaches that encourage the active participation of the citizenry in determining the solutions to the problems that the information utility will create.
What he proposes is similar to the theme of others who spoke at this 1970 computer conference. J.C.R. Licklider gave the keynote at the conference. His talk raised the question of the future impact of the kind of network of networks that would soon be a reality. Licklider predicted: "The computer and information utility of the future may be a 'network of networks.'" (47)
He also explains that this kind of network will make it possible for computers to talk with one another, and for people to talk with computers, and through the computers and networks, for people to talk with each other.
At the conference Sackman warned how it would be disastrous to leave the determination of decisions about the developing network of networks to the concern for commercial objectives.
"If immediate profits," he wrote, "are the supreme end of all social planning because no other serious contenders arise, then the information utility could end up as the most barren wasteland of them all." (48)
Clarifying the nature of the information utility, Licklider explained (49):
"An information utility is certainly a meld of computation
and communication. I think it is made up of three parts
computation and one part communication. The computation
parts are processing, storage, and interaction between man
and machine. The communication part, of course, is
transmission of information. Perhaps we should recognize a
fourth ingredient, the information itself."
Whether we agree or disagree with Licklider's component parts or his categories, or about which is more appropriately considered computation or communication, Licklider's conceptualization of the information utility as the computer communications system of the future, and hence of the Internet today, is quite helpful.
The 1970 AFIPS conference, with its numerous talks about the importance of enlightened government social policy to determine how the information utility would be administered, is an important document. It shows the concern and foresight of computer pioneers like Licklider and Sackman. They stressed how important a responsibility it is for the computer science community and for citizens to work for good social policy to direct the development and administration of a network of networks. They recognized that there was a contest, and that its outcome would lead either to a great leap forward for mankind or to a leap backwards, depending on whether the enlightened government activity that was needed could be achieved.
Licklider, describing how this choice hung in the balance, compared the problem to a switch. He warned (50):
Thus though the crux is a switch, it is not a switch in a
level track. One branch goes down, one up. It's a choice
between data and knowledge. It's either mere access to
information or interaction with information. And for mankind
it implies either an enmeshment of silent gears of the great
electrical machine or mastery of a marvelous new and truly
plastic medium for formulating ideas and for exploring,
expressing, and communicating them.
Today we do indeed have the marvelous new and truly plastic medium for communication that Licklider predicted, and we also have the responsibility of determining the future of this important social and technological treasure. Will we heed the warning of Licklider and others of his generation, who so clearly saw the challenge that the development of the Internet presents to our society? Will the challenge be properly taken up so that we can indeed proudly welcome in the new millennium? Fortunately, the Internet and the new means of communication it makes possible provide us with the ability to meet the challenge.
(1) Science and Technology, April, 1968.
(2) Memorandum MAC-M-23, April 25, 1963, Memorandum for: Members and Affiliates of the Intergalactic Computer Network, from J.C.R. Licklider.
(3) A URL for the discussion is at
(4) This discussion was important in many ways as it involved a number of people from diverse communities in a discussion of the importance of and the need to support basic research.
A second reason this discussion was especially important is that it was freewheeling; people were willing to cooperate in exploring the issues raised.
(5) Ritchie explained earlier in the post: "Rich Rashid was around a couple of weeks ago, and said that there were 300+ employees in MS research
There are roughly 1200 employees in Bell Labs research, split approximately equally between physical and information sciences.
The company is committed to spending 1% of revenues ($7.2B last reported quarter) on the activity...."
(6) See Michael Hauben's "Behind the Net: The Untold Story of the ARPANET and Computer Science" in Netizens: On the History and Impact of Usenet and the Internet for further description of the development of the Network Working Group and an example of an early RFC. The distribution list of RFC 3 was:
1 Bob Kahn, BBN
2 Larry Roberts, ARPA
3 Steve Carr, UCLA
4 Jeff Rulifson, UTAH
5 Ron Stoughton, UCSB
6 Steve Crocker, UCLA
(7) Robert Kahn, "Scanning the Issue: Special Issue on Packet Communication Networks," Proceedings of the IEEE, Vol. 66, No. 11, pg. 1303.
(8) "An Interview with Leonard Kleinrock", Conducted by Judy O'Neill, 3 April 1990. Charles Babbage Institute, The Center for the History of Information Processing, University of Minnesota, Minneapolis.
(11) From an Interview with Robert E. Kahn, conducted by Judy O'Neill, on April 24, 1990, Reston Virginia, Charles Babbage Institute, Center for the History of Information Processing.
"When I got there (DARPA -ed) there was money budgeted for a packet radio program, and I undertook to make it happen. The skids were all greased for that. Part way through the first year of the program it became clear to me that we were going to have to have a plan for getting computer resources on the net. In 1973, mainframe computers were multi-million dollar machines that required air-conditioned computer centers. You weren't going to connect them to a mobile, portable packet radio unit and carry it around."
"So my first question was "How am I going to link this packet radio system to any computational resources of interest?" (Kahn had just succeeded in solving that question with the ARPANET at the ICCC72 show-ed.) Well, my answer was, "Let's link it to the ARPANET." Except that these were two radically different networks in many ways. I mean, all the details were different I don't mean conceptually they were different. They were sort of the same genre Just like, say Chinese and Americans are of the same genre except one speaks Chinese and one speaks English, one lives on one side of the world, one lives on the other side, they go to sleep during your daytime, etc. The details of the two networks were rather different. The ARPANET ran at 50 kilobits per second and the packet radio system ran at 100 or 400 kilobits per second One had thousand bit uncoded packets; the other had two thousand bit packets which could be coded. The ARPANET assumed that once you sent something it was delivered with a hundred percent reliability. The other assumed that much of the time you would never get anything through even though the system was working.
The protocols that were designed for the ARPANET wouldn't work over the packet radio net because when a packet entered the packet radio net, the only thing the ARPANET would have told it was where it came from but not where it was going. So the packet radio net had no further information to know where to route it.
If a packet got lost along the way, the ARPANET hosts would come to a halt. Well, in a radio net you can get interference and so some loss is natural. So we really had to rethink literally the whole issue of host transport protocols. Vint Cerf and I jointly came up with the TCP/IP concept as a new transport mechanism as part of an architecture for internetworking. DARPA then gave a contract to Vint at Stanford to actually implement the TCP/IP concept, along with small efforts at BBN and at University College London. Vint had the lead for developing the specification."
(12) Katie Hafner and Matthew Lyon, "Where Wizards Stay Up Late: The Origins of the Internet", New York, 1996, pg. 223.
(15) Vinton G. Cerf and Robert E. Kahn, "A Protocol for Packet Network Intercommunication", IEEE Transactions on Communications, Vol. COM-22, No. 5, pg. 637-648. The authors called the protocol TCP (Transmission Control Program) in their paper, but later a part of the protocol was split off into a separate protocol called IP, and the combination then became known as TCP/IP.
(16) Ibid., pg. 637.
(19) Ibid., pg. 638.
The authors write:
"It would be extremely convenient if all the differences
between networks could be economically resolved by suitable
interfacing at the network boundaries. For many of the
differences, this objective can be achieved. However, both
economic and technical considerations lead us to prefer that
the interface be as simple and reliable as possible and deal
primarily with passing data between the networks that use
different packet switching strategies."
(20) Ibid. They explain the rationale for their choice:
"We obviously want to allow conversion between packet
switching strategies at the interface, to permit
interconnection of existing and planned networks. However,
the complexity and dissimilarity of the HOST or process
level protocols makes it desirable to avoid having to
transform between them at the interface, even if this
transformation were always possible. Rather compatible HOST
and process level protocols must be developed to achieve
effective internetwork resource sharing."
"The unacceptable alternative is for every HOST or process to
implement every protocol (a potentially unbounded number)
that may be needed to communicate with other networks. We
therefore assume that a common protocol is to be used
between HOST's or processes in different networks and that
the interface between the networks should take as small a
role as possible in the protocol."
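The rationale Cerf and Kahn give here is, at bottom, a scaling argument: having every host implement every other network's protocol grows without bound, while a common protocol requires each network to adapt only once. A minimal sketch of that arithmetic (my own illustration, not from their paper; the function names are hypothetical):

```python
# Illustrative sketch (not from Cerf and Kahn's paper): the scaling
# argument for a common internetwork protocol.

def pairwise_converters(n: int) -> int:
    """Translators needed if every pair of dissimilar networks
    converts between their protocols directly: one per pair."""
    return n * (n - 1) // 2

def common_protocol_adapters(n: int) -> int:
    """Adapters needed if each network instead maps once onto a
    single shared protocol, as TCP proposed: one per network."""
    return n

# The gap widens rapidly as networks join the internetwork.
for n in (3, 10, 100):
    print(n, pairwise_converters(n), common_protocol_adapters(n))
# For 100 networks: 4950 pairwise converters vs. 100 adapters.
```

This is only the combinatorial core of the argument; the quoted passage adds the deeper point that pairwise transformation between host-level protocols may not even be possible, whatever its cost.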
(21) Ibid., p. 638.
(22) Some of these others included Ray Tomlinson at BBN, Peter Kirstein at University College, London, and dozens of graduate students including Daryl Rubin.
(23) John Adam, "Architects of the net of nets," IEEE Spectrum, September 1996, pg. 61. Adam writes: "Minicomputers were used as gateways between networks. Owners did not need to alter their networks, but hooked up to a black box to handle outside connections."
"On November 22, 1977 Vint Cerf with a crew of others demonstrated a triple-network Internet. "Radio repeaters dotted the hills around Menlo Park, so that a moving van with a packet radio terminal could send Internet packets into the Arpanet's land lines and through satellites to Norway and University College, London. The packets then returned through the Atlantic packet satellite network to West Virginia back into the Arpanet where they hopped to Machine C at UCLA's Information Sciences Institute (ISI) in Los Angeles. 'The packets took a 150,000-km round trip to go 650 km down the coast from San Francisco to Los Angeles,' Cerf recalled. 'We didn't lose a bit.'" (pg 61)
(24) The cutover is described in the draft paper (need title) at http://www.ais.org/~ronda/new.papers
(25) In 1996, Adam reported that there were more than 94,000 networks connected as part of the Internet, and the number was growing exponentially.
(26) In the Federal District Court case on the Communications Decency Act, Judge Dalzell issued an opinion in which he noted the autonomy of the users on the Internet and advised the U.S. government of its obligation to protect the autonomy of the common people as well as of the media magnates. The Federal Court decision striking down the CDA for interfering with that autonomy was affirmed by the U.S. Supreme Court.
(27) Paraphrase of a statement by Robert Kahn.
(28) Licklider and Taylor, pg. 21.
(29) See also J. C. R. Licklider, "Communication and Computers," in "Communication, Language, and Meaning: Psychological Perspectives", edited by George A. Miller, New York, 1973, pp. 205-6. Licklider writes:
"The computer has not yet had much effect upon human
communication, but I think that in a few years it will have
a tremendous effect. I believe that people will communicate
through networks of interactive multiaccess computers,
making use of programs similar to those already described
as aids to thinking--variants of those programs designed to
interact simultaneously with two or more users."
(30) My research studies have included one of the earliest online mailing lists on the ARPANET, the MsgGroup mailing list from the 1975-1980 period, early Usenet newsgroups from the 1981-1983 period, and ARPANET mailing lists from the 1981-1983 period as well.
See papers at http://www.ais.org/~ronda/news.papers
(31) Sometimes a discussion on Usenet can include over a hundred different comments, often by three-quarters that number of people, which leads to the kind of broad-ranging perspective needed to consider an issue. An example was a discussion on Usenet, when the U.S. Congress passed the Communications Decency Act, that had more than 100 comments in it. Also, when people who have experience on Usenet meet in person, they often have an easier time than others would in exploring an issue where they differ, as they have grown used to recognizing that differences are a treasure to explore rather than something to become hostile to.
(32) Licklider and Taylor, pg. 24.
(34) Ibid., pg. 24.
(35) Ibid., pg. 25.
(36) "A Brief History of the Internet" by Barry M. Leiner, Vinton Cerf, David D. Clark, Robert E. Kahn, Leonard Kleinrock, Daniel C. Lynch, Jon Postel, Larry G. Roberts, Stephen Wolff.
See also "Imminent Death of the Net Predicted!" by Ronda Hauben, in Netizens: On the History and Impact of Usenet and the Internet by Michael Hauben and Ronda Hauben, IEEE Computer Society Press, Los Alamitos, CA, 1997.
draft version at: http://www.columbia.edu/~hauben/netbook/
(37) See chapter 11 "The NTIA Conference on the Future of the Net: Creating a Prototype for a Democratic Decision-Making Process" and chapter 14 "The Net and the Future of Politics: The Ascendancy of the Commons" in Netizens: On the History and Impact of Usenet and the Internet. The NTIA online conference is available at the NTIA web site: http://www.ntia.gov/
(38) See Netizens: On the History and Impact of Usenet and the Internet, pg. 216. Steve Wolff, who was then head of the NSFNET, is quoted at a 1990 meeting about privatization as saying "it is easier for NSF to simply provide one free backbone to all comers rather than deal with 25 mid-level networks, 500 universities, or perhaps tens or hundreds of thousands of individual researchers." The report, then online, describing the 1990 conference at the Harvard Kennedy School of Government noted that privatization would probably lead to the wiring of only those geographical areas where companies could be confident of high profits, namely large metropolitan areas with a high percentage of Research and Development facilities. This practice of providing access only in areas that companies believe will be highly profitable is known as cream-skimming. Thus the decision to privatize was understood to be contrary to the public policy goal of providing access for all to the Internet.
(39) In "A Brief History of the Internet" the authors note that the NSFNET program cost the U.S. taxpayer at least $200 million from 1986 to 1995, and that during "its 8-1/2 year lifetime, the backbone had grown from 6 nodes with 56 kbps links to 21 nodes with multiple 45 Mbps links. It had seen the Internet grow to over 50,000 networks on all seven continents and outer space, with approximately 29,000 networks in the United States." Similarly large amounts of taxpayer funds were spent on the development of the ARPANET. Thus the goal of access for all to the Internet as a new means of communication is a fitting obligation of government in return for the utilization of taxpayer funds to create the ARPANET.
(40) See the Letter to the Department of Commerce by Congressman Bliley, Chairman of the U.S. House of Representatives Commerce Committee, at http://www.house.gov/commerce Also see the Letter to Congressman Bliley at http://www.columbia.edu/~rh120/other
(41) See the Memorandum of Understanding (MoU) at http://www.ntia.gov
Also see articles written during the battle at http://www.columbia.edu/~rh120/other
(42) See the NTIA web site for the Green Paper discussion: http://www.ntia.gov
(43) H. Sackman, "The Information Utility, Science and Society," in "The Information Utility and Social Choice", Sackman and Norman Nie, editors, AFIPS Press, Montvale, 1970, pg. 157.
(44) Ibid., pg. 158.
(45) Ibid., pg. 159.
(47) J. C. R. Licklider, "Social Prospects of Information Utilities", in "The Information Utility and Social Choice", pg.
(48) Sackman, pg. 144.
(49) Licklider, "Social Prospects of Information Utilities", pg. 6.