Patchwork quilt of protocols
We were a bit mystified when an early email mentioned that we could use ‘the Internet’ to get to some resources. What was the Internet? It was then we found out that the network of computers in the US that we were connected to was being called the Internet.
Tony Dale, support consultant for Canterbury University’s Information and Communications Technology Department
The NZPO failed to tune in to the hum coming out of universities and scientific establishments in the United Kingdom, United States, and New Zealand’s own research organisations and campuses. Exponential growth in telephone use and burgeoning demand for data circuits should have sounded an early warning.
The government, it seemed, had become complacent about maintaining and upgrading the core telecommunications infrastructure.
Telecommunications was no longer the prestigious government portfolio it had once been. It was handled largely by more junior ministers and investment had begun to tail off. By the mid-1980s there was massive congestion on the Post Office network. In Auckland the exchange was verging on collapse and across the country there were frequent network crashes. Limited by government control over what it could invest, the Post Office became increasingly inefficient. There was little incentive to improve. After all, it had a complete monopoly on all the activities it was engaged in, from delivering mail and telegrams to phone, data services, and faxes.
The government began to look at ways to improve the services offered to domestic and business customers. It embarked on a series of reforms to try to place this foundering but essential business on a more commercial footing. Under the State-Owned Enterprises (SOE) legislation the Post Office division that dealt with telecommunications was overhauled in preparation for sale.
While large corporations and government departments were learning about the limitations of the proprietary protocols that linked their growing armoury of mainframe and mini computers, IT staff and computer science experts across the nation’s universities were still trying to figure out how to network their disparate systems. Various clumsy attempts had been made to connect computers, departments, and even university campuses as early as 1975. From 1980 the old Burroughs B6700 mainframes in New Zealand’s universities were gradually replaced with newer but mutually incompatible equipment, including systems from Prime, DEC, and IBM, further frustrating the vision of a university-wide network. When the heads of the computer services or computer science departments got together, talk would inevitably turn to how to maximise the use of existing resources and share information. The boffins knew there must be a better way and, without any government input, continued to innovate and experiment.
Wide area networking (WAN), and any form of national or international connectivity, was still very much a store-and-forward business. A user would send a message, which would be uploaded when the computer next linked to a remote host. When the destination computer next logged in, possibly using an incompatible system, the file would complete its journey. The theoretical network that might one day expand to embrace the world was most often referred to as WorldNet.
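The mechanics of store-and-forward are simple enough to sketch in a few lines of Python; the queue, function names, and sample message below are illustrative assumptions, not any particular system of the period.

    from collections import deque

    # Messages wait in a local queue; nothing moves until the next
    # scheduled connection to the remote host.
    outbound = deque()

    def send(message: str) -> None:
        """Queue a message for the next connection window."""
        outbound.append(message)

    def on_connect(deliver) -> None:
        """When the computer next links to the remote host, flush
        everything queued since the last session."""
        while outbound:
            deliver(outbound.popleft())

    send("hello from Wellington")
    on_connect(print)  # the queued message completes its next hop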
Coloured book confusion
Research-based networks were emerging in the United States and United Kingdom. They were initially built for closed communities of scholars or academics and were largely incompatible because they used different communications methods. For example, AT&T’s wide dissemination of the Unix operating system had given rise to the Usenet news service, based on its built-in UUCP communication protocols. New Zealand universities, aware of growing offshore repositories of valuable research data, often had magnetic tapes of content mailed to them.
In 1981 Ira Fuchs at the City University of New York and Greydon Freeman at Yale University had come up with Bitnet[1], which linked academic mainframe computers. It was a point-to-point store-and-forward network with email messages and files transmitted in their entirety from one server to the next. Bitnet’s network job entry (NJE) protocols were used for the enormous IBM internal network known as VNET, which originally ran at 9.6kbit/sec. The protocols were eventually ported to non-IBM mainframe operating systems and became widely implemented under DEC’s VAX/VMS operating system.[2]
Janet, a UK Government-funded computer network dedicated to education and research, had been developed by British university networks in the 1970s. Hosted on an X.25 research network, it went live in April 1983 with 50 sites accessible at line speeds starting at 9.6kbit/sec.[3] Janet helped standardise the Coloured Book protocols that made up the first complete X.25 standard. The UK-centred approach resulted in host names that were opposite in layout to those determined by researchers on the ARPANET Internet project; for example, uk.ac.universityname.comp rather than comp.universityname.ac.uk. This made it complex and confusing for those who wanted to access both. Many were convinced that Janet, loosely based around the emerging layers of the open systems interconnect (OSI) seven-layer communications model, would lead the way as an international standard.[4]
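The two orderings carried the same labels, simply reversed, so a gateway could translate between them mechanically. A minimal sketch of that translation (the function name and example are illustrative, not Janet’s software):

    # Janet's names ran big-endian (country first); DNS names run
    # little-endian (host first). Converting is just reversing the labels.
    def flip(name: str) -> str:
        return ".".join(reversed(name.split(".")))

    assert flip("uk.ac.universityname.comp") == "comp.universityname.ac.uk"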
In the early 1980s research on the US-based ARPANET was tightly contained; only after refinement in a peer group process did its protocols and utilities begin to be distributed more widely. Word was soon out that the ARPANET experiment had succeeded, and visionaries across the IT, academic, and science and research communities were logging on as fast as they could muster the resources. Soon the fledgling Internet became a victim of its own success, with more computer ‘hosts’ linked than ever envisaged. By 1984 there were 1000 hosts, mostly in the United States, and the volume of traffic was growing exponentially, largely due to email. Some even predicted the whole anarchic system was on the verge of collapse.
In 1984, as the British government formally launched Janet to serve its universities, the US National Science Foundation (NSF), determined to make the Internet available to all ‘qualified students,’ established NSFnet. International networking was clearly headed for the mainstream; the only uncertainty was which of the evolving protocols would be the first to reach critical mass.
TCP/IP the challenger
In the United States there was no question: use of the TCP/IP protocols rapidly became mandatory. TCP/IP had been adopted as a US Defense standard in 1980, enabling technology sharing for research and development and operational purposes. On 1 January 1983 a carefully planned transition took place, shifting the network away from its military origins, with all hosts required to convert simultaneously to the TCP/IP protocol. Lapel buttons declaring: “I survived the TCP/IP transition” appeared spontaneously across the Internet community. The NSF invested major funding in faster computers and backbone networks. Every government agency jumped on board, and gateways were quickly established around the world. The Internet was growing up and out.[5]
US Government agencies shared the cost of ‘common infrastructure’ such as undersea and international connections and gateways. NSFnet now had shared infrastructure agreements with other scientific networks (including ARPANET), promising there would be ‘no metered costs.’ It agreed to support the Internet Activities Board (IAB) in future research, provide the backbone for the US Internet service, and purchase five supercomputers to service the traffic. The first backbone links ran at 56kbit/sec,[6] but access excluded any “purposes not in support of research and education.”
The creation of NSFnet had an immediate and dramatic impact, removing the bottleneck from the original Internet system and encouraging a major upsurge in use. It had taken a decade for the Internet to grow from four computers to 1000. By 1986 there were 5000 connected hosts and a year later that number had skyrocketed to 28,000. The decision to exclude commercial users sparked an upsurge in private investment and development in the US network infrastructure. As businesses realised they lacked the knowledge required to leverage the TCP/IP software bundled with their computers, they began taking on skilled people from academic and research backgrounds.
From 1985 mainstream computer vendors began looking at their own networking products with a view to making them more interoperable. In 1987 UUNET, the first subscription-based commercial Internet company, was founded.[7] While the Internet was gaining rapid ground in the academic and science sectors it remained a mystery to most due to its arcane disciplines, complex code, and complicated interfaces and commands. Finding information in this growing maze of data was not for the uninitiated, although there were a growing number of tools designed to make it a little easier for those who could grasp the concepts. There were few options in the way text-based content could be viewed: a black or green screen and one typeface, Courier. Downloading data was painfully slow. Newsgroups, where like-minded, certainly technically minded people could confer with each other, and file transfer protocol (FTP) downloads quickly became the killer applications.[8]
Because it’s there
When Auckland and Victoria universities moved up to the next level of mainframe computing with IBM 4341 machines from 1980, they began meeting regularly with IBM technical people to discuss how they might start networking. They discovered that RSCS (the Remote Spooling Communications Subsystem, used for running remote printer files) in IBM’s VM operating system could be made to run mail files between machines. Victoria University went so far as to establish its first campus network based on the RSCS protocols. This opened up the possibilities of connecting to the IBM-focused international Bitnet (“because it’s there”) community.
“The reason Bitnet worked was the joining criteria. You could join for the cost of laying the cable to the person you wanted to connect to, but you had to allow at least one other organisation to do the same to you. It wasn’t possible to be a freeloader. That was how it quickly spread across all the North American universities,” explained Mike Newbery, who by this stage had gone from physics graduate to managing the Burroughs terminal ‘network,’ and joining the Victoria University CSD.
Newbery had cobbled together an email package, based on his own observations and the technical specifications in a 1976 request for comments (RFC) posted on ARPANET.[9] Written in the REXX programming language[10] and initially designed to communicate with different terminal users on the old Burroughs mainframe, the software began to be used across the new campus network. While it was easy for Victoria to get into Bitnet because it had the IBM Communications Controller box, it was a little more difficult for Auckland University, which had to tweak code on the terminal multiplexer it had acquired from Yale University to run on its IBM Series 1 mini computers. This meant Auckland could finally run cheap ASCII terminals rather than using expensive IBM 3270 terminals.
The inter-university link operated perfectly for some time until it raised a political question from those outside the university computing fraternity: ‘Why on earth would Auckland computers want to talk to those in Wellington?’ “Engineering and computer science people understood the potential, and people were used to having email from the same computer but not at other sites,” said former Auckland University head of computing, Nevil Brownlee. “We were all aware of the Internet, and with the VAX cluster we ran from the mid-1980s we had mail forwarding with other sites via Pacnet, with all sorts of tricky gateways into other networks. But the Post Office charged heaps of money per minute for a Pacnet connection.”
While he’d helped cobble together the breakthrough technology that enabled the DSIR’s computers to interact with the huge GCS mainframes in the late 1970s, Frank March was to dive in even deeper in the early 1980s. His first encounter with networking had been with a large IBM system at the CCLRC Rutherford Appleton Laboratory (RAL) in Oxfordshire. In 1984 he was back at this laboratory on study leave shortly after Janet was announced. He wanted to see how this new development might apply to the DSIR in New Zealand.
Over a six-month period March made himself an expert on the Coloured Book protocols being used in the newly launched British academic network. On the way home he stopped by the laboratories of BBN Technologies (originally Bolt, Beranek and Newman) in Cambridge, Massachusetts.[11] The independent research and development firm was under contract to the National Bureau of Standards to do compliance testing for both the ISO’s OSI protocols and the TCP/IP stacks. “I asked them somewhat naïvely, ‘And when do you expect TCP/IP and the ISO protocols to converge?’ They fell over themselves laughing and said that it would never happen: TCP/IP was the future and the OSI protocols would never amount to anything.”
March stood his ground. “I didn’t believe them. I thought, they’ve got to be wrong. In fact when I came back to New Zealand there was a huge debate going on about which way things should go, and Coloured Book still looked good. It was far from clear at the time because there were four or five competing international networks.” Coloured Book wasn’t easy to understand and he had some reservations, particularly when he stumbled across TCP, which he understood. It looked very similar to the Australian-developed NodeCode, which the DSIR network ran. It was the internetworking protocol (later IP, or Internet Protocol) that he encountered for the first time at BBN that threw him. “The Brits weren’t interested in TCP, and were determined to ignore Ethernet and stay with the emerging ISO standards.” So he avidly absorbed as much of the TCP documentation as he could.
When March returned to New Zealand in March 1984, he was convinced of only one thing: Ethernet was going to take over from all the rest of the network transport approaches because it was using integrated circuits for its cards and beginning to be mass produced. All the alternatives used discrete components and cost the earth to connect – about $20,000 per station.
In 1980, when government rules relaxed, the DSIR was authorised to buy a DEC VAX 11/780 system for its Gracefield laboratories, which in March 1981 was linked to the DSIR network. A year later another went into the Applied Mathematics Division (AMD). Interactive access was provided at a maximum speed of 9.6kbit/sec. Internal campus wiring gave local users direct access, while remote users could contend for a limited number of physical ports. “In comparison to the IBM and ICL systems, the VAX was far more user friendly and proved to be extremely successful,” said DSIR IT manager Dr Peter Whimp.
According to March few had heard of email, and it was only introduced into the DSIR as an interesting artefact on the side of the VAX operating system. “Only when the second VAX was introduced did connecting the email systems become our priority. At that stage DECnet didn’t exist, TCP/IP didn’t exist and VAXs did not automatically network. They ran terminals connected to each one, but the idea of connecting two VAXs together certainly wasn’t part of the operating system.” Once more Kiwi ingenuity came to the fore: if you can’t buy one, make one. “We wrote our own system within the DSIR network, which allowed the two machines to exchange information and send and receive mail. When the third VAX arrived email went from an unknown service to essential in 12 months.”
While March and everyone else on the team could see the desirability of sending email outside the DSIR, no one was sure how to do it. In fact there was a meeting at Victoria University in 1985, towards the end of his term as DSIR network manager, to discuss computing issues such as setting up a common email system between universities and the DSIR. “I remember somebody said, ‘Why on earth would we want to email other universities?’ To hear that, at a time when John Hine and his group at Victoria University were experimenting with emailing to North America and starting to get all this news down, flabbergasted me,” said March.
Canterbury connections
Around 1982 Rob Harrington, a Canterbury University engineer and a member of the board of the local branch of the Digital Equipment Computer Users’ Society (DECUS), established a dial-up email link with a DECUS counterpart in Sydney.
At the heart of the system was a Pascal memo distribution facility (PMDF) package that could manage mail and figure out where to send it. Using acoustic couplers (early modems), a link was established with the DEC mail hub across the Tasman, which enabled DECUS board members to dial in and exchange messages. “The board was really pushing the parameters in terms of what was available in achieving that link,” recalled Harrington. Canterbury had replaced its old Burroughs mainframe with a Prime system, which talked X.25 but “it was a dog of a machine.” Then the engineering people bought a MicroVAX and converted the campus network to Ethernet.
Things got a little simpler when the Post Office launched Pacnet because this was compatible with the X.25 communications protocol that the DEC VAX computers used. This became the preferred means of exchanging messages between DECUS members locally and in Australia.
Even more entrepreneurial, however, were the efforts of the Computer Science Department under Dick Cooper, who established one of the earliest international links for news and email. In 1985 the Computer Services Centre established a connection from its Prime 750 computer to the Post Office Pacnet X.25 network at 10kbit/sec. Robert Biddle, a PhD student in the CSD, set up a crude email system that required the user to log in to other university computers to send mail. The plan was that everyone would eventually go the Coloured Book way. However Biddle, with help from programmers Ken Lalonde and Alan Bowler, used the connection to log in to the Mathematics Department at the University of Waterloo in Canada, where he had been a student. “I made a crude interface that transferred mail between the Prime and Waterloo and beyond.” [12]
In 1986, the Computer Science Department spent its furniture budget on a DEC VAX 750, installed Berkeley Unix (BSD4.2) and named the system ‘Cantuar.’[13] A program running on the VAX interacted with the Prime 750, so Cantuar could also use Pacnet to log in to offshore UUCP networks. “This was a really hacked connection, requiring special daemons[14] to communicate using a weird protocol, to get around the very weird Prime X.25 interface. But it worked, and we used UUCP to connect to anything else that talked X.25; because of the Prime, however, we could only connect out and not in,” recalled Biddle. The Pacnet (X.25) cost structure had one rate for New Zealand, another for Australia, and another for the rest of the world. So UUCP connections were made to each of these zones, including North America and Europe.
Canterbury ended up with UUCP connections to the University of Waterloo in Canada, Victoria University in Wellington, Melbourne University, and the Mathematical Centre in Amsterdam. The connections were in regular use for some time and further temporary links were made to Christchurch companies, to Lincoln College, and to Hong Kong. “The Hong Kong connection (hkucs) was established because we had a long-term visitor from Jinan University in Guangzhou. At the time Jinan had no outside network connections so it seemed like a neat thing to do. However we had great difficulty using the telephone connections to establish that link so we ended up terminating in Hong Kong instead,”[15] said Biddle.
Tony Dale, support consultant for Canterbury’s Information and Communications Technology Department, said because the Pacnet connection only ran overnight, email and news were only received once a day. “If you wanted a specific file you could do an email request to a server and the file would be sent back via email. It was expensive, however, and after a couple of years on-line we were spending over $1000 a month just to get a few megabytes of traffic. We were a bit mystified when an early email mentioned that we could use ‘the Internet’ to get to some resources. What was the Internet? It was then we found out that the network of computers in the United States that we were connected to was being called the Internet.”
Shortly after the main UUCP links became stable, Biddle recalled Canterbury made an entry in the worldwide UUCP routing maps. “I got a message from a US military base somewhere on the Pacific Rim, from someone who had seen our entry and realised we were close to the US installation at Christchurch Airport, used for the US Military Airlift Command. He said that he was involved in shipping things through there on their way to Exmouth, in Australia, which I found out is a US submarine base. He wanted to know if we would allow people at the Christchurch base to have access to our system, so he could communicate with them about these shipments. He intimated he would ship some things to us in exchange. The scheme didn’t go much further, although I have always wondered what manner of things he might have arranged to send us.”
Victoria University was not only a UUCP site, it was a node on the Australian SUN-III network, a Sydney University initiative with its own homegrown protocols and an X.25 link. VUW used this connection to get news, and in turn Canterbury got its UUCP news from VUW’s Computer Science Department. “It was frequently flaky, because VUW didn’t have X.25 on their machine either, and we had to connect via daemon on a VMS/VAX in the Computer Services Centre. We also got some newsgroups from Waterloo and Melbourne when these weren’t picked up by Victoria, or when the faster speed was important to us,” said Biddle.[16]
During 1985–1988 there was much confusion about email and news. Lots of people wanted connectivity but university computer centres insisted on following the UK Coloured Book or Spearnet (South Pacific Educational and Research Network) models, or approaches encouraged by various computer companies. “At Canterbury, our department felt it couldn’t operate email for the university as a whole; that was the Computer Service Centre’s job. Yet they didn’t see the world the way we did, so it was pretty frustrating. At the same time the Post Office was trying to sell its Starnet service, which originally involved each user having an X.25 connection to a central machine in Sydney. Some government offices got Starnet and regarded it as a luxury, even reprimanding people for using it for non-urgent business. When I explained that it should be ‘cheaper’ than mail, most people just didn’t believe me. Then fax machines came along and made things even more confusing,” said Biddle.[17]
A light bulb moment
After struggling with the old protocols and piecemeal attempts at networking at Victoria University, where he’d lectured in computer science in the late 1970s, John Hine had taken research leave to spend a couple of years at the University of Connecticut. His ‘light bulb moment’ came shortly after arrival, when he sat down at a university computer terminal and fired off an email to a colleague at the University of Illinois at Urbana-Champaign, which was in turn connected to the ARPANET. “I wanted to set up a time for us to get together and a reply came back within a minute. We sent two or three emails back and forward and I thought, ‘if we had this kind of network connection it would change the way New Zealand research works.’”
At the time the standard way to communicate with overseas colleagues was by letter, which took at least a week to arrive. A response would not be expected inside a month. “I could see this reducing from months to minutes.” Despite his enthusiasm it still took two and a half years from his return to New Zealand to convince those in power to allow his department official access to international email.
“When they created my position as head of the Computer Science Department in 1984[18] they had allocated $5000 for equipment, but when I got here I discovered it had all been spent on a PDP-8. You couldn’t really complain because that’s what we used to do the networking,” said Hine. The PDP-8 was the first successful commercial mini computer, produced by DEC. Launched in 1965, it was the first widely sold computer in the DEC PDP series, with advances in technology including I/O (input/output), software development, and operating system design. The earliest PDP-8 model (the so-called ‘Straight-8’) used discrete transistor technology, packaged on ‘flip chip’ cards, and was approximately the size of a compact refrigerator.[19]
With a network now in place across the Victoria University campus in Wellington, and a link with Auckland University finally connected in 1985 using the old 4.8kbit/sec modems from the earlier KiwiNet experiment and a leased line from the Post Office, it was time to expand.
Two meetings were held in 1985, attended by representatives of the New Zealand universities, DSIR, and the Ministry of Agriculture and Fisheries (MAF). The first official gathering, during the Telecommunications Users Association of New Zealand (TUANZ) seminar on 25–26 August, was essentially a working lunch to discuss university networking but resulted in a series of follow-up discussions over the next week. Although the ad hoc group of university computer and communications luminaries had invited government departments to become involved, no interest was shown.
John Hine presented his own research and kept a record of discussions, which centred on the options available for a national university network, the likely costs of doing this via Pacnet or leased line, and the various protocol and standards issues that made the process rather complex. He warned there wasn’t much of a future for either Bitnet or Coloured Book protocols, that international standards for the UK’s Janet based around Coloured Book could still be 10 to 15 years away, and that TCP/IP, specified by NSF for ScienceNet in the United States, was likely to be around for a while.[20]
The second meeting, a one-day event in November 1985, was called to co-ordinate the development of a national research network. The only agreement reached was on a common protocol, the Coloured Book suite used by Janet, a decision heavily influenced by the development of Spearnet in Australia. Spearnet was originally conceived in 1984 but was not officially announced until a seminar in March 1986, after financial backing had been cemented with DEC. New Zealand jumped the gun, and while the DSIR and MAF had attended the original meeting and agreed on the use of the Coloured Book protocols, neither joined Spearnet. The DSIR had its own experimental implementation, increasingly dominated by DEC VAXs. In 1985 an X.25 interface was added to allow DECnet to share the network bandwidth with existing network traffic.[21]
MAF had its own connections to government computing centres, including Fisheries Research and Victoria University. Its private X.25 network connected DEC VAXs using DECnet, Prime mini-computers on PrimeNet protocols, and a Pyramid machine using TCP/IP. Since each system had a different transport-level protocol, the only service available was remote log-in. MAF was gradually adding more users to its internal network, including LANs, and in 1987 had not yet joined the fledgling national universities network. MAF had decided non-academic licences for the software on its Prime computers were too expensive, so when Spearnet began to be adopted during 1986 it became solely a university network.
Hicks advocates Unix
In the early 1980s Roger Hicks was technical manager for Interactive Applications, a small Auckland software house that specialised in accounting systems on early CP/M and MS-DOS computers. He and its technical director were asked to find an operating system that would get the company involved in multi-user systems. Networking was virtually non-existent at the time.
Hicks had heard about Unix and, after discussions, ended up in a meeting with Keith Hopper, senior lecturer in computer languages and standards at Waikato University. Hopper recommended setting up a users group to discuss Unix’s role in business computing. An ad hoc committee was formed and a handful of people began meeting to discuss various computer problems. While many of the international groups such as Usenix were very techie, the resulting NZUSUG, formed in 1985, aligned itself with and gained strong support from the more business-focused UniForum. A conference was planned to determine the wider interest in Unix and its applications in academia, research, and business.
Hicks had been on a business trip to a Unix conference in the United States in 1985. Around 3000 people had been expected but 9000 turned up, and the hotel was bursting at the seams. It was there that he found support from the international UniForum group. It surprised everyone how many people from the business community had turned up at such a technically focused conference. “In fact there was even a formal complaint made to the organisers that some people were wearing suits. I ended up bulldozing my way in and getting introduced to the people at Bell Labs. I was told if I turned up at the door and asked them about sending someone to speak at our forthcoming conference in New Zealand they would look kindly on it. The fact was we had no money, we just said to companies like Bell, ‘Why don’t you send a speaker to New Zealand at an exorbitant cost to yourselves and we’ll show them a good time and waive any charges if they come to the conference.’”
Hicks’ original discussions at Bell Labs were with an Australian, who saw the invitation as an opportunity to head home for a trip. However just before the New Zealand conference started there was a phone call from Bell Labs: “Would it be all right if Ken Thompson comes as well? He doesn’t want to be speaker or part of the programme; he wants to keep a low profile.” Of course Hicks immediately agreed. Thompson was one of the founders of Unix, and behind its further development as a mainstream operating system. “Here was this guy who was used to being mobbed by techies everywhere, sleeping in the students’ accommodation at Waikato University and sitting in the back at the conference. Basically we just sat and drank Lion beer with him afterwards and people left him alone. After a couple of days he said to us, ‘I’ve never been to a Unix conference where I haven’t been swamped by people and haven’t been asked to speak.’ We explained that was the Kiwi style. He said he didn’t want to speak and we were fine with that. Thompson then turned around and said ‘Well, I would like to speak’ so we found a slot in the programme for him,” recalled Hicks.
“Once the significance of this guy, who probably would have earned a Nobel Prize for computing if there had been one, was explained to the academic staff, they were pleased to have him and wanted to meet him. Suddenly the university started to take notice of what we were doing.” Thompson returned at least twice to continue his newfound relationships in New Zealand.
Hicks, the second chairman of the Unix Users Group,[22] which quickly morphed into the local arm of UniForum, had the job of sharing information, building relationships within the IT community, and giving “techies a reason to get together and have fun.” But it wasn’t only the techies who turned up. Typically they would be surrounded by a group of marketing people who needed to understand the way they were thinking. “To sell a product or an idea it’s the technical people who have to validate it and say whether something could or couldn’t be done, and ultimately that’s the way it worked. At those conferences the vendors began to look at how they could implement their applications across different platforms.”
Initially there was strong resistance, particularly from larger vendors such as IBM, Digital, and Hewlett-Packard, who were only interested in their proprietary operating systems. Early Unix variants came in through specialised mini computers and workstations, including Sun Microsystems, NCR, Data General, Tandem, and Silicon Graphics. Hicks believes UniForum had a major impact on the adoption of Unix by local business.
He recalled setting up access to UUNET for a company in Auckland that wanted to exchange emails and swap software with its parent company, a supplier in the United Kingdom. “They had been sending floppy disks by post and it was taking a week or two to turn around. They asked me if there was an easier way of doing this. It wasn’t as robust as it should have been; there were problems with the software and operating system which meant the automatic dial-out every night didn’t work sometimes, but these were very early days,” said Hicks.
“Instead of username@machine.whatever the email address actually had the route specified in the email so you’d have machinename!machinename!machinename!username… and it would send it from each machine down the list. So my machine would send it to the first one on the list, which would rip its name off and send it to the second one on the list and so on. So this was called bang-path addressing which was used before there was any routing,” he recalled.
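A rough sketch of the logic Hicks describes, with hypothetical machine names; real UUCP software layered queuing, retries, and loop detection on top of this:

    # Each machine strips the next hop off the front of the remaining
    # bang path and hands the rest over when it dials that neighbour.
    def forward(path: str) -> None:
        if "!" not in path:
            print(f"deliver locally to user {path!r}")
            return
        next_hop, rest = path.split("!", 1)
        print(f"dial {next_hop!r} and hand over {rest!r}")

    forward("alpha!beta!alice")  # dials 'alpha', which then forwards 'beta!alice'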
Waikato joins CSnet
At Waikato University John Houlker had begun to show an interest in how the country’s universities could make better use of local and international network resources. “One of my early tasks was looking at the campus connections and weighing up what we should do about a local area network. This was at a time when Ethernet was emerging and there was controversy about whether Token Ring[23] was going to become the standard. One view was that since IBM was behind Token Ring it would ultimately swamp what everyone else was doing. Some universities were hesitating but the evidence I had was that Ethernet looked pretty good and so I was involved in establishing the early Ethernet around the campus,” said Houlker.
After his early involvement using Ethernet and X.25 communications on campus, he moved from part-time programmer and full-time student to full-time programmer and staff member. PCs were still on the horizon and everything was focused on centralised computing. A major step towards improving communications speed was the replacement of the Ethernet trunks around the campus with glass fibre. “That was driven by some major problems we had with lightning, which was striking the copper Ethernet trunks and blowing up equipment, causing it to short out all over the place and leading to great damage and delay.”
At the time there was no clear direction for university networking and no shortage of possible options. Auckland and Victoria universities wanted an academic research network with Bitnet protocols in addition to the Unix-to-Unix protocols. The ISO-OSI Coloured Book argument continued, and there was strong interest in what was happening at universities in Australia, in the United Kingdom with EARN (the European Academic and Research Network), and with the Unix-to-Unix and TCP protocols being used in the United States. Houlker and other pioneers of the time, including John Hine and Neil James, were intimately interested in what this meant for university networking as a whole, so they immersed themselves in the technical papers and in open discussion about the way forward.
On the shortlist were UUCP, RSCS protocols, Coloured Book, and TCP/IP, plus proprietary vendor-specific approaches such as PrimeNet and DECnet. At the time DEC was claiming its network would merge with ISO-OSI. In fact it was all getting rather complex, with universities having to co-ordinate various protocols for different networks and services across various departments. The Reuters ‘Business Briefing’ information service required its own leased line, and the New Zealand Press Association (NZPA) another. The National Library had its IBM SNA network to access bibliographic records, which required separate connections into each university. Houlker took it in his stride, aware that being up to date on all the options was part of the job.
He had been keeping a close eye on innovations coming from ARPA, particularly from 1986 when the NSF made a huge grant and created the NSF backbone (NSFnet). Concurrently the UK-based Janet network had taken off, and the NZPO was starting to widely deploy X.25 packet switching over its Pacnet public data network. The advice from the so-called experts at the NZPO was that X.25 would become its dominant platform for data networking, and that the price was destined to drop, making it even more economical than leased lines. Waikato and several other universities were already using Pacnet and several were using proprietary DEC software over X.25. “The cost side of it definitely looked bad. We were paying for all these different services and specific circuits and beginning to wonder whether this was such a great idea. While Telecom was insisting X.25 was the way to go, and that it would become a much cheaper route, the software for running networking over X.25 was thin on the ground and expensive, especially for TCP/IP.”
Houlker, Hine, and other senior people in the university networking arena had a sense this TCP/IP stuff was probably more important than they realised, but running it on X.25 was problematic. In fact X.25 was hardly used in the United States. “What we were witnessing was more or less a transatlantic divide. On one side in the US you had Department of Defense-led projects on TCP/IP networking that used raw data circuits and could route around or avoid specific telecommunications carriers. With this approach you could access unused capacity from multiple telcos and build a network on top of it. On the other side of the Atlantic, in the United Kingdom at the Rutherford Appleton Laboratory and other locations, British Telecom was backing a packet data network structure which had very defined customer interfaces. That’s what X.25 is; it’s not a specification for a packet routing protocol, it’s a specification for the service interface, built around the idea that you’d have to have a telco service provider and then a data network customer,” Houlker explained.
It was clear the NZPO was loyally following the United Kingdom and Europe, believing X.25, Coloured Book, and the OSI stack would all become part of a ratified international standard. Houlker began speaking to his peers in the United Kingdom and managed to get a good deal on the software needed to get a network up and running. He even brought out an expert to help build a gateway to translate email from the Coloured Book protocols to the other networks being used at Waikato. “We had some hard decisions to make and with some reluctance went down the Coloured Book route, thinking this worked best with an X.25 national network.” In Australia similar approaches were made so there would at least be an easier connection with universities there and in the United Kingdom.
Canterbury University was still dialling up Canada and Melbourne. Victoria was using the Unix UUCP mail and newsgroup utilities to connect to Sydney and to Canterbury. While there had been various, often clumsy, gateways into and out of the country, some dating back to the 1970s, none qualified the user for an Internet domain name. Not one to accept a single solution to a complex problem, Houlker began looking more closely at the US Computer Science Network (CSnet). In late 1985 he had discussions with CSnet founder and Internet hero Larry Landweber, at the University of Wisconsin, a key node on the NSFnet backbone. One of the briefs of Landweber’s CSnet International Liaison group was to help universities who couldn’t afford to connect to NSFnet, including international participants. A Unix gateway onto the backbone had been established in Maryland, and there was a big debate about whether commercial organisations should be allowed access or whether the Department of Defense should partition off some of the Unix-to-Unix stuff that was in the private sector.[24]
“CSnet was kind of a royal road for us into the Internet domain. They did go to some lengths over a couple of months to check who you were and what you were doing. Larry Landweber was held in some esteem, so I imagine once he had approved us, Jon Postel at IANA (Internet Assigned Numbers Authority) would have accepted we were a trustworthy organisation,” said Houlker. An email was sent confirming the arrangement with Waikato University as the dot.nz country code domain holder. Technically the New Zealand top-level domain dot.nz, largely proposed through the Janet and OSI networking regimes, was assumed by the New Zealand vice-chancellors’ Standing Committee on Computing. When Houlker struck the arrangement with CSnet, the country code effectively fell to him as gateway manager at Waikato University. “We weren’t directly connected to the Internet or using TCP/IP. We were running special CSnet software for dial-up email; in fact we used a Pascal version of MMDF (the multi-channel memo distribution facility) that was a little easier to use and more flexible than the UUCP software.” CSnet ran the DNS for the various zones, including dot.nz, mapping the MMDF mail world onto IANA’s DNS.
The DSIR’s AMD had earlier applied for a link to CSnet but was rejected, as CSnet would connect only universities. It had to wait for Waikato to get its connection up and running, as did a number of universities keen to share the connection. However there was a frustrating wait over several months before problems were ironed out.
Houlker was trying to make the most of the opportunity and keep costs down. He modified the software to run over X.25 rather than ordinary dial-up, so it would be easier to manage and avoid the cost of international voice calls. “I certainly had a tough time being so far away from the University of Wisconsin, which had some of its managers based in Boston. It took two months just to change a password. Their set-up was wrong and it took a long time to convince them to change it. They were claiming my changes to the software were faulty and I felt that they had just misconfigured some stuff at their end. Eventually we got it going.”
Battle of the protocols
While the use of the Coloured Book protocols gave the universities immediate access to Janet, Spearnet, and related networks, this limited any further adventuring into cyberspace.[25] Spearnet failed to develop as hoped, largely because the successful Australian-based ACSnet (Australian Computer Science Network) was already delivering many of the functions users were seeking.
To gain access to a larger international community, Victoria University had initiated connections between its Unix systems, including a DEC VAX 11/780 that also participated in the X.25-based Spearnet, and other DEC machines. It managed to link the VAX to the IBM using the same techniques used for connecting with the IBM mainframe at Auckland, and in the process discovered its in-house-developed email worked across both. It also connected to the Unix networks at Melbourne University and with the University of Calgary in North America. Ultimately it achieved a link with the main UUCP hub on the West Coast of the United States. This provided access to electronic mail and Usenet. Some of the hosts on the network used VAX/VMS machines with connections to Telecom’s Pacnet and accessed Digital’s PSI mail to communicate with similar sites.
Victoria University technician Mark Davies had a huge influence on the way things developed, building the Computer Science Department’s network and the software and systems that connected the academics through the campus-wide network. Davies had a computer science undergraduate degree from Victoria and had never left. “In computer science there’s a strong interest in connecting things and trying to get past the tyranny of distance to find out what’s going on in the rest of the world. I was keen to grow my knowledge and having access to Usenet meant having competent, intelligent people to talk to about interesting things,” he said.
One of Davies’s interests was accessing as much free software as he could find and making enhancements before firing it back to the broader user community, who were then able to make further enhancements. He was a heavy user of the emacs editor,[26] and had a Lisp interpreter,[27] which he used to add features. For example, the pre-eminent mail distribution package at the time, sendmail, contains code written by Davies, as does BIND.[28] “I would find the limitations and fix it. In those days we would take an entire Usenet dump every couple of months so we got ‘a dose of the world at a time,’ delivered on half-inch tape.”
With a connection to Sydney, Melbourne, and Canada and access to international email and Usenet, Davies and others in the Computer Science Department had a sense they were tuning in to the wider world, even if it was through three-inch magnetic disks or email that took days to be delivered through multiple computers. He’d kept a close eye on the Berkeley Unix developments in dial-up networking. “We knew the Internet was out there – essentially ARPANET in the United States with links into Europe – but the high cost of international communications was prohibitive and it seemed as though it wasn’t going to get to New Zealand in our lifetime. So we connected our own little IP networks over Ethernet through Unix systems.”
The local Unix group had contributed a huge hard disk on permanent loan to Victoria University to handle the store-and-forward communications with Melbourne University. In turn Victoria University, which was beginning to have much closer contact with the business community, turned up at the UniForum conferences and took a commercial booth. “We were consulting in a way, although it was more like preaching. We were definitely passionate,” explained John Hine. In February 1986 Victoria began providing links to the DSIR and MAF networks. By mid-year Waikato, Massey, and Canterbury Universities had joined in, using the US links for mail and access to newsgroups.[29]
Tony McGregor, who was working at Massey’s computer centre at the time, said the first UUCP link out of Massey was to Auckland in 1986, using the leased voice circuit put in place to reduce the cost of toll calls. “We got access to the circuit in the evenings. I think we also, briefly, had a link to Victoria over a similar circuit. It took me months to organise because of a senior manager who said: ‘There will never be computers connected to my telephone exchange.’ Our victory was short lived because he imposed daytime toll rate charges on us and effectively killed the project. We made the DSIR link a few months later.” The Massey node was a 68000-based Unix Altos computer.[30]
A year later the AMD of the DSIR in Wellington and Auckland, Fisheries Research in Wellington, and ICL Computers in Wellington were on the Unix network. By February 1987 every university except Canterbury had implemented the Janet Coloured Book protocols, with electronic mail based on the Grey Book mail protocol. Blue Book was for file transfer and Yellow Book was the transport layer, with the carrier layers running over the Post Office Pacnet X.25 network. Massey University implemented the network on its Prime computers while other universities had implemented the protocols on Digital VAX/VMS systems. Canterbury ordered a VAX running VMS so it could join the network by mid-1987.
Gateway to many paths
As a compromise, or simply because they could, the mail gateway at Victoria had now expanded to embrace a potpourri of protocols – Spearnet, ACSnet, and DSIRnet, which by now linked most of the DSIR’s sites around the country – as well as the Unix network. Waikato University created a mail gateway between Spearnet and CSnet. Victoria’s now comprehensive mail gateway service meant it became the first Internet service provider (ISP) in the country, with Professor John Hine and his team the first to introduce a commercial dial-up service.
“We went to the university administration and said we wanted to open up an account for a self-funding project which involved email and UUCP news. They looked at us and didn’t understand any of it. We had dial-up links and leased lines at 4.8kbit/sec and 9.6kbit/sec, and international charges were $25 per megabyte, but at only 5 cents for a two-kilobyte email that was still a lot cheaper than the Post Office could offer with snail mail.” Hine said the cost of sending electronic mail within New Zealand was around 10 cents per A4 page; to Australia it was 60 cents and to the rest of the world, $1.20. With no funding available for accessing international research networks, even though access was most often for legitimate research purposes, Victoria’s telephone bill with the Post Office began to escalate. “At one point we had racked up $16,000 in the red.”
A plan for recovery of costs was put in place but Victoria University had broken the drought. The innovation and determination of the technical gurus at its Computer Sciences and Computer Services Departments had pushed the boundaries where no one had connected before. As well as providing Internet-like connections to the outside academic and research world, and the first major links between universities and research groups locally, Victoria had five organisations in Wellington, including the university itself, the DSIR, and a couple of commercial businesses, prepared to pay for email.
Frank March had applied for the job of director of the Computer Services Department at Victoria in 1987, inspired by a report from Professor of Classics Chris Dearden, who foresaw the pervasive use of desktop computing and how this was about to change life, particularly in the university environment.[31] “It was quite a far-sighted and daring idea at that stage to suggest you might have a computing strategy which resulted in a computer on every desktop in the university. I wanted to be part of this,” said March.
Email was the killer application in the DSIR, but at Victoria University it turned out to be connecting desktops to the library. Academics were very keen to find out if specific books were still on the library shelves before setting off on foot to discover someone else had them on loan. The library catalogue was an early on-line service and, according to Hine, did more to speed up networking on campus than anything else, further fuelling the idea of a computer on every desk. “The important part of the vision was that you could be on your desktop and see the local network and the rest of the world, through access to the Internet.”
Desktop revolution predicted
Professor Dearden’s report was based in part on a two-hour seminar to the university entitled ‘A Computer on Every Desk?,’ a 32-page discussion paper and questionnaire, and two days of discussions with key university and industry people. It proposed that the university invest in desktop PCs and local and international networking capability as soon as practically possible.
At the time of the report the combined cost of a workstation, software, and networking was likely to be $10,000 per desk,[32] with an overall cost of about a million dollars.
Sort out this mess
Professor John Hine’s 1987 technical report, ‘Research Networks in New Zealand,’ published by the Department of Computer Science at Victoria University and partly funded by the AMD of the DSIR, was the perfect follow-up document. While Dearden had taken a visionary approach predicting the growth curve of desktop computing right through to the need for ergonomic considerations, Hine’s report got down to the nitty-gritty of networking.
His disclosures about the painful protocol wars and the thankless task of cobbling together network solutions without any central direction would be a revelation to the wider community, revealing what he and other pioneers in the search for academic connectivity had been through. It also delivered a strong warning about the obstacles ahead.
Released in the summer of 1986–1987, Hine’s report set out to identify weaknesses in current developments and assist in planning. “At Victoria we quickly learned the importance of being a member of this international community. It was our view that the ambition of a national university network was not sufficient, we should be aiming for a research Internet, connecting academic, government and commercial organisations,” said John Hine.
Coloured Book protocols allowed the universities to construct a heterogeneous network during 1986 running on DEC and Prime equipment, but by the end of the year most had moved to DEC equipment for communications, ironically negating one of Coloured Book’s major advantages. Problems had also arisen with creating gateways to LANs. The Yellow Book transport protocol assumed the underlying X.25 network could establish an end-to-end connection but made no provision for internetworking. In the absence of X.25 LANs on campuses, it was impossible to reach machines not connected to the Post Office’s Pacnet.
There was no provision for a name service in the Yellow Book definition, which required each site to keep an up-to-date list of all other sites they might wish to communicate with. Consequently the Grey Book mail implementations were only capable of delivering mail to machines connected to Pacnet, with no way for other machines to get at the addressing information. Thus it was impossible to build a mail relay to interface with campus LANs. While Victoria had created its own relay system, it sometimes caused multiple copies of a message to be sent due to the lack of information.[33]
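The consequence is easy to see in a sketch: with no name service, a Grey Book relay could deliver only to hosts in its own hand-maintained table, so machines on a campus LAN were simply unreachable. The site names and X.25 addresses below are invented for illustration:

    # Every site kept its own static list of Pacnet-connected hosts;
    # nothing outside that list was routable.
    PACNET_SITES = {
        "vuwcomp": "05301234",  # invented Pacnet (X.25) addresses
        "aukuni": "05305678",
    }

    def relay(host: str) -> str:
        """Look up the X.25 address a Grey Book relay would dial."""
        if host not in PACNET_SITES:
            # A machine on a campus LAN has no entry in the table,
            # so mail addressed to it cannot be relayed at all.
            raise LookupError(f"no route to {host}")
        return PACNET_SITES[host]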
By 1987 a growing number of research organisations had well-developed networks with increasing workstation-to-staff ratios, giving easier and more frequent access to network services. The interconnection of those networks had developed in an ad hoc manner and was nowhere near as well established as in Australia, the United States, and the United Kingdom. Connection to overseas networks was not widespread.
While many organisations such as the New Zealand Bibliographic Network (NZBN) and INFOS, which offered statistical information about New Zealand, had plans to establish information sources, the ability to communicate with users on other networks was still new. The rules about how to access these services were varied and confusing for users.
Hine concluded that while the country was in the early stages of establishing its own Internet to offer electronic mail services based on DSIRnet, Spearnet, and a Unix network, the DSIR and MAF were headed in a different direction with their DEC-based networks. The dilemma was whether to drop Coloured Book in favour of the emerging TCP/IP protocols already in use at some campuses and becoming available off-the-shelf at relatively low cost. While the favoured British OSI approach was embracing elements of the Coloured Book protocols and gaining ground in New Zealand, there remained widespread confusion about the way forward.
Why would New Zealand researchers, having had a taste of the connected world, not want continued access to computing resources in North America? The growing reliance on TCP/IP protocols would require New Zealand to make a committed change. Hine suggested it was important for research networks in New Zealand to reach agreement on a smooth, reliable, ubiquitous nationwide network for exchanging email. He knew then what many later would take for granted: “It is only a short time from experimenting with electronic mail to becoming dependent on it.” It wasn’t sufficient for some machine somewhere in the organisation to offer electronic mail.
The evolution of additional services and enhancements would be pivotal to the way any national network was developed.
Unlike many other countries, New Zealand did not have any national body with overall responsibility for research and development.[34] In the absence of a centralised body or any funding, an ad hoc group had taken responsibility for establishing a research network in New Zealand. There was no operating budget, and the group was unable to make and implement decisions; it could only recommend what should be done and hope for follow-through. Consequently various universities set up administrative procedures to charge other universities for the communications costs of gateways and other services. Hine said this added unnecessary cost to the development of the Internet locally and discouraged its use by the research community.
One option was for the DSIR and the University Grants Committee to jointly fund a group to administer and develop the network, but this would result in the university (Victoria) and DSIR supporting the remainder of the community. Hine concluded that “every effort should be made to find some basis for continual funding of New Zealand’s research networks and the Network Information Centre.”
New Zealand needed to act quickly to establish addressing domains and standardised naming conventions to ensure electronic mail got to and from the desired destination.[35] The ISO had assigned a standard two-letter abbreviation to each country, and everyone but the US had adopted these as their top-level domains. New Zealand’s top-level domain was .nz, and responsibility for managing it was assumed by the New Zealand Vice-Chancellors’ Standing Committee on Computing. At a February 1987 meeting it was decided New Zealand would follow international precedent and have three second-level domains: .ac, the academic domain, would cover universities, polytechnics, and schools; .govt would cover all government departments; and a third domain, .co, would cover commercial organisations. At the time no organisation had assumed responsibility for the commercial sector and the commercial domain had yet to be brought into existence.
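The scheme was a strict hierarchy: the country code on the right, a category second-level domain beside it, and the organisation to the left. The following sketch is purely illustrative, not software deployed at the time, showing how a name under the 1987 scheme breaks down:

    # Illustrative only: decompose a host name under the 1987 .nz scheme.
    SECOND_LEVEL = {
        "ac": "universities, polytechnics, and schools",
        "govt": "government departments",
        "co": "commercial organisations",
    }

    def classify_nz_name(hostname):
        """Return which community a name such as 'vuw.ac.nz' belongs to."""
        labels = hostname.lower().split(".")
        if len(labels) < 3 or labels[-1] != "nz":
            raise ValueError("%r is not a name under .nz" % hostname)
        if labels[-2] not in SECOND_LEVEL:
            raise ValueError("unknown second-level domain %r" % labels[-2])
        return SECOND_LEVEL[labels[-2]]

    # classify_nz_name("vuw.ac.nz") -> "universities, polytechnics, and schools"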
Hine urged the private sector to become involved in the development of this network to help foster collaboration in research activities. He said establishing support for domain naming and addressing should receive top priority as it would “facilitate international connections and ease of use for the general research community.” He reiterated his concern at the lack of consistent funding for the development of a national research network. “While I have no doubt that the payoffs are so great that the network will evolve, I fear the evolution will be frustratingly slow because of a lack of commitment of funds. This remains the biggest obstacle to the development of a comprehensive research network.”[36]
Early warnings ignored
The lack of a central body to co-ordinate research and development, and with it a national network, had been highlighted in the 1986 report of a ministerial working party on science and technology, ‘Key to Prosperity: Science and Technology,’ also known as the Beattie Report. It concluded that New Zealand should double its overall R&D spending across the public and private sectors by the 1993–1994 year. The report stated that if New Zealand was to improve its ‘knowledge, innovation, and productivity’ and keep pace with other nations, it needed to shift from labour-based to knowledge-based industries with an adequately trained and informed workforce, and ensure ‘managers and boardrooms’ were more aware of the potential of R&D to facilitate innovation.
“We are convinced that New Zealand’s overall present performance in all three aspects is less than adequate to achieve a significant rate of real growth. Market forces cannot be expected, unaided, to influence these important factors sufficiently to allow New Zealand to hold its own against competition, let alone do better,” the report said. While accepting the principle of user pays, and that “the beneficiaries of particular research should be identified wherever possible and expected to pay,” the report noted that the principles of the market philosophy can fail when applied to science. “For these reasons, a significant part of a country’s wider research and development has to be recognised by the government as a public good and be supported by public funds.”
The working party recommended the establishment of a minister of science and technology, a cabinet committee on science and technology, and a science and technology advisory board to respectively enhance the profile of science and technology; develop science policy; and advise the government on R&D issues. At the science management level, the establishment of a Science and Technology Council was recommended, to allocate and fund basic and applied research through existing government departments, universities, and research associations.
The Beattie Report was ignored by the government, which promptly established another committee to review science and technology.
The idea was that placing government research institutions on a more commercial footing would overcome the management restrictions implicit in the Public Finance Act, which measured expenditure rather than outputs or outcomes, and enable those institutions to engage in contractual arrangements only possible in the commercial sector. This, it was argued, would encourage private investment in science and technology.
It was clear that the visionary approach taken by the pioneers within the computer science departments of the various universities, the evidence of numerous reports into the inevitable growth in the use of technology, the need for universities to connect to each other, and the importance of a standardised, co-ordinated approach to wide-area and international networking were all falling on deaf ears. Persistence, and funding for the best ideas, were imperative to avoid atrophy or, worse, uninformed decision-making.
Footnotes
[1] Bitnet came to mean ‘Because It’s Time Network,’ although the original meaning was ‘Because It’s There Network’
[2] At its peak around 1991, Bitnet extended to around 500 organisations and 3,000 nodes within educational institutions. It spanned North America, Europe and some Persian Gulf states but as TCP/IP systems reached maturity and the Internet went mainstream in the early 1990s its popularity rapidly diminished. en.wikipedia.org/wiki/BITNET
[3] This increased to a 2Mbit/sec backbone with 64kbit/sec access links in the mid-1980s
[4] Paraphrased from http://en.wikipedia.org/wiki/JANET
[5] Distilled from A Brief History of the Internet by Barry M. Leiner, Vinton G. Cerf, David D. Clark, Robert E. Kahn, Leonard Kleinrock, Daniel C. Lynch, Jon Postel, Larry G. Roberts, Stephen Wolff: http://www.isoc.org/Internet/history/brief.shtml
[6] This was upgraded to 1.5Mbit/sec by 1988
[7] When the strain of the burgeoning ARPANET began stretching the resources provided voluntarily by the larger UUCP hubs in the United States, Rick Adams, a systems administrator at the Centre for Seismic Studies, began looking at a way for commercial organisations to alleviate the burden. UUNET Communications Services began operation in 1987 as a non-profit corporation providing Usenet feeds, email exchange and access to a large repository of software source code and related information. It shed its non-profit status within two years and changed its name to UUNET Technologies. From 2006 UUNET became an internal brand of Verizon business (formerly MCI). Source: http://en.wikipedia.org/wiki/UUNET
[8] Edited from the research of Richard T. Griffiths, Leiden University: http://www.let.leidenuniv.nl/history/ivh/chap2.htm
[9] NWG, RFC 722, Jack Haverty (MIT), September 1976
[10] REXX (REstructured eXtended eXecutor), an interpreted programming language developed at IBM. It was designed to be both easy to learn and easy to read
[11] The company best known for its work on packet switching, ARPANET, and the Internet was acquired by GTE in 1998; GTE then merged with Bell Atlantic to become Verizon in 2000. BBN has been in private hands since 2006. en.wikipedia.org/wiki/Bolt,_Beranek_and_Newman
[12] Robert Biddle, formerly of Canterbury University Computer Science Department, in a posting to the nz.general newsgroup, 19 April 1994
[13] Biddle chose Cantuar because he expected ‘canterbury’ would one day become a domain name: “Cantuar was also the official abbreviation for the Latin name the Commonwealth Universities Office used for the university, cantuariensis. If you had a BSc from there, you would write J. Bloggs, BSc (Cantuar). A few years later they did away with Latin and made the abbrev ‘cant’ instead. I was disappointed when I got flamed from many places for having a site name that people thought was based on centaur spelled wrong.”
[14] In Unix and other computer multitasking operating systems, a daemon is a computer program that runs in the background, rather than under the direct control of a user and is usually initiated as a process. The term was coined by the programmers of MIT’s Project MAC and derived from Maxwell’s daemon, an imaginary being from a famous thought experiment that constantly works in the background, sorting molecules. Daemons are also characters in Greek mythology, some of whom handled tasks that the gods couldn’t be bothered with. Source: http://en.wikipedia.org/wiki/Daemon_(computer_software)
[15] Robert Biddle
[16] Ibid
[17] Ibid
[18] In 2007, head of the School of Mathematics, Statistics and Computer Science.
[19] http://en.wikipedia.org/wiki/PDP-8
[20] Minutes of the 26 August 1985 meeting called to discuss university networking, taken by John Hine
[21] John H. Hine, ‘Research Networks in New Zealand,’ Technical Report CS-TR-87-021, 12 September 1987
[22] Brian Peace was the first chairman of NZUSUG
[23] An IBM topology for local area network connections in which all the computers are connected in a star or ring and use a bit or token-passing scheme to prevent the collision of data when messages are being sent concurrently. Token Ring was quickly surpassed by Ethernet on most networks as local area networking came of age
[24] The Unix gateway in Maryland was deemed an experiment that might last only a year but in fact it lasted forever, became AlterNet, and was subsequently bought out by MCI, then later by Worldcom, and finally Verizon (Houlker)
[25] Cyberspace was a term coined by William Gibson in his 1984 novel Neuromancer, in which he described a futuristic computer network that people could plug their minds into. The term quickly became synonymous with the Internet or on-line world
[26] The emacs editor is arguably the editor of choice among many software engineers, and is often referred to as the code cutter’s Swiss army knife
[27] Lisp, from ‘list processing,’ was originally created as a practical mathematical notation for computer programs. It was one of the earliest programming languages, pioneering many ideas in computer science including tree data structures, automatic storage management, dynamic typing, object-oriented programming, and the self-hosting compiler. Linked lists are one of Lisp’s major data structures, and Lisp source code is itself made up of lists, so Lisp programs can manipulate source code as a data structure. http://en.wikipedia.org/wiki/Lisp_%28programming_language%29
[28] BIND, a DNS software package for Unix/Linux machines. It contains a DNS server, an API library, and tools. Most of the DNS servers on the Internet run BIND
[29] Hine
[30] Tony McGregor, History of the Internet debate on NZNog newsgroups, March 2001
[31] Everyone knows a classicist is about as far away from computing as you can possibly get. What happened, says March, was as soon as he was appointed, somebody went and dumped a Macintosh on his desk, and he became a computing aficionado
[32] Report of the Committee on Information and Computer Services, November 1985, Professor Chris Deardon, 5.1
[33] Hine
[34] In the US, DARPA funded the ARPANET, while CSNET received five years of support from the US National Science Foundation (NSF) and Bitnet received substantial support from IBM. Janet in Great Britain was funded by the Computer Board and Research Councils, and Japan’s JUNET was funded by KDD International Telephone and Telegraph
[35] Domains act like a street address, giving the postal service an exact location for your letters and parcels, or like a telephone number, ensuring calls get through to your house or business.
[36] Hine