Centre for Internet & Society

A holistic reflection on information networks and their regulatory frameworks becomes possible only when the medium-specific boundary that has long separated Internet and telecom networks begins to dissolve. Only then do the points of contention around network security and privacy reveal themselves objectively: namely, in the historic role of the intermediary at data/signal switching and routing nodes.

It would be unfair to recount the history of the Internet without looking at how analog information networks like the cable and wireless telegraph, and later the telephone, almost coincidentally necessitated the invention of automated networks for remote machine control and peer-to-peer communication, networks that promised to drastically reduce intermediary overheads. While much of the world was embroiled in patent wars over wired private networks, the first nodes of the 'open' Internet were built during a two-week global meeting of computer scientists who were flown in simply to prepare for 'a public exhibition' of the ARPANET in 1972.

While New Delhi received its first telephone only well into the 20th century, to most of the Indian working class "telegraph laws" always meant the ominously urgent telegram bringing news of a dear one who had taken seriously ill. On a lateral note, then, it is apt to bring to light the life of one Mr Almon Brown Strowger, for whom the idea of an automatic telephone exchange was born of the 'business of death'.

The Automatic Telephone Exchange

Almon Strowger was an undertaker based in Missouri, in a town with one other undertaker, whose wife happened to be an operator at the then manual telephone exchange. Strowger came to believe that he received fewer phone calls because his competitor's wife preferentially routed callers seeking Strowger's funeral services to her undertaker husband instead. Strowger conceived the initial idea in 1888 and patented 'The Automatic Telephone Exchange' in 1891. http://goo.gl/oieIJ

Popularly known as the 'Strowger switch', the Step-by-Step (SXS) switch consisted of two interfaces. At the customer's end, telegraph keys (and later a rotary dial) sent a train of electric current pulses corresponding to the digits 0 to 9 all the way to the exchange. The actual Strowger switch at the exchange was an electromechanical device that stepped vertically to select one of 10 levels and then rotated to select one of 10 contacts in each level: a total of 100 choices. The Strowger Automatic Telephone Exchange Company was consequently formed in 1892 in Indiana, with about 75 subscribers. Strowger later sold his patents for $10,000 in 1898 to the Automatic Electric Company, a competitor of the Bell System's Western Electric. His patents were eventually acquired by the Bell System for $2.5 million in 1916, a measure of how much growth and investor interest the telephone industry had attracted by then.
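The two-motion selection described above is easy to model in code. The sketch below is a toy illustration, not a description of the actual electromechanical hardware: one pulse train steps the wiper vertically, a second rotates it, and the pair of digits addresses one of 100 outlets.

```python
def dial_to_pulses(digit):
    """On a rotary dial, digit d sends d current pulses; 0 sends 10."""
    return 10 if digit == 0 else digit

def strowger_select(vertical, rotary):
    """Toy model of the two-motion Strowger selector: the first digit
    steps the wiper to one of 10 vertical levels, the second rotates
    it to one of 10 contacts in that level -- 100 possible outlets."""
    if not (0 <= vertical <= 9 and 0 <= rotary <= 9):
        raise ValueError("each pulse train encodes a digit 0-9")
    return vertical * 10 + rotary  # index of the selected outlet
```

Dialing '3' then '7', for instance, selects outlet 37 of 100, with no human operator in the path to bias the routing.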

Switching Paradigms

The architecture of global communication was headed in different directions, towards different ideals. Most media historians contrast the two methodologies as 'circuit switching' and 'packet switching': a connection-oriented, fault-intolerant system on one hand and a connection-less, fault-tolerant protocol on the other, both developed concurrently. In reality, however, a major driving factor was the set of stakeholders backing the infrastructure of the rapidly growing communication industry, who were looking for growing returns on their investments. These parallel ramifications may therefore also be viewed through the lens of closed, proprietary, medium-specific networks versus an open, shared, medium-agnostic paradigm of information theory.

Circuit switching relied on an assured, dedicated connection between two nodes, and was especially patronized by an industry that saw telecommunication as the latest fad in urban luxury (a key factor in the urban-suburban divide, as the affluent moved into urban areas that were 'connected' by telephone). Owners and manufacturers of the hardware infrastructure became the most significant stakeholders. The revenue model was based on the amount of time the network was used, and hence suited analog voice telephone networks. The entire bandwidth of the channel was made available for the duration of the session, along with a fixed delay between communicating nodes. Even if no information was being transmitted during a session, the channel was not made available to anyone else waiting to use it until released by the previous party. Early telephone exchanges relied on manual labour to facilitate switching until the automated exchange came about.
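The economics just described (the whole channel held exclusively, billing by connection time, idle time still billed) can be sketched as a toy model; the class and rate below are illustrative assumptions, not any real exchange's billing system.

```python
class Circuit:
    """Toy model of a circuit-switched channel: the full channel is
    held by one session and billed by connection time, whether or
    not any information actually flows during it."""

    def __init__(self):
        self.holder = None
        self.connected_at = None

    def connect(self, caller, minute):
        if self.holder is not None:
            return False  # busy: channel unavailable until released
        self.holder, self.connected_at = caller, minute
        return True

    def release(self, minute, rate_per_min=1.0):
        """End the session; revenue scales with time held, not data sent."""
        charge = (minute - self.connected_at) * rate_per_min
        self.holder = None
        return charge
```

A second caller attempting to connect while the circuit is held is simply refused, mirroring how a silent but unreleased telephone line still blocked everyone else on that channel.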

Packet switching, on the other hand, leaned towards a paradigm of shared bandwidth and resources and, more importantly, approached communication with complete disregard for the medium of transmission, be it wired or wireless. It likewise disregarded the content, modality and form of the communication, taking an objectified, data-centric approach. Information to be transmitted was divided into structured "packets" or "capsules". These packets were all 'thrown' into the shared network pool alongside numerous other such packets, each with its own destination, to be carefully buffered, stored and forwarded by intermediary routers in the network. Apart from occasional packet loss, the time taken to send a message is indeterminate, depending on the overall traffic load on the network at any given time.
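The split-route-reassemble cycle above can be sketched in a few lines. This is a deliberately minimal toy, with made-up packet sizes and a shuffle standing in for indeterminate transit times, not any particular protocol's format.

```python
import random

def packetize(message, size=4):
    """Divide a message into headered packets: (sequence number, payload)."""
    return [(i, message[p:p + size])
            for i, p in enumerate(range(0, len(message), size))]

def route(packets):
    """Toy store-and-forward hop: packets in the shared pool may arrive
    in any order, since transit time depends on overall network load."""
    pool = list(packets)
    random.shuffle(pool)  # stand-in for indeterminate delivery order
    return pool

def reassemble(packets):
    """Receiver reorders by sequence number and rebuilds the message."""
    return "".join(payload for _, payload in sorted(packets))
```

However the intermediaries shuffle the packets in transit, the sequence numbers in the headers let the receiving node reconstruct the original message, which is precisely what makes the paradigm tolerant of faults in any single path.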


Work pressed on into the early 1960s towards the development of an open architecture to enable network communication between computer systems, culminating in the invention of the 'Interface Message Processor' (IMP), which promised to herald the era of packet switching by enabling the ARPANET (Advanced Research Projects Agency Network), the first wide-area packet-switched network and precursor to the Internet as we know it today.

While the Information Processing Techniques Office (IPTO) had previously contracted Larry Roberts, who in 1965 had linked two computers, the TX-2 at MIT and the Q-32 in California, in one of the earliest wide-area networking experiments, a growing need was felt for a centralized terminal with access to multiple sites that would enable any computer to connect to any site. The first IMP was commissioned from the engineering firm BBN (Bolt, Beranek and Newman, founded by two MIT professors and a former student).

(The very first Interface Message Processor by BBN: Courtesy: http://goo.gl/tvo8n)

By 1971, the four original nodes of the ARPANET (viz. UCLA, the Stanford Research Institute, the University of Utah and the University of California at Santa Barbara) had expanded to 15 nodes, but the lack of a common host protocol meant that full-scale implementation and adoption of the ARPANET was far from complete. The time had come to let the public engage with the promising future that the Internet held. What ensued was the first public International Conference on Computer Communication (1972) (http://goo.gl/PFhtL), held under the umbrella of the IEEE Computer Society at the Hilton Hotel, Washington D.C. In many ways the event was the original version of a modern-day new media art 'hackathon', involving about 50 computer scientists flown in from around the globe alongside the likes of Vint Cerf and Bob Metcalfe. The deadline of a public demonstration provided the much-needed impetus to drive the network to functional completion. Exhibits included a variety of networked applications: the famed dialogue between the 'paranoid patient' chatbot PARRY and the 'doctor' ELIZA, motion control of the LOGO 'Turtle' across the network, and remote access to digital files that were printed on paper locally. A milestone in distributed packet switching had been achieved, and the stage was set to compete with the archaic paradigm of circuit-switched networks, even as delegates from AT&T (incidentally one of the funders of the event) watched on in the hope that the demonstration would run into a fatal glitch.

Who Minds Maxwell's Demon?

It may not be boldly evident from the vast corpus of policy research surrounding the regulation of communication networks (be it network security, privacy, anonymity, surveillance or billing systems), but the key points of control where these dynamics play out are the interfacing nodes and data/signal switches, at transceiver nodes as well as intermediary ones. This is further underlined by the historical fact that the invention of the automatic telephone exchange was fuelled by the need to ensure unbiased circuit switching within the context of a networked business.

Just a glimpse at the number of patents that directly or indirectly reference the Automatic Telephone Exchange patent brings to light myriad applications, ranging from "Linking of Personal Information Management Data", "Universal Data Aggregation", "Flexible Billing Architecture" and "Multiple Data Store Authentication" to "Managing User to User Contact using Inferred Presence Detection", along with various paradigms for distributed cache-defeat detection, most of which form part of the push-technology services that manage networked smartphone applications, from instant messaging to email access. Other proposed systems for spectrum management and dynamic bandwidth allocation, such as policy alternatives to spectrum auctions that entail frequency hopping at the transmitter level, will invariably depend on a centralized automated intermediary with, in theory, transparent access to data flow. The role of routing intermediaries with specialized access poses many interesting questions for the policy issues that surround network privacy and security.

This brings us back to the seemingly comical reference this article makes to a mysterious entity named 'Maxwell's demon'. In a thought experiment proposed by James Clerk Maxwell, a chamber of gas molecules at equilibrium is divided into two halves by a 'door' controlled by the demon. The demon opens the door to let faster-than-average molecules pass to one side of the chamber while slower molecules end up on the other, so that one side heats up while the other gradually cools: a temperature difference established without doing any work, in apparent violation of the second law of thermodynamics. The parallel drawn in this article between networked switching intermediaries and Maxwell's demon does not go beyond this simple functional similarity.
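The demon's sorting trick can be simulated directly. The sketch below is a toy with arbitrary numbers (a Gaussian spread of speeds, the mean as the demon's threshold): by selectively admitting molecules, the demon drives the average speeds of the two halves apart, which is exactly the 'temperature difference' of the thought experiment.

```python
import random

def demon_sort(speeds, threshold):
    """Toy Maxwell's demon: molecules faster than the threshold are
    admitted to the 'hot' side of the chamber, the rest are left on
    the 'cold' side. Sorting alone creates the speed difference."""
    hot = [s for s in speeds if s > threshold]
    cold = [s for s in speeds if s <= threshold]
    return hot, cold

random.seed(0)
# arbitrary illustrative speeds for a chamber at equilibrium
speeds = [random.gauss(500, 100) for _ in range(1000)]
mean_speed = sum(speeds) / len(speeds)
hot, cold = demon_sort(speeds, mean_speed)
```

After the sort, the mean speed on the hot side exceeds the chamber average while the cold side falls below it: the demon has manufactured a gradient purely by deciding which molecules pass the door, much as a switching intermediary shapes traffic purely by deciding how to route it.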

For the ambitious reader, however, it may be interesting to note that ever since the invention of digital computers, scientists have actively pursued the paradox of Maxwell's demon to revisit the physical fundamentals governing information theory and information processing, analyzing the thermodynamic costs of elementary information manipulation in digital circuits: a study that probably engages Google constantly as it pumps water through steel tubes to cool its million servers.

We shall save all this for another day. But on yet another related note: every time, say, an email sent to an invalid address bounces back to your inbox from a "Mailer Daemon", let it be known that the "daemon" of operating-system terminology, an invisible background process over which the user has no control, in fact owes its etymology directly to the paradox of 'Maxwell's demon'.
