A Cloudy History: Four Histories of Cloud Computing

Part 2 of Planetary-Scale Computation: An industry primer on the hyperscale CSP oligopoly (AWS/Azure/GCP):

  1. Let’s Get Physical, (Cyber)Physical!: Flows of Atoms, Flows of Electrons

  2. A Cloudy History: Four Histories of Cloud Computing

  3. Primer on the Economics of Cloud Computing

  4. Three-Body: Competitive Dynamics in the Hyperscale Oligopoly

    1. Initial Positions and Laws of [Competitive] Motion

    2. Mass and the Law of [Economic] Gravitation

    3. Velocity and the n-body problem

  5. The Telos of Planetary-Scale Computation: Ongoing and Future Developments


A Cloudy History

  1. A Brief History of Cloud Computing (the marketing term)

  2. A Brief History of Cloud Computing (the idea)

  3. A Brief History of Cloud Computing (the business model)

  4. A Brief History of Cloud Computing (as we know it today)

Jean Jennings, Marlyn Wescoff, and Ruth Lichterman with the ENIAC computer, 1946.

The gap between the physical reality of the cloud, and what we can see of it, between the idea of the cloud and the name that we give it — “cloud” — is a rich site for analysis. While consumers typically imagine “the cloud” as a new digital technology that arrived in 2010–2011, with the introduction of products such as iCloud or Amazon Cloud Player, perhaps the most surprising thing about the cloud is how old it is. Seb Franklin has identified a 1922 design for predicting weather using a grid of “computers” (i.e., human mathematicians) connected by telegraphs. AT&T launched the “electronic ‘skyway’” — a series of microwave relay stations — in 1951, in conjunction with the first cross-country television network. And engineers at least as early as 1970 used the symbol of a cloud to represent any unspecifiable or unpredictable network, whether telephone network or Internet.

Tung-Hui Hu, A Prehistory of the Cloud


A Brief History of Cloud Computing (the marketing term)

The history of the Cloud is an unclear one ... cloudy, even (hahaha ok sorry). An article published by MIT Technology Review in 2011, titled Who Coined ‘Cloud Computing’?, traced the coinage of the term “cloud computing” back to a May 1997 USPTO trademark application and, by contacting the founder of the now-defunct startup that applied for the trademark (NetCentric), unearthed the story of the first known mention of the now-ubiquitous phrase. The startup founder, Sean O’Sullivan, was in negotiations with Compaq regarding a potential $5 million investment in NetCentric, whose software platform would enable ISPs “to implement and bill for dozens, and ultimately thousands, of ‘cloud computing-enabled applications,’” according to the business plan.

In their plans, the duo predicted technology trends that would take more than a decade to unfold. Copies of NetCentric’s business plan contain an imaginary bill for “the total e-purchases” of one “George Favaloro,” including $18.50 for 37 minutes of video conferencing and $4.95 for 253 megabytes of Internet storage (as well as $3.95 to view a Mike Tyson fight).
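To make the utility-style pricing concrete, here is a toy metered-billing calculation in Python that reproduces the imaginary bill’s line items. The unit rates are back-derived assumptions (e.g., $18.50 for 37 minutes implies $0.50 per minute of video conferencing), not prices documented in the plan.

```python
# Toy pay-per-use billing in the spirit of NetCentric's imaginary bill.
# All unit rates are assumptions back-derived from the bill's line items.

RATES = {
    "video conferencing (min)": 18.50 / 37,   # implies $0.50 per minute
    "Internet storage (MB)":    4.95 / 253,   # implies ~$0.02 per megabyte
    "pay-per-view event":       3.95,         # assumed flat fee per viewing
}

usage = {
    "video conferencing (min)": 37,
    "Internet storage (MB)":    253,
    "pay-per-view event":       1,
}

total = 0.0
for item, qty in usage.items():
    charge = RATES[item] * qty
    total += charge
    print(f"{item:26s} x {qty:4}   ${charge:6.2f}")
print(f"{'total e-purchases':26s}          ${total:6.2f}")  # $27.40 for these three items
```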

George Favaloro was a Compaq marketing executive who “had recently been chosen to lead a new Internet services group” at Compaq. Favaloro’s internal memo at Compaq, titled Internet Solutions Division Strategy for Cloud Computing, is dated November 14, 1996, and is ostensibly the earliest known mention of the phrase “cloud computing.”

“The emergence of the Internet is driving the migration of communication and collaboration applications into the Internet “cloud” (e.g., telephony, fax).”

While there’s some uncertainty about which of the two men actually originated the phrase, “Both agree that ‘cloud computing’ was born as a marketing term.”

Ten years after O’Sullivan and Favaloro’s meetings in Compaq’s Houston office in 1996, Eric Schmidt (then CEO of Google) would make the first public mention of “cloud” and “cloud computing” in a modern, still-relevant context (as in, not “telephony, fax”) at a 2006 industry conference:

Eric: What's interesting [now] is that there is an emergent new model, and you all are here because you are part of that new model. I don't think people have really understood how big this opportunity really is. It starts with the premise that the data services and architecture should be on servers. We call it cloud computing – they should be in a "cloud" somewhere. And that if you have the right kind of browser or the right kind of access, it doesn't matter whether you have a PC or a Mac or a mobile phone or a BlackBerry or what have you – or new devices still to be developed – you can get access to the cloud. There are a number of companies that have benefited from that. Obviously, Google, Yahoo!, eBay, Amazon come to mind. The computation and the data and so forth are in the servers.

...

Eric: And so what's interesting is that the two – "cloud computing" and advertising – go hand-in-hand. There is a new business model that's funding all of the software innovation to allow people to have platform choice, client choice, data architectures that are interesting, solutions that are new – and that's being driven by advertising.

...

Eric: I think, if you think about it, all of the companies in the search space are benefiting from this conversion I was talking about earlier, to this new cloud model where people are living more and more online.

Despite these comments being the first well-known, modern uses of the term, it should be noted that Amazon Web Services, “Launched in July 2002”, had been in existence for a little more than four years prior to Schmidt’s interview, although it was only in 2006 that AWS launched S3 and EC2 (in March and August, respectively). A fuller account of seminal product releases in Cloud Computing can be found on Wikipedia.


A Brief History of Cloud Computing (the idea)

Pictured here: Two faithful attendants of a proto-Multivac. (Silicon Valley)

Alexander Adell and Bertram Lupov were two of the faithful attendants of Multivac. As well as any human beings could, they knew what lay behind the cold, clicking, flashing face — miles and miles of face — of that giant computer. They had at least a vague notion of the general plan of relays and circuits that had long since grown past the point where any single human could possibly have a firm grasp of the whole.

Multivac was self-adjusting and self-correcting. It had to be, for nothing human could adjust and correct it quickly enough or even adequately enough — so Adell and Lupov attended the monstrous giant only lightly and superficially, yet as well as any men could. They fed it data, adjusted questions to its needs and translated the answers that were issued. Certainly they, and all others like them, were fully entitled to share in the glory that was Multivac's.

Isaac Asimov, The Last Question (1956)

While the marketing phrase “cloud computing” ostensibly originated in Compaq’s offices in 1996, the idea of what we would recognize to be modern-day cloud computing goes back much further, to at least 1956. Isaac Asimov’s prophetic articulation of a lineage of supercomputers in his short stories (most notably in his personal favorite story, The Last Question, first published in 1956) includes Multivac, a fictional supercomputer inspired by an actual general-purpose computer called UNIVAC and one of the earliest fictional conceptions (if not the earliest) of what can be recognized as a contemporary data center. While Asimov’s description of Multivac changes throughout the various stories in which the fictional supercomputer is featured, his description of a vast and inscrutable “self-adjusting and self-correcting” “giant computer” spanning “miles and miles” that “had long since grown past the point where any single human could possibly have a firm grasp of the whole” perfectly describes the modern data center. Asimov’s descriptions of Multivac in his other short stories flesh out the cloud-like nature of his fictional computer.

From Franchise (1955):

However, we are plugged into Multivac right here by beam transmission. What Multivac says can be interpreted here and what we say is beamed directly to Multivac, so in a sense we’re in its presence.

From All the Troubles of the World (1958):

Within reach of every human being was a Multivac station with circuits into which he could freely enter his own problems and questions without control or hindrance, and from which, in a matter of minutes, he could receive answers.

Of the early conceptions of fictional computers in literature, Asimov’s Multivac seems to be the clearest case of science fiction manifesting into reality, a tendency that shows no sign of stopping in our age of technological acceleration.

The idea of a centralized computing resource accessed at a distance by multiple parties via terminal (i.e., “typewriters”, “remote consoles”) was finding grounding in reality around the same time as Asimov’s Multivac stories were being published. John McCarthy, a co-author of the document that coined the term “artificial intelligence” and dubbed one of the founding fathers of the field, was perhaps the first to publicly suggest the idea of utility computing, in a speech given to celebrate MIT’s centennial: “computer time-sharing technology might result in a future in which computing power and even specific applications could be sold through the utility business model (like water or electricity)” (Wikipedia).

McCarthy’s 1961 centennial lecture for MIT was titled Time-Sharing Computer Systems and was published in Management and the Computer of the Future (1962) (pp. 220–248):

McCarthy: I am going to discuss the important trend in computer design toward time-sharing computer systems. By a time-sharing computer I shall mean one that interacts with many simultaneous users through a number of remote consoles. Such a system will look to each user like a large private computer.

Time Sharing

McCarthy: I should like to go on now to consider how the private computer can be achieved. It is done by time sharing a large computer. Each user has a console that is connected to the computer by a wired channel such as a telephone line. The consoles are of two kinds, one cheap and the other better but more expensive. The cheap console is simply an electric typewriter that is used for both input and output.

McCarthy’s contributions to the development of the concepts of utility computing and time-sharing computer systems helped give rise to the computer time-share industry, the much-neglected precursor to the modern cloud computing industry.


A Brief History of Cloud Computing (the business model)

By focusing on the time-shared user as an economic subject, we can understand many of the attitudes that structure present-day digital culture. For the irony is that though the word “time-sharing” went out of fashion with the advent of mini- and personal computers in the 1980s, the very same ideas have morphed into what seems to be the most modern of computing concepts: cloud computing. In cloud computing, time on expensive servers (whether storage space, computational power, software applications, and so on) can be rented as a service or utility, rather than paid for up front.

Tung-Hui Hu, A Prehistory of the Cloud

Computer time-sharing refers to “the sharing of a computing resource among many users at the same time by means of multiprogramming and multi-tasking.” Prior to the rise of personal computers (called “home computers” at the time) in the 1980s, time-sharing was the predominant computing model because it spread the cost of expensive mainframe computers across multiple terminal users who typically interacted with the system on an intermittent basis (i.e., sit at a terminal, compute some stuff, think and write for a few minutes, then compute some more stuff). Users literally shared the compute-time of a central CPU, which would allocate computing resources across active users on the network (the network was usually a university campus or a corporate office).

From geeksforgeeks:

A time-shared operating system uses CPU scheduling and multi-programming to provide each user with a small portion of a shared computer at once. Each user has at least one separate program in memory. When a program is loaded into memory, it executes for a short period of time, either before completion or to complete I/O. This short period of time during which a user gets the attention of the CPU is known as a time slice, time slot, or quantum. It is typically on the order of 10 to 100 milliseconds.
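The quantum is the whole trick: if slices are short relative to human reaction time, every user at a terminal experiences the machine as their own. Here is a minimal round-robin scheduling sketch in Python to make the mechanism concrete; the user names and burst times are invented for illustration, and a real time-sharing OS adds interrupts, memory protection, and I/O handling that this toy omits.

```python
# Toy round-robin time-sharing: one CPU, many users, fixed time slices.
from collections import deque

QUANTUM_MS = 100  # a time slice "of the order of 10 to 100 milliseconds"

def round_robin(jobs):
    """jobs: list of (user, remaining_ms) pairs. Returns an execution trace."""
    queue = deque(jobs)
    clock, trace = 0, []
    while queue:
        user, remaining = queue.popleft()
        slice_ms = min(QUANTUM_MS, remaining)   # run one quantum (or less)
        clock += slice_ms
        trace.append((clock, user, slice_ms))
        if remaining > slice_ms:                # unfinished? back of the line
            queue.append((user, remaining - slice_ms))
    return trace

# Three terminal users sharing one CPU; the interleaving is fast enough
# that each user feels like "the sole person on the machine."
for t, user, ran in round_robin([("adell", 250), ("lupov", 120), ("mccarthy", 300)]):
    print(f"t={t:4d}ms  {user} ran for {ran}ms")
```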

From Economic Perspectives on the History of the Computer Time-Sharing Industry, 1965–1985 by Martin Campbell-Kelly, Daniel D. Garcia-Swartz:

Time-sharing developed in the mainframe era. A time-sharing system consisted of a large central computer to which many terminals were connected. One terminal served one user, providing a computing experience comparable to an early personal computer, at least 15 years before PCs were routinely available. At the heart of time-sharing was an operating system that divided the computer’s resources among users, so that each user had the illusion that he or she was the sole person on the machine. The market for time-sharing existed because it was the only means at that time of providing a personal computing experience at a reasonable cost.

The first, experimental time-sharing system — the Compatible Time Sharing System — was demonstrated at the Massachusetts Institute of Technology in November 1961.

Compatible Time-Sharing System (CTSS) was the first functioning time-sharing system and was demonstrated the same year as MIT’s centennial in 1961 — CTSS was born of a project that McCarthy himself initiated at MIT in 1959. He predicted the rise of the time-share industry based on his idea of computing as a public utility in the way that the telephone system was (and still is) a public utility.

From Computing as a Public Utility:

McCarthy: In concluding I should like to say a word on management and the computer of the future. At present, computers are bought by individual companies or other institutions and are used only by the owning institution. If computers of the kind I have advocated become the computers of the future, then computation may someday be organized as a public utility, just as the telephone system is a public utility. We can envisage computing service companies whose subscribers are connected to them by telephone lines. Each subscriber needs to pay only for the capacity that he actually uses, but he has access to all programming languages characteristic of a very large system.

...

McCarthy: The computing utility could become the basis for a new and important industry.

For two decades from the ’60s through the ’80s, the commercial computer time-sharing industry rapidly rose as the time-share architecture model became popularized, before rapidly falling into obscure obsolescence as Moore’s Law drove down the cost and drove up the performance of semiconductors, enabling the emergence of smaller, more convenient, and more affordable PCs. While the existence of the computer time-share industry isn’t well-known and its connection to modern-day cloud computing (itself a nebulous ... even cloudy ... topic for most people) is seldom referenced, there’s a strong argument that this industry was the original Cloud industry.

From The NIST Definition of Cloud Computing (Sep 2011):

“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

Commercial time-sharing services enabled “ubiquitous, convenient, on-demand network access” to “configurable computing resources” to the extent that what was made available by the time-sharing industry was considered ubiquitous, convenient, and configurable at the time — it was, and still is, a matter of degree. People were accessing big computers miles and miles away and paying for provisioned resources. Sounds like Cloud to me.

Again, from Economic Perspectives on the History of the Computer Time-Sharing Industry, 1965–1985:

Commercial timesharing services developed as part of a larger phenomenon, the so-called data processing service industry. This industry had several components. First, there was the industry’s so-called batch data processing component. Batch data processing services had been around roughly since 1955—companies received raw data from customers via mail or messenger, processed the data according to the customers’ requests, and then delivered the processed data through the same channels. Second, there was the industry’s online component. It developed rapidly in the 1960s in parallel with the progress of computer and communication technologies—here customers achieved access to computing power via communication lines and terminals rather than via mail and messenger. The remaining components of the data processing services industry included software (both programming services and products) and facilities management. Here, we are primarily concerned with the time-sharing component of the industry’s online services sector.

Cloud-based SaaS, anyone? Although Cloud Computing (the marketing term) didn’t exist until 1996, Cloud Computing (the business model) clearly existed as far back as the 1960s, even if it wasn’t called that at the time.


A Brief History of Cloud Computing (as we know it today)

From History of the cloud by Blesson Varghese

From Stanford Engineering, Amazon Enters the Cloud Computing Business (2008)

Launched in July 2002, Amazon Web Services (AWS) allowed developers to outsource their online and application infrastructure needs at commodity prices. AWS included the following services:

  • Alexa Web Information Service: web information service (acquired in 1999)

  • Mechanical Turk: dividing work into many tasks for humans (2005)

  • Elastic Compute Cloud: computing platform (2006)

  • Simple Storage Service: storage platform (2006)

  • Simple Queue Service: web service for storing and queuing messages across the Internet (2007)

  • Flexible Payments Service: online payment platform (2007)

  • Simple DB: web service for running queries on structured data in real time (2007)

  • Persistent Storage: allows developers to earmark a storage volume online for people to save files in different file systems (2008)

In contrast to the previous three brief “histories” of cloud computing, the history of cloud computing as we know it today is much more well-trodden territory, by virtue of its relative recency and because the industry’s emergence enabled the rise of an Internet that accelerated the pace of the creation, collection, and organization of data. The most concise history of modern cloud computing goes something like this: Amazon started selling compute (EC2) and storage (S3) services in 2006 to turn an underutilized balance sheet item (servers) into a revenue line item, Microsoft and Google launched competitors in 2008, and they’ve been competing ever since. Stanford Engineering’s 2008 AWS case study gives an account of Amazon’s entry into the cloud computing business from a perspective that is useful given the uncertainty of AWS’s success at the time of the case’s publication, and both Blesson Varghese’s History of the Cloud and Wikipedia’s history section for Cloud Computing give comprehensive and detailed timelines that I would add no value to by simply refactoring and repeating.
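For a sense of how little ceremony “renting” compute and storage involves today, here is a minimal sketch using boto3, the AWS SDK for Python. The bucket name and AMI ID are hypothetical placeholders; actually running this requires AWS credentials and would incur real charges, so treat it as a sketch rather than a deployment guide.

```python
# Minimal sketch: renting storage (S3) and compute (EC2) from AWS via boto3.
# Bucket name and AMI ID below are hypothetical placeholders.
import boto3

# S3: pay-per-GB object storage (launched March 2006)
s3 = boto3.client("s3")
s3.create_bucket(Bucket="my-hypothetical-bucket")  # outside us-east-1, a
# CreateBucketConfiguration with a LocationConstraint is also required
s3.put_object(Bucket="my-hypothetical-bucket", Key="hello.txt", Body=b"hello, cloud")

# EC2: pay-per-hour virtual servers (launched August 2006)
ec2 = boto3.client("ec2")
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
```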

Writing about the “history” of modern cloud computing in 2022 would be like writing about the history of oil in 1890 — some pieces are in place, sure, but a lot is about to happen (this is not to imply that any of the hyperscalers are under serious threat of antitrust action, despite recent rumblings at the FTC). Like oil, cloud computing has the potential to fundamentally transform the world as we know it. However, unlike oil, an energy source whose obsolescence is a question not of if but when, there is no conceivable future in which cloud computing ceases to exist barring humanity-wide catastrophic crises (nuclear war, big ass meteor, etc.). Ask most anyone on Earth to articulate their conception of humanity in one hundred, one thousand, or one hundred thousand years, and in nearly all non-extinction-based formulations of the future there will be better, faster, and more networked computers. Regardless of whether Amazon, Microsoft, or Google exist as corporate entities in 1,000 years, or whether corporations themselves cease to exist as an institutional form by that point, if humanity is a technologically advanced society in 3022 then cloud computing will exist in some [ostensibly more advanced] form.

Cloud computing’s seemingly inexorable nature is reminiscent of Mark Fisher’s concept of “cybernetic realism”, the older (Fisher coined cybernetic realism in his 1999 PhD thesis, “Flatline Constructs: Gothic Materialism and Cybernetic Theory-Fiction”) and lesser-known cousin of Fisher’s concept of “capitalist realism.” Fisher described capitalist realism as “the widespread sense that not only is capitalism the only viable political and economic system, but also that it is now impossible even to imagine a coherent alternative to it” — I submit that “cloud computing realism” might be described as the widespread sense that not only is cloud computing the only viable infrastructural technology for organizing any future political and economic systems (from American democratic capitalism to Chinese market communism), but also that it is now impossible even to imagine a coherent alternative to it. Even Ursula Le Guin’s novel articulation of an anti-capitalist, anarchic society through her exposition of Anarres in The Dispossessed (1974) featured “computers that coordinated the administration of things, the division of labor, and the distribution of goods, and the central federatives of most of the work syndicates” (reminiscent of Chile’s actual Cybersyn project, undertaken from 1971 to 1973) in what is essentially an unnamed cloud computing system — cloud computing realism subsumes capitalist realism. [read Fisher’s thoughts on Le Guin’s novel in an archived k-punk blog post (the blog itself is hosted through an obscure cloud services company called New Dream Network, LLC)]

While my embryonic articulation of “cloud computing realism” is wholly distinct from Fisher’s “cybernetic realism”, it will in fact be cloud computing (which itself has origins in science-fiction, as we’ve explored) that brings Fisher’s cybernetic realism into reality.

From Fisher’s Flatline Constructs (1999):

If Baudrillard’s theory-fictions of the three orders of simulacra must be taken seriously, which means: as realism about the hyperreal, or “cybernetic realism”, it is because they have realised that, in capitalism, fiction is no longer merely representational but has invaded the Real to the point of constituting it.

And in a quote from Nvidia CEO Jensen Huang that would’ve made even Jean Baudrillard blush — from Nvidia’s GTC November 2021 Keynote:

[51:45] Jensen: Companies can build virtual factories and operate them with virtual robots in Omniverse. The virtual factories and robots are the digital twins of their physical replica. The physical version is the replica of the digital, since they're produced from the digital original.

As of their latest quarter, Nvidia generated a little more than 40% of their revenues from selling products and solutions to data centers, with around a 50/50 split between hyperscale customers and enterprise customers. Fisher’s cybernetic realism therefore goes hand-in-hand with cloud computing realism, M.C. Escher style.

M.C. Escher, Drawing Hands (1948)

While the bulk of the remainder of this five-part primer will be grounded in the materiality, economics, and competitive dynamics of the cloud computing industry, touching upon history only insofar as it provides context for a business- and strategy-oriented perspective, my hope is that the reader keeps these four histories in mind with the recognition that the fourth history is currently being written and developed. The history of cloud computing, as we know it today, is just beginning.


Resources

[N/A] Wikipedia: Cloud Computing [History]

[1962] John McCarthy: Time-Sharing Computer Systems

[1967] Paul Baran: The Future Computer Utility

[Aug ‘06] Search Engine Strategies Conference: Conversation with Eric Schmidt hosted by Danny Sullivan

[Mar ‘08] IEEE: Economic Perspectives on the History of the Computer Time-Sharing Industry, 1965–1985

[Apr ‘08] Cloud Computing. Available at Amazon.com Today

[May ‘08] Stanford Engineering: Amazon Enters the Cloud Computing Business

[Apr ‘09] Edge: Lord of the Cloud

[Jul ‘11] Dr. Rao Nemani: The Journey from Computer Time-Sharing to Cloud Computing: A Literature Review

[Oct ‘11] MIT Technology Review: Who Coined ‘Cloud Computing’?

[Aug ‘15] Tung-Hui Hu: A Prehistory of the Cloud

[Mar ‘19] Blesson Varghese: History of the Cloud

[Nov ‘19] ispsystem: A brief history of virtualization, or why do we divide something at all

[Nov ‘20] Jerry Chen: The Evolution of Cloud

