The Immortality of Bits

In 2015, it is safe to say that the weird problem-solving mechanisms of SETI@home and kitten-picture sharing have become normal problem-solving mechanisms for all domains.


Today it seems strange to not apply networked distributed computing involving both neurons and silicon to any complex problem. The term social media is now unnecessary: Even when there are no humans involved, problem-solving on this planet-scale computer almost necessarily involves social mechanisms. Whatever the mix of humans, software and robots involved, solutions tend to involve the same “social” design elements: real-time information streams, dynamically evolving patterns of trust, fluid identities, rapidly negotiated collaborations, unexpected emergent problem decompositions, efficiently allocated intelligence, and frictionless financial transactions.

Each time a problem is solved using these elements, the networked world is strengthened.

As a result of this new and self-reinforcing normal in problem-solving, the technological foundation of our planet is evolving with extraordinary rapidity. The process is a branching, continuous one rather than the staged, sequential process suggested by labels like Web 2.0 and Web 3.0,[1] which reflect an attempt to understand it in somewhat industrial terms. Some recently sprouted extensions and branches have already been identified and named: the Mobile Web, the Internet of Things (IoT), streaming media, Virtual Reality (VR), Augmented Reality (AR) and the blockchain. Others will no doubt emerge in profusion, further blurring the line between real and virtual.


Surprisingly, as a consequence of software eating the technology industry itself, the specifics of the hardware are not important in this evolution. Outside of the most demanding applications, data, code, and networking are all largely hardware-agnostic today.

The Internet Wayback Machine,[2] developed by Brewster Kahle and Bruce Gilliat in 1996, has already preserved a history of the web across a few generations of hardware. While such efforts can sometimes seem woefully inadequate with respect to pastoralist visions of history preservation, it is important to recognize the magnitude of the advance they represent over paper-based collective memories.

Crashing storage costs and continuously upgraded datacenter hardware allow corporations to indefinitely save all the data they generate. This is turning out to be cheaper than deciding what to do with it[3] in real time, resulting in the Big Data approach to business. At a personal level, cloud-based services like Dropbox make your personal data trivial to move across computers.

Most code today, unlike fifty years ago, is written in hardware-independent high-level programming languages rather than hardware-specific machine code. As a result of virtualization (technology that allows one piece of hardware to emulate another, a fringe technology until around 2000[4]), most cloud-based software runs within virtual machines and “code containers” rather than directly on hardware. Containerization in shipping drove nearly a seven-fold increase[5] in trade among industrialized nations over twenty years. Containerization of code is shaping up to have an even greater impact on the economics of software.
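As a concrete sketch of what containerization of code looks like in practice, consider a minimal, hypothetical container definition (the file name `hello.py` and the choice of base image are illustrative assumptions, not drawn from the text). The same definition builds and runs unchanged on a laptop, a datacenter server, or a cloud virtual machine:

```dockerfile
# Hypothetical minimal container definition (illustrative names).
# The hardware-agnostic base image bundles a complete runtime,
# so the code ships together with its environment and the host
# hardware becomes an irrelevant detail.
FROM python:3-slim
# The program travels inside the image.
COPY hello.py /app/hello.py
# The same command runs on any host with a container runtime.
CMD ["python", "/app/hello.py"]
```

Building this with `docker build` and running it with `docker run` produces identical behavior on any machine with a container runtime, which is the economic point: the software is decoupled from the hardware beneath it.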

Networks, too, are defined primarily in software today. It is not just extremely high-level networks, such as the transient, disposable ones defined by hashtags, that exist in software. Low-level networking software can also persist across generations of switching equipment and different kinds of physical links, such as telephone lines, optic fiber cables and satellite links. Thanks to the emerging technology of software-defined networking (SDN), functions that used to be performed by network hardware are increasingly performed by software.

In other words, we don’t just live on a networked planet. We live on a planet networked by software, a distinction that makes all the difference. The software-networked planet is an entity that can exist in a continuous and coherent way despite continuous hardware churn, just as we humans experience a persistent identity, even though almost every atom in our bodies gets swapped out every few years.

This is a profound development. We are used to thinking of atoms as enduring and bits as transient and ephemeral, but in fact the reverse is closer to the truth today.


The emerging planetary computer has the capacity to retain an evolving identity and memory across evolutionary epochs in hardware, both silicon and neural. Like money and writing, software is only dependent on hardware in the short term, not in the long term. Like the US dollar or the plays of Shakespeare, software and software-enabled networks can persist through changes in physical technology.

By contrast, it is challenging to preserve old hard technologies even in museums, let alone in working order as functional elements of society. When software eats hardware, however, we can physically or virtually recreate hardware as necessary, imbuing transient atoms with the permanence of bits.

For example, the Reuleaux collection of 19th-century engineering mechanisms, a priceless part of mechanical engineering heritage, is now available as a set of 3d-printable models from Cornell University[6] for students anywhere in the world to download, print and study. A higher-end example is NASA’s reverse engineering of 1970s-vintage Saturn V rocket engines.[7] The complex project used structured-light 3d scanning to reconstruct accurate computer models, which were then used to inform a modernized design. Such resurrection capabilities even extend to computing hardware itself. In 1997, using modern software tools, researchers at the University of Pennsylvania led by Jan Van der Spiegel recreated ENIAC, the first modern electronic computer, in the form of an 8mm-by-8mm chip.[8]

As a result of such capabilities, the very idea of hardware obsolescence is becoming obsolete. Rapid evolution does not preclude the persistence of the past in a world of digital abundance.

The potential in virtual and augmented reality is perhaps even higher, and it goes far beyond consumption devices like the Oculus Rift, Magic Leap, Microsoft HoloLens and the Leap Motion 3d sensor. The more exciting story is that production capabilities are being democratized. In the early decades of prohibitively expensive CGI and motion-capture technology, only big-budget Hollywood movies and video games could afford to create artificial realities. Today, with technologies like Microsoft’s Photosynth (which allows you to capture 3d imagery with smartphones), SketchUp (a powerful and free 3d modeling tool), 3d Warehouse (a public repository of 3d virtual objects), Unity (a powerful game-design tool) and 3d scanning apps such as Trimensional, it is becoming possible for anyone to create living historical records and inhabitable fictions in the form of virtual environments. The Star Trek “holodeck” is almost here: our realities can stay digitally alive long after they are gone in the physical world.

These are more than cool toys. They are soft technological capabilities of enormous political significance. Software can preserve the past in the form of detailed, relivable memories that go far beyond the written word. In 1964, only the “Big 3” network television crews had the ability to film the civil rights riots in America, making the establishment record of events the only one. A song inspired by the movement was appropriately titled The Revolution Will Not Be Televised. In 1991, a lone witness with a personal camcorder videotaped the tragic beating of Rodney King, footage that eventually triggered the Los Angeles riots.

Fast-forward to 2014: smartphones were capturing at least fragments of nearly every important development surrounding the death of Michael Brown in Ferguson, and thousands of video cameras were being deployed to challenge the perspectives offered by the major television channels. In a rare display of consensus, civil libertarians on both the right and the left began demanding that all police officers and cars be equipped with cameras that cannot be turned off. Around the same time, the director of the FBI was reduced to conducting a media roadshow in an attempt to stall the spread of cryptographic technologies capable of limiting government surveillance.

In just a year after the revelations of widespread surveillance by the NSA, the tables were already being turned.

It is only a matter of time before all participants in every event of importance will be able to record and share their experiences from their perspective as comprehensively as they want. These can then turn into collective, relivable, 3d memories that are much harder for any one party to manipulate in bad faith. History need no longer be written by past victors.

Even authoritarian states are finding that surveillance capabilities cut both ways in the networked world. During the 2014 #Occupy protests in Hong Kong for instance, drone imagery allowed news agencies to make independent estimates of crowd sizes,[9] limiting the ability of the government to spin the story as a minor protest. Software was being used to record history from the air, even as it was being used to drive the action on the ground.

When software eats history this way, as it is happening, the ability to forget[10] becomes a more important political, economic and cultural concern than the ability to remember.

When bits begin to dominate atoms, it no longer makes sense to think of virtual and physical worlds as separate, detached spheres of human existence. It no longer makes sense to think of machine and human spheres as distinct non-social and social spaces. When software eats the world, “social media,” including both human and machine elements, becomes the entire Internet. “The Internet” in turn becomes the entire world. And in this fusion of digital and physical, it is the digital that dominates.

The fallacious idea that the online world is separate from and subservient to the offline world (an idea called digital dualism, the basis for entertaining but deeply misleading movies such as Tron and The Matrix) yields to an understanding of the Internet as an alternative basis for experiencing all reality, including the old basis: geography.

Science fiction writer Bruce Sterling captured the idea of bits dominating atoms with his notion of “spimes” — enduring digital master objects that can be flexibly realized in different physical forms as the need arises. A book, for instance, is a spime rather than a paper object today, existing as a master digital copy that can evolve indefinitely, and persist beyond specific physical copies.


At a more abstract level, the idea of a “journey” becomes a spime that can be flexibly realized in many ways, through specific physical vehicles or telepresence technologies. A “television news show” becomes an abstract spime that might be realized through the medium of a regular television crew filming on location, an ordinary citizen livestreaming events she is witnessing, drone footage, or official surveillance footage obtained by activist hackers.

Spimes in fact capture the essential spirit of bricolage: turning ideas into reality using whatever is freely or cheaply available, instead of through dedicated resources controlled by authoritarian entities. This capability highlights the economic significance of bits dominating atoms. When the value of a physical resource is a function of how openly and intelligently it can be shared and used in conjunction with software, it becomes less contentious. In a world organized by atoms-over-bits logic, most resources are by definition what economists call rivalrous: if I have it, you don’t. Such captive resources are limited by the imagination and goals of one party. An example is a slice of the electromagnetic spectrum reserved for a television channel. Resources made intelligently open to all, on the other hand, such as Twitter, are limited only by collective technical ingenuity. The rivalrousness of goods becomes a function of the amount of software and imagination used to leverage them, individually or collectively.

When software eats the economy, the so-called “sharing economy” becomes the entire economy, and renting, rather than ownership, becomes the default logic driving consumption.

The fact that all this follows from “social” problem-solving mechanisms suggests that the very meaning of the word has changed. As sociologist Bruno Latour has argued, “social” is now about more than the human. It includes ideas and objects flexibly networked through software. Instead of being an externally injected alien element, technology and innovation become part of the definition of what it means to be social.

What we are living through today is a hardware and software upgrade for all of civilization. It is, in principle, no different from buying a new smartphone and moving music, photos, files and contacts to it. And like a new smartphone, our new planet-scale hardware comes with powerful but disorienting new capabilities that test our ability to adapt.

And of all the ways we are adapting, the single most important one is the adaptation in our problem-solving behaviors.

This is the second major subplot in our Tale of Two Computers. Wherever bits begin to dominate atoms, we solve problems differently. Instead of defining and pursuing goals, we create and exploit luck.


[1] The temptation to understand the evolution of computing in terms of discrete stages dates back to the idea of generations in computing. The vacuum tube, mainframe, minicomputer and personal computer eras are usually identified as the first four generations. The scheme fell apart with the failure of the Japanese “fifth-generation” computing effort, devoted to AI, and the rise of networking as the sine qua non of computing.

[2] As of this writing, the archive contains over 435 billion webpages.

[3] This definition of Big Data is due to George Dyson.

[4] In 1999, VMware introduced the first successful virtualization of the x86 processor, which powers most laptops and servers. This paved the way for cloud computing. Today, nearly all cloud software is “containerized” to run either on virtual machines that emulate raw hardware, or in more specialized and lightweight containers such as Docker’s. Virtualization is now so advanced that the x86 processor can be emulated within a browser. Leading-edge technologies like the Bromium microvisor today allow virtual machines to be created instantly just to run a single command. Virtualization technology isn’t just of historical interest for preserving hardware history; it is a mission-critical part of keeping software evolving smoothly.

[5] Daniel M. Bernhofen et al., Estimating the Effects of the Container Revolution on World Trade, February 2013, CESifo Working Paper Series No. 4136.

[6] Cornell KMODDL Collection

[7] How NASA Brought the Monstrous F-1 Moon Rocket Back to Life, Ars Technica, 2013.

[8] ENIAC on a Chip, PennPrintout, 1996.

[9] Drone Footage Reveals Massive Scale of Hong Kong Protests, Mashable, 2014.

[10] The EU and Argentina, for instance, have right to be forgotten laws.