GLUT: Mastering Information Through the Ages

Alex Wright

Introduction

Ever since the Web first started to flicker across the world's computer screens, we have seen a bull market in hyperbole about the digital age. Visiting San Francisco at the height of the 1990s dot-com boom, Tom Wolfe noted the particular brand of euphoria then sweeping the city. Wolfe, who made his journalistic bones chronicling the psychedelic raptures of the city’s 1960s pranksters, spotted a similar strain of quasi-mystical fervor taking hold among the young acolytes of the digital revolution. “Much of the sublime lift came from something loftier than overnight IPO billions,” he wrote, “something verging on the spiritual.” Enthusiastic dot-commers “were doing more than simply developing computers and creating a new wonder medium, the Internet. Far more. The Force was with them. They were spinning a seamless web over all the earth.”

In the Day-Glo pages of Wired and a host of also-ran New Economy magazines, the so-called digerati were pumping a rhetorical bubble no less inflated than the era’s IPO-fueled stock prices. Writer Steven Johnson compared the dawning age of software to a religious awakening, predicting that “the visual metaphors of interface design will eventually acquire a richness and profundity that rival those of Hinduism or Christianity.” Elsewhere, supercomputer pioneer Danny Hillis argued that the advent of the World Wide Web signaled an evolutionary event on par with the emergence of a new species: “We’re taking off,” he wrote. “We are not evolution’s ultimate product. There’s something coming after us, and I imagine it is something wonderful. But we may never be able to comprehend it, any more than a caterpillar can imagine turning into a butterfly.” More recently, inventor and futurist Ray Kurzweil has gone so far as to suggest that we are undergoing a “technological change so rapid and profound it represents a rupture in the fabric of human history,” an event so momentous that it will trigger “the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light.” Could the arhats themselves have painted a more dazzling picture of enlightenment?

Mystical beliefs about technology are nothing new, of course. In 1938 H.G. Wells predicted that “the whole human memory can be, and probably in a short time will be, made accessible to every individual,” forming a so-called World Brain that would eventually give birth to a “widespread world intelligence conscious of itself.” Similar visions of an emerging planetary intelligence surfaced in the mid-twentieth-century writings of the Catholic mystic Teilhard de Chardin, who foresaw the rise of an “extraordinary network of radio and television communication which already links us all in a sort of ‘etherised’ human consciousness.” He also anticipated the significance of “those astonishing electronic computers which enhance the ‘speed of thought’ and pave the way for a revolution,” a development he felt sure would spur the growth of a new “nervous system for humanity” that would ultimately coalesce into “a single, organized, unbroken membrane over the earth.” Teilhard believed that this burgeoning networked consciousness signaled a new stage in God’s evolutionary plan, in which human beings would come together in a new kind of social organism, complete with a nervous system and brain that would eventually spring to life of its own accord. Teilhard never published these writings during his lifetime (the Catholic Church forbade him from doing so), but his essays found an enthusiastic following among Catholic intellectuals like Marshall McLuhan, who took Teilhard’s vision as a starting point for formulating his theory of the global village.

Today, the torch of technological transcendentalism has passed from the visionary fringe into the cultural mainstream. Scarcely a day goes by without some hopeful dispatch about new Web applications, digital libraries, or munificent techno-capitalists spending billions to wire the developing world. Some apostles of digitization argue that the expanding global network will do more than just improve people’s lives; it will change the shape of human knowledge itself. Digital texts will supplant physical ones, books will mingle with blogs, and fusty old library catalogs will give way to the liberating pixie dust of Google searches. As the network frees information from its old physical shackles, people the world over will join in a technological great awakening.

Amid this gusher of cyber-optimism, a few dissidents have warned of the dark side of digitization: our fracturing attention spans, the threats to personal privacy, and the risks of creeping groupthink in a relentlessly networked world. “We may even now be in the first stages of a process of social collectivization that will over time all but vanquish the ideal of the isolated individual,” writes critic Sven Birkerts. In this dystopian view, the rise of digital media marks an era of information overload in which our shared cultural reference points will dissolve into a rising tide of digital cruft.

For all the barrels of ink and billions of pixels spent chronicling the rise of the Internet in recent years, surprisingly few writers seem disposed to look in any direction but forward. “Computer theory is currently so successful,” writes philosopher-programmer Werner Künzel, “that it has no use for its own history.” This relentless fixation on the future may have something to do with the inherent “forwardness” of computers, powered as they are by the logics of linear progression and lateral sequencing. The computer creates a teleology of forward progress that, as Birkerts puts it, “works against historical perception.”

My aim in writing this book is to resist the tug of mystical techno-futurism and approach the story of the information age by looking squarely backward. This is a story we are only beginning to understand. Like the narrator of Edwin Abbott’s Flatland, a two-dimensional creature who wakes up one day to find himself living in a three-dimensional world, we are just starting to recognize the contours of a broader information ecology that has always surrounded us. Just as human beings had no concept of oral culture until they learned how to write, so the arrival of digital culture has given us a reference point for understanding the analog age. As McLuhan put it, “one thing about which fish are completely unaware is the water, since they have no anti-environment that would allow them to perceive the element they swim in.” From the vantage point of the digital age, we can see the history of information in a new light. To do so, however, requires stepping outside traditional disciplinary constructs in search of a new storyline.

In these pages I traverse a number of topics not usually brought together in one volume: evolutionary biology, cultural anthropology, mythology, monasticism, the history of printing, the scientific method, eighteenth-century taxonomies, Victorian librarianship, and the early history of computers, to name a few. No writer could ever hope to master all of these subjects. I am indebted to the many scholars whose work I have relied on in the course of researching this book. Whatever truth this book contains belongs to them; the mistakes are mine alone.

I am keenly aware of the possible objections to a book like this one. Academic historians tend to look skeptically at “meta” histories that go in search of long-term cultural trajectories. As a generalist, I knowingly run the risks of intellectual hubris, caprice, and dilettantism. But I have done my homework, and I expect to be judged by scholarly standards. This work is nonetheless fated to incompleteness. Like an ancient cartographer trying to draw a map of distant lands, I have probably made errors of omission and commission; I may have missed whole continents. But even the most egregious mistakes have their place in the process of discovery. And perhaps I can take a little solace in knowing that Linnaeus, the father of modern taxonomy, was a devout believer in unicorns.
