Q&A

Info warrior

In a new book, James Gleick traces the epic rise of an idea called 'information'

By Joshua Rothman
March 6, 2011

We live in an information age, characterized by high speed and high technology. That doesn’t mean, though, that the information age can be defined in terms of technology alone. In his new book, “The Information,” James Gleick argues that it’s actually the idea of information itself that matters most. Gleick tells the story of how the diverse “raw material” of the pre-information age — “letters and messages, sounds and images, news and instructions, figures and facts, signals and signs, a hodgepodge of related species” — came to be re-imagined as one thing: a conceptually integrated stream of “information.” The revolution isn’t about the media (Web browsers, televisions, telephones, or telegrams), but about the way that messages, no matter what they are, can be encoded as information.

Gleick’s account stretches across millennia, starting with the invention of writing, moving on to forms of long-distance communication (the African talking drum, the telegraph), and then arriving at the invention of the computer and the networked technologies that shape our lives today. Along the way he introduces and explains the work of thinkers, inventors, and visionaries, some familiar, some less so, from Aristotle to Charles Babbage to Alan Turing. Babbage, a Victorian mathematician and inventor, devised plans for a steam-driven computer made entirely of gears, pistons, and other mechanical parts; the machine, which he called the “Analytical Engine,” was too complicated to build, but with the help of Lord Byron’s mathematically gifted daughter Ada, Countess of Lovelace, Babbage figured out how to program it anyway. Turing, the English mathematician who helped break the German Enigma cipher during World War II, worked out the mathematical structure that underlies our modern computers; in a formal sense, all computers today are “Turing machines.”

At the book’s center, though, is the American scientist Claude Shannon. As a young engineer working at Bell Labs in the 1940s, Shannon invented information theory. He showed how any kind of information, regardless of its meaning or form, could be understood in mathematical terms as a collection of “bits” (short for “binary digits”) to be encoded, compressed, transmitted, and decoded. This was the crucial insight that united mathematics, engineering, even biology and physics under the banner of “information.”
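By way of illustration (a minimal sketch in Python, not drawn from the book): Shannon’s entropy formula, H = -Σ p·log2 p, gives the average number of bits per symbol a message actually carries, which is what “counted in bits” means in practice.

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy of the message's symbol distribution, in bits per symbol."""
    counts = Counter(message)
    total = len(message)
    return sum(-(n / total) * math.log2(n / total) for n in counts.values())

print(entropy_bits("aaaaaaaa"))             # 0.0: perfectly predictable, no information
print(entropy_bits("abababab"))             # 1.0: one bit per symbol
print(entropy_bits("the quick brown fox"))  # higher: more varied, less predictable
```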

Gleick argues that the information age arrived on the heels of Shannon’s discovery, when information was “made simple, distilled, counted in bits.” Now there’s nothing that can’t be understood as a kind of information. Books, statistics, and so on have always seemed like kinds of information — but it’s surprising to discover that cells are processors of DNA-based information; that money is a kind of information about “who owns what”; that atoms themselves might be best understood as a kind of information, so that physicists can ask about the universe’s “total information capacity” and “memory space.” At the heart of Gleick’s book is a sense that this view of the world isn’t capricious or arbitrary. The things that exist in the universe really do connect in a layered, networked, symbolic way, and that network is best captured by the idea of information.

Gleick, who will be reading at the Harvard Book Store on March 22 at 6 p.m., spoke to Ideas from his office in Key West, Fla.

IDEAS: Writing a history of information is a huge undertaking. What got you started?

GLEICK: I started thinking about it while I was working on “Chaos,” which was when I first heard about Claude Shannon....Some of the scientists I was interviewing told me about his very important book called “A Mathematical Theory of Communication.” That book has been sitting on my shelf for more than 20 years.

IDEAS: Shannon published his theory in 1948, yet you begin your story thousands of years ago, with the invention of writing. Why start then?

GLEICK: So much of my book is about the transition from oral culture to literate culture. We know that writing is important...but some of the things that could not have existed until writing was invented are a sense of history and a sense of logic. You can’t really have logic — you can’t have a syllogism, “if A, then B” — until you can write statements down and examine them, and develop a concept of symbolism, that A stands for something.

IDEAS: Writing and math have been around for thousands of years; Charles Babbage started thinking about computers in 1822. Why was the mid-20th century the flash point?

GLEICK: Look at what everybody was doing, and where they were doing it: at Bell Labs. This was an extraordinary institution; it had evolved over a generation or two inside the American corporation that held a monopoly on the nation’s telephone traffic. When the company first set up a research department, it hired only electrical engineers. Only after a while did they discover that mathematicians were also useful sorts of people to have around, because there were a lot of math problems to be solved in their basic business, which was the transmission of sound over electrical wires. And there was a lot of money at stake in solving those problems efficiently.

As the number of people using the technology grew from hundreds to thousands to millions to billions, the problems got really, really interesting. There were network problems, and switching problems....You start out with a single person in front of a plug board...and then you come up with a system that’s going to switch person-to-person communications all across an entire continent.

IDEAS: You write that the concept of “information” that emerged at mid-century revealed the underlying similarity of media that people had thought of as fundamentally, essentially different — words, sounds, and images, for example. How did that happen?

GLEICK: All that different stuff that has individual names we now, collectively, call “information,” and in some genuine sense that recognition flows from Claude Shannon’s very technical understanding of information. His technical understanding is boring — it’s dry, it’s deliberately desiccated; in particular, he announces at the outset that the meaning is not interesting to him, it’s not involved — information is something that’s independent of meaning. That should strike us as at least paradoxical, if not crazy, because meaning is what we care about!

But, at the same time, it’s something that we modern people understand, even if we haven’t consciously thought about it: that all these things are information — that a picture is a form of information. Nobody in the 19th century would’ve thought that, but we know it’s true. Part of the reason we know is that we can send it by e-mail, and we know that when we send it by e-mail it’s being encoded in exactly the form that the words are being encoded in, and that sound can be encoded in if we e-mail a snippet of music. In a genuine, fundamental sense, all that stuff is the same, because it’s all made of bits. And that’s Claude Shannon’s lesson.
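Gleick’s e-mail example can be made literal with a small, hypothetical sketch (the four bytes below are just the PNG file signature, standing in for a real image): in Python, a string of words and the bytes of an image file reduce to the same raw sequence of 0s and 1s.

```python
def to_bits(data: bytes) -> str:
    """Render any byte sequence as a string of 0s and 1s."""
    return "".join(f"{byte:08b}" for byte in data)

print(to_bits("hi".encode("utf-8")))      # the word "hi" as bits: 0110100001101001
print(to_bits(bytes([137, 80, 78, 71])))  # the PNG file signature: image data is bits too
```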

IDEAS: Is information in the eye of the beholder? Or is it something we discover out there in the world — like a quark?

GLEICK: I actually think information is more likely to be real and fundamental than quarks are. It’s possible that, some day, quarks will turn out to be a human construct, whereas information will never turn out to be that way.

When we say that genes are information at their core, that’s not just a metaphor, that’s not just a clever way of looking at what genes are all about. Genes really are information! That’s literally true. We’ve all learned in school that genes are a particular kind of molecule that has a particular physical shape, the double helix....But it turns out that when you need to talk about genes, whether you’re an evolutionary theorist or a molecular biologist, when you really need to define the stuff that you care about, the stuff you’re manipulating, that stuff isn’t the molecules, it’s the information. It’s the information that really matters. And it seems to be true in physics, too, that information is fundamental to the universe, not just matter and energy — information lies at the heart of things. [The physicist] John Wheeler expresses this with his little motto: “It from bit.”
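To see how literal the claim is, consider a small illustrative sketch (an illustration of the idea, not anything from the book): DNA has a four-letter alphabet, so each base carries at most log2 4 = 2 bits, and a gene sequence can be packed into bits like any other message.

```python
# Two bits suffice to distinguish the four bases.
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}

def dna_to_bitstring(seq: str) -> str:
    """Encode a DNA sequence at two bits per base."""
    return "".join(BASE_TO_BITS[base] for base in seq)

print("GATTACA", "->", dna_to_bitstring("GATTACA"))  # 14 bits for 7 bases
```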

IDEAS: Do you see the emergence of the information age as one continuous story — from the emergence of written symbols to the discovery of the mathematical principles underlying information?

GLEICK: That’s why I call the book “The Information.” I’m trying to tell one story from start to finish, and that’s a very grand and possibly grandiose idea, but that’s what I’ve tried to do....The history of humanity can be told in terms of wars, in terms of kings, of money — you can look at it from lots of different perspectives — but I believe that the perspective that’s most useful is in terms of information, and of changes in the way people communicate with one another. That’s what transforms human societies more profoundly than anything else.

Joshua Rothman is a graduate student and teaching fellow in the Harvard English department and an instructor in public policy at the Harvard Kennedy School of Government. He teaches novels and political writing.
