The Information: A History, a Theory, a Flood. James Gleick

its first mathematician in 1897: George Campbell, a Minnesotan who had studied in Göttingen and Vienna. He immediately confronted a crippling problem of early telephone transmission. Signals were distorted as they passed across the circuits; the greater the distance, the worse the distortion. Campbell’s solution was partly mathematics and partly electrical engineering. His employers learned not to worry much about the distinction. Shannon himself, as a student, had never been quite able to decide whether to become an engineer or a mathematician. For Bell Labs he was both, willy-nilly, practical about circuits and relays but happiest in a realm of symbolic abstraction. Most communications engineers focused their expertise on physical problems, amplification and modulation, phase distortion and signal-to-noise degradation. Shannon liked games and puzzles. Secret codes entranced him, beginning when he was a boy reading Edgar Allan Poe. He gathered threads like a magpie. As a first-year research assistant at MIT, he worked on a hundred-ton proto-computer, Vannevar Bush’s Differential Analyzer, which could solve equations with great rotating gears, shafts, and wheels. At twenty-two he wrote a dissertation that applied a nineteenth-century idea, George Boole’s algebra of logic, to the design of electrical circuits. (Logic and electricity—a peculiar combination.) Later he worked with the mathematician and logician Hermann Weyl, who taught him what a theory was: “Theories permit consciousness to ‘jump over its own shadow,’ to leave behind the given, to represent the transcendent, yet, as is self-evident, only in symbols.”

      In 1943 the English mathematician and code breaker Alan Turing visited Bell Labs on a cryptographic mission and met Shannon sometimes over lunch, where they traded speculation on the future of artificial thinking machines. (“Shannon wants to feed not just data to a Brain, but cultural things!” Turing exclaimed. “He wants to play music to it!”) Shannon also crossed paths with Norbert Wiener, who had taught him at MIT and by 1948 was proposing a new discipline to be called “cybernetics,” the study of communication and control. Meanwhile Shannon began paying special attention to television signals, from a peculiar point of view: wondering whether their content could be somehow compacted or compressed to allow for faster transmission. Logic and circuits cross-bred to make a new, hybrid thing; so did codes and genes. In his solitary way, seeking a framework to connect his many threads, Shannon began assembling a theory for information.

      The raw material lay all around, glistening and buzzing in the landscape of the early twentieth century, letters and messages, sounds and images, news and instructions, figures and facts, signals and signs: a hodgepodge of related species. They were on the move, by post or wire or electromagnetic wave. But no one word denoted all that stuff. “Off and on,” Shannon wrote to Vannevar Bush at MIT in 1939, “I have been working on an analysis of some of the fundamental properties of general systems for the transmission of intelligence.” Intelligence: that was a flexible term, very old. “Nowe used for an elegant worde,” Sir Thomas Elyot wrote in the sixteenth century, “where there is mutuall treaties or appoyntementes, eyther by letters or message.” It had taken on other meanings, though. A few engineers, especially in the telephone labs, began speaking of information. They used the word in a way suggesting something technical: quantity of information, or measure of information. Shannon adopted this usage.

      For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague—force, mass, motion, and even time—and gave them new meanings. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematicized it, giving energy its fundamental place in the physicists’ view of nature.

      It was the same with information. A rite of purification became necessary.

      We can see now that information is what our world runs on: the blood and the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions. . . . If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.

      “The information circle becomes the unit of life,” says Werner Loewenstein after thirty years spent studying intercellular communication. He reminds us that information means something deeper now: “It connotes a cosmic principle of organization and order, and it provides an exact measure of that.” The gene has its cultural analog, too: the meme. In cultural evolution, a meme is a replicator and propagator—an idea, a fashion, a chain letter, or a conspiracy theory. On a bad day, a meme is a virus.

      Economics is recognizing itself as an information science, now that money itself is completing a developmental arc from matter to bits, stored in computer memory and magnetic strips, world finance coursing through the global nervous system. Even when money seemed to be material treasure, heavy in pockets and ships’ holds and bank vaults, it always was information. Coins and notes, shekels and cowries were all just short-lived technologies for tokenizing information about who owns what.

      And atoms? Matter has its own coinage, and the hardest science of all, physics, seemed to have reached maturity. But physics, too, finds itself sideswiped by a new intellectual model. In the years after World War II, the heyday of the physicists, the great news of science appeared to be the splitting of the atom and the control of nuclear energy. Theorists focused their prestige and resources on the search for fundamental particles and the laws governing their interaction, the construction of giant accelerators and the discovery of quarks and gluons. From this exalted enterprise, the business of communications research could not have appeared further removed. At Bell Labs, Claude Shannon was not thinking about physics. Particle physicists did not need bits.

      And then, all at once, they did. Increasingly, the physicists and the information theorists are one and the same. The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence. Bridging the
