A Brief History of Information

The word that means everything - and nothing

Shannon's PhD dissertation, An Algebra for Theoretical Genetics - an application of his "queer algebra," in the words of Vannevar Bush - was written at MIT in 1940 under the direction of Barbara Burks, an employee of the Eugenics Record Office at Cold Spring Harbor Laboratory. Shannon was then recruited by Bell Labs to research "fire-control systems" - automated weapon targeting and activation - as well as "data smoothing" and cryptography during World War II.

At no point in his works did Shannon ever define "information"; instead, he offered a model of how to quantitatively measure the reduction of uncertainty in transmitting a communication, and used "information" to describe that measure.

Double negatives

Shannon's two-part 1948 article, A Mathematical Theory of Communication, and its subsequent reprinting with a popularizing explanation in his and Warren Weaver's book The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949), are widely heralded as the founding moment of what has since come to be known as "information theory," a subdiscipline of applied mathematics dealing with the theory and practice of quantifying data.

Shannon's construction, like those of Nyquist and Hartley, took as its context the problem presented by electronic communications, which by definition are "noisy", meaning that a transmission does not consist purely of intentional signals. The problem they pose is how to distinguish the intended signal from the inevitable artifacts of the systems that convey it - or, in Shannon's words, how to "reproduc[e] at one point either exactly or approximately a message selected at another point."

Shannon was especially clear that he didn't mean meaning:

Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem.

In The Mathematical Theory of Communication, he and Weaver explained that "information is a measure of one's freedom of choice when one selects a message" from a universe of possible solutions. In everyday usage, "freedom" and "choice" are usually seen as desirable: the more, the better. However, in trying to decipher a message they have a different consequence: the more freedom of choice one has, the more ways one can render the message - and the less sure one can be that a particular reproduction is accurate.

Put simply, the more freedom one has, the less one "knows."
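For readers who want to see the arithmetic, Shannon's measure is easy to sketch. The short Python example below is ours, not Shannon's or Weaver's, and the function name is purely illustrative: the "entropy" of a set of possible messages counts, in bits, how much freedom of choice the sender had - and therefore how uncertain the receiver is before anything arrives.

```python
# A rough illustration (not from the article): Shannon's measure in practice.
# Entropy H = -sum(p * log2(p)) is the average number of yes/no questions
# needed to pin down which message was selected.

from math import log2

def entropy(probabilities):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Two equally likely messages: one bit of "freedom of choice".
print(entropy([0.5, 0.5]))      # 1.0

# Eight equally likely messages: three bits - more freedom, more uncertainty.
print(entropy([1/8] * 8))       # 3.0

# A heavily skewed choice: little freedom, little uncertainty.
print(entropy([0.99, 0.01]))    # roughly 0.08
```

Run it and the numbers bear out the paradox above: the richer the menu of possible messages, the higher the count - and the less the receiver can presume to "know" in advance.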

It's small wonder that the author of such a theory would see efforts to apply it in other fields as "suspect".

The Fog of the New Machine

Of course, if Shannon sought to limit the application of his "information" to specific technical contexts - for example, by warning in a popularizing 1949 book that "[t]he word information, in this theory, is used in a special sense that must not be confused with its ordinary usage" - he failed miserably. The applications of his work in computational and communication systems, ranging from intimate read-write operations in storage devices to the principles guiding the design of sprawling networks, have had pervasive and catastrophic effects since their publication.

As this account suggests - and as one should expect - Shannon's work was just one result of many interwoven conceptual and practical threads involving countless researchers and practitioners working across many fields and disciplines.

In the two decades that separated Hartley's 1928 article and Shannon's publications, myriad advances had already had immense practical impact - for example, on the conduct and outcome of World War II, in fields as diverse as telegraphy, radiotelegraphy, electromechanical systems automation and synchronization, and cryptography. More generally, an important aspect and a notable result of that war were the unparalleled advances in systems integration across government, industry, and academia, from basic research through procurement, logistics, and application.

In the sixty years since, those advances have spawned many more advances - quite enough reason for "nonspecialists" to take a strong interest in information, however it is defined. Their interests, and the "popular" descriptions that result, surely carry at least as much weight as Shannon's mathematical prescription.

In the next part we'll look at how more recent, popular rhetoric about "information" lines up with its ancient origins. And discover how today's philosophers make for strange bedfellows with the sloppy purveyors of post-modernist marketing.®

Ted Byfield is Associate Chair of the Communication Design and Technology Department at Parsons the New School for Design in New York City; he co-moderates the Nettime-l mailing list. This article is based on an essay in Matthew Fuller (ed.), Software Studies: A Lexicon (Cambridge, Mass: MIT Press, forthcoming 2007).
