
PUBLISHED : Sunday, 17 April, 2011, 12:00am
UPDATED : Sunday, 17 April, 2011, 12:00am
 

The Information: A History, a Theory, a Flood
by James Gleick
Pantheon, HK$400

Most of us think of information as facts that we can use - a weather report, a set of instructions on how to fix a car engine - but as James Gleick's masterful The Information makes clear, there are many more levels to it than that.

Information, for instance, is at the genesis of us all, in our genes. It's the thing that makes us what we are. Information doesn't even have to contain any meaning. The scientific discipline of information theory, which takes up a good chunk of this 500-page work, is founded on the idea that information is simply something that is transmitted from A to B through C.

One page of Shakespeare, 10 pages of random letters, or 100 units of anything - all are information when telegraphed, recited down a telephone or e-mailed. According to Gleick, information is everywhere, in everything. His book, in keeping with the current scientific thinking, puts information not only at the core of life, but at the centre of the entire cosmos.

Gleick is one of the few science writers who can write a book for the general reader which is scientific enough to interest science undergraduates - Hyperspace's Michio Kaku and The Elegant Universe's Brian Greene are others. Gleick's earlier Chaos succeeded in making chaos theory understandable to the layman.

In this book he undertakes the equally daunting tasks of defining what information is in all its myriad forms and then chronicling how it has been passed from human to human, and at a more fundamental level, from cell to organism, throughout history. As befits such a vast subject matter, his observations range far and wide: from talking drums and the invention of the alphabet, to the calculating machines of the steam age, the dawn of information theory in the mid-20th century, genetic coding, quantum physics, and, as a seeming afterthought, the internet.

Although the book is couched as a history it includes a lot of scientific theory to give it weight: for the non-scientific reader, unfortunately, that means a lot of maths, the only language capable of communicating modern scientific theories. A third strand of the book is biographical, with rich accounts of the lives and thoughts of groundbreaking but lesser-known information scientists such as Charles Babbage and Claude Shannon.

Gleick's primary mission is to explain information theory. This lies at the heart of modern-day computing, but few of us know what it is. Gleick fulfils the mission in an ambitiously expansive way. First he tries to explain how languages - both verbal and mathematical - are structured. Then he analyses how they are used to transmit information in the natural world, in speech or writing, for instance.

From this foundation he goes on to examine how scientists have used the resulting knowledge to build effective communication devices such as the telegraph and the telephone, or in the case of mathematical languages, the logarithm book and the computer.
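To see why a logarithm book belongs on that list: logarithms turn multiplication into addition, so one hard product becomes two table look-ups and a sum. A minimal Python sketch of the principle (an illustration, not an example from the book):

```python
import math

# The trick behind the logarithm book: log(a*b) = log(a) + log(b),
# so a laborious multiplication becomes two table look-ups and an addition.
a, b = 347.0, 591.0

log_sum = math.log10(a) + math.log10(b)  # "look up" both logs, then add
product = 10 ** log_sum                  # convert back via an anti-log table

print(product)  # ~205077.0
print(a * b)    # 205077.0, the exact answer, for comparison
```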

A quick way of understanding one of the key language issues that Gleick explores is to pick up a mobile phone and switch to the text function. If you're a habitual texter, you'll probably text 'C u later' instead of 'See you later'. The contraction of 'you' to 'u' is an example of redundancy at work - the 'y' and 'o' are omitted because the meaning can still be understood without them. They are redundant letters.

The idea of redundancy did not come about with the mobile phone, as is commonly thought. The redundancy of language has been a mainstay of communication and information theorists for years. Fewer letters mean less information to send, and that generally leads to faster transmission times.
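A rough way to watch redundancy being squeezed out is to run a general-purpose compressor over a repetitive message and over random letters of the same length: the redundant text shrinks far more. A small Python sketch, not from the book (the byte counts in the comments are approximate):

```python
import random
import string
import zlib

# A highly redundant message versus the same number of random letters.
redundant = ("see you later " * 50).encode()
noise = "".join(random.choice(string.ascii_lowercase + " ")
                for _ in range(len(redundant))).encode()

# A compressor works by stripping out redundancy, so the repetitive
# text shrinks dramatically while the random text barely shrinks at all.
print(len(redundant), "->", len(zlib.compress(redundant)))  # 700 -> ~40
print(len(noise), "->", len(zlib.compress(noise)))          # 700 -> ~470
```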

The idea of cutting down on information without losing meaning is just a part of the story, but it goes hand in hand with the development of communications technology. It's also one of the more accessible ideas explored in the book. A fascinating chapter on the invention of the telegraph describes an early example. The telegraph was invented in France in the 1790s: messages were transmitted by a chain of telegraph stations staffed by operators with binoculars.

To transmit the messages the operators adjusted the positions of two crane-like arms on top of each station to represent coded letters and words. It was too laborious to adjust the arms for every letter, so big code books were designed containing signals for whole phrases, sentences and so on.

The electric telegraph, which used Morse code, led to another form of contraction. It arrived in the 1850s, changing people's worldview forever - it was the first time that almost instantaneous international communication was possible, and the globe shrank as a result. But long messages were expensive, so senders evolved a shorthand to save cash: 'So powerful was that impulse [to save money] that the English prose style seemed to be feeling the effects,' writes Gleick.
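Morse's scheme already embodied the statistical idea Shannon would later formalise: the commonest letters get the shortest codes, so 'e' is a single dot. A toy encoder in Python, using a small excerpt of the real Morse table:

```python
# A small excerpt of the real Morse table: frequent letters such as
# 'e' and 't' get the shortest codes, rarer ones get longer strings.
MORSE = {"e": ".", "t": "-", "a": ".-", "o": "---", "s": "...",
         "u": "..-", "l": ".-..", "r": ".-.", "c": "-.-.", "y": "-.--",
         " ": "/"}

def to_morse(text: str) -> str:
    """Encode text letter by letter, separating symbols with spaces."""
    return " ".join(MORSE[ch] for ch in text.lower())

print(to_morse("c u later"))  # -.-. / ..- / .-.. .- - . .-.
```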

Gleick starts the main story a quarter of the way into the book with the introduction of Claude Shannon, the American scientist and mathematician who developed information theory.

Here the book takes a turn for the mathematical, although Shannon's adventures in coding - he was a code-breaker in the second world war - telephony, maths and computing are much more interesting than they sound. Information theory can be defined quite simply as finding the best way for a sender to transmit a message to a receiver.

Unfortunately, the reality of doing this is much more complicated than the definition. Shannon studied the statistical structure of language to work out what could be lost from messages while keeping their meaning. Shannon worked out mathematical equations to express his findings. His work laid the foundations of binary computer code, and also gave us the word 'bit' in its computing context as a measure of information.
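Shannon's measure of information, entropy, is the average number of bits each symbol of a message carries: H = -sum of p*log2(p) over the probability p of each symbol. A minimal Python sketch of the calculation (an illustration, not Gleick's own example):

```python
import math
from collections import Counter

def entropy_bits_per_char(text: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)): the average number of
    bits of information carried by each character of the message."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(entropy_bits_per_char("aaaaaaaa"))       # 0.0 - totally predictable
print(entropy_bits_per_char("abcdefgh"))       # 3.0 - 8 equally likely symbols
print(entropy_bits_per_char("see you later"))  # ~3.2 - English sits in between
```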

Shannon's information theory may sound as if it applies only to computer programmers. But it has influenced our lives in many fundamental ways. The gene-centred view of evolution developed by biologist Richard Dawkins draws on Shannon's discoveries. The idea of a gene containing the information for the future development of the organism has its roots in Shannon's work.

Information theory has also influenced theoretical physics, which deals with big issues such as the birth of the universe and the nature of time.
