On vacation in February, I read James Gleick’s The Information: A History, a Theory, a Flood. It may be my favorite nonfiction book of the past year (and a potential Pulitzer winner). I have recommended other books of Gleick’s, notably the informative and powerful Chaos — this book is every bit as powerful. It is a tour de force of history and an introduction to information theory for non-scientists. Only rarely does it get a little lost in the weeds. I found it a deeply satisfying read.
It’s out in paperback now.
Here’s Cory Doctorow’s comments:
I’ve been fascinated with information theory since a friend of a friend explained “Shannon limits” to me in the late 1990s. I remember the conversation, mostly because the description was tantalizingly frustrating and incomplete, this being a hallmark of really interesting ideas. This friend of a friend explained that there were theoretical limits to how much information any channel could carry, and that these limits included rigorous definitions for “channel” and “information.” I’ve read up on Claude Shannon rather a lot since (I’ve got a short story called Shannon’s Law in an upcoming Borderlands book, about a hacker named Shannon Klod who tries to violate the barrier between faerie and the human realm by routing a single packet using TCP-over-magic) and every time I do, it’s a revelation, because some new facet of information theory reveals itself to me.
But nothing has presented these ideas half so well as The Information, and that’s a tribute to Gleick’s storytelling mastery, his ability to pick out the threads of history that trace back and forward from the discipline’s central thesis. Gleick begins with early lexicographers, the primitive dictionaries, the phrasebooks that translated between the talking drum and western speech. He moves on to Babbage and Lovelace (and presents an account of their invention, rivalries, victories and failings that is as heartbreaking as it is informative), and then into telegraphy.
Telegraphy leads to codes, and codes to compression, and compression to logic, and logic to the first inklings of theories, and now you’ve got Einstein and Gödel and Shannon and Turing meeting, debating, fighting and rubbishing each other in learned journals, arguing furiously with Margaret Mead at interdisciplinary conferences — a pell-mell debate in full swing. On Gleick marches, to the double helix and Dawkins and memes, to a section on randomness that is so transcendently exciting that I couldn’t put the book down, and read it while walking, so distracted I got lost twice within blocks of my office.
Gleick takes us through Wikipedia and the meaning of information, the debates about it, the helplessness of information overload, the collisions in namespaces — even through his beloved chaos math — until he has spun out his skeins so that they wrap around the world and the universe, information theory at the heart of legal debates over trademark, physics feuds over Hawking radiation, epistemology and cryptography, even fights over Pokemon characters and their disambiguation.
If you are looking for a non-finance nonfiction book to read, The Information: A History, a Theory, a Flood is my recommendation for the top of your queue.
Please use the comments to demonstrate your own ignorance, unfamiliarity with empirical data and lack of respect for scientific knowledge. Be sure to create straw men and argue against things I have neither said nor implied. If you could repeat previously discredited memes or steer the conversation into irrelevant, off topic discussions, it would be appreciated. Lastly, kindly forgo all civility in your discourse . . . you are, after all, anonymous.