The Information

by James Gleick

This is a best-seller about the history of information. Gleick covers every aspect of this gigantic subject, taking a primarily historical approach. For example, an early chapter discusses Charles Babbage, designer of the Analytical Engine, a stupendous mechanical device meant to carry out calculations. In the same chapter, Gleick presents Ada Lovelace, the first person to carry out what we now call “programming”, although her work was for a machine that was never built. Nevertheless, the tale of this fascinating duo is itself worth the price of the book. I think I’ll read a biography of Babbage sometime soon.

It’s difficult to convey the breadth of the book. Gleick covers Maxwell’s demon, quantum mechanics, Claude Shannon’s huge contributions to the subject, the early attempts at building computers, code-making and code-breaking, Wikipedia, and a host of other topics.

So my overall evaluation of this book is 5 stars, or two thumbs way up. I definitely learned a great deal from it, although I was already familiar with most of the technical material. 

Having established my esteem for the book, I would now like to register my complaints about it. First, he never really nails down the relationship between thermodynamics and information. He writes extensively about entropy, and he recognizes that the two expressions of entropy (the original thermodynamic definition and Shannon’s later information-based definition) are really the same quantity. But he never drives that point home. I don’t think he fully grasps that the two mathematical formulations are identical in form, differing only by a constant factor.
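For readers who want to see what I mean, here is a quick sketch of the two formulas as I remember them from my student days; the notation is mine, not Gleick’s:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Gibbs's thermodynamic entropy and Shannon's information entropy,
% both taken over a probability distribution p_i (my notation, not Gleick's):
\[
  S = -k_B \sum_i p_i \ln p_i
  \qquad
  H = -\sum_i p_i \log_2 p_i
\]
% Same functional form; the only differences are Boltzmann's constant k_B
% and the base of the logarithm, i.e. an overall constant factor.
\end{document}
```

Written side by side like that, the identity is hard to miss.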

Another key point that he misses is the relationship of information to the directionality of time. The fundamental reason we can’t travel backward in time is that doing so would permit the creation of information from nothing. If I could go back to Vienna in 1801 and look up Ludwig van Beethoven, I could sneak up behind him and intone the “dum dum da DUM” motif from his Fifth Symphony, years before he wrote it. So who created Beethoven’s Fifth Symphony? Me? Ludwig? Sorry, that’s just not logical.

I’ll also ding Mr. Gleick for a rather poor explanation of quantum entanglement and quantum computing. I hasten to point out that, even with my master’s in physics and several graduate courses in quantum mechanics, I still find both subjects confusing, but I at least grasp the fundamentals behind them. Explaining quantum entanglement and quantum computing without recourse to mathematics would require teaching skills far beyond mine. I suppose that Mr. Gleick had to make the effort, and a heroic effort it is, but it ultimately fails.

Mr. Gleick does a good job explaining how physics has finally come to embrace the importance of information as a foundational notion behind all physical reality. I’d like to take a moment for my “I told ya so” here. Forty years ago I submitted a paper to a physics journal making a basic point about this. It was, of course, rejected. Physics has still not grasped the point I made all that time ago, but physicists are sidling up to it, and they’ll figure it out sooner rather than later.

Again, I highly recommend this book. There are some slow sections, but it covers a vast subject well. I had been thinking of writing a book about the nature of information, but reading this one has convinced me that I have nothing significant to add.