The Information: A History, a Theory, a Flood

Overview

James Gleick traces the concept of information from African talking drums and the invention of writing through to Claude Shannon’s mathematical theory and the digital age, arguing that information is not merely a human convenience but a fundamental quantity woven into the fabric of reality. The book moves chronologically yet thematically, showing how each revolution in how we encode, transmit, and store information — alphabets, printing, telegraphy, computation — reshaped science, culture, and our understanding of the physical world. Gleick ultimately positions information alongside matter and energy as a foundational pillar of nature.

Key Concepts

From Drums to Alphabets — Early Encoding

  • Redundancy in oral cultures — African talking drums transmitted messages over vast distances by exploiting the tonal structure of languages, adding formulaic phrases to overcome ambiguity — an intuitive application of redundancy that Shannon would later formalise
  • The alphabet as discretisation — reducing spoken language to a small set of abstract symbols made information portable, storable, and manipulable, enabling literacy, record-keeping, and eventually logic
  • Writing as external memory — the invention of script offloaded cognitive storage from human brains to physical media, fundamentally changing how knowledge could accumulate across generations

Shannon’s Information Theory

  • The bit as fundamental unit — Claude Shannon defined information as the reduction of uncertainty, measurable in binary digits (bits), separating the concept of information from meaning and grounding it in mathematics
  • Entropy and information — Shannon showed that the information content of a message is related to its statistical surprise; highly predictable messages carry little information, establishing a formal link between information theory and thermodynamic entropy (see the sketch after this list)
  • Channel capacity and noise — the noisy-channel coding theorem proved that reliable communication is possible even over imperfect channels, provided the transmission rate stays below a calculable capacity — the theoretical foundation of all modern digital communication
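
A minimal Python sketch of the entropy idea, with illustrative messages not taken from the book: it computes Shannon entropy in bits per symbol from observed symbol frequencies, showing that a highly predictable message scores far lower than a varied one.

    import math
    from collections import Counter

    def shannon_entropy(message: str) -> float:
        """Average information per symbol, in bits, from observed symbol frequencies."""
        counts = Counter(message)
        total = len(message)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(shannon_entropy("aaaaaaaaab"))   # highly predictable: ~0.47 bits/symbol
    print(shannon_entropy("abcdefghij"))   # every symbol a surprise: ~3.32 bits/symbol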

Information in Physics and Biology

  • Maxwell’s demon and thermodynamics — the thought experiment linking information to physical entropy was resolved by Landauer’s principle: erasing a bit of information necessarily dissipates a minimum amount of energy, showing that information is physical (see the worked numbers after this list)
  • DNA as biological information — genetics can be reframed as an information system: DNA stores, copies, and transmits hereditary instructions using a four-letter code, subject to noise (mutation) and error-correction (DNA repair enzymes)
  • Quantum information — the book touches on how quantum mechanics complicates the classical picture, with qubits existing in superposition and entanglement enabling correlations that have no classical information-theoretic analogue
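
A back-of-the-envelope Python sketch of Landauer's bound, assuming room temperature (300 K) purely for illustration: erasing one bit dissipates at least k_B * T * ln 2 of heat, which works out to a few zeptojoules.

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # room temperature in kelvin, an illustrative choice

    # Landauer's bound: minimum heat dissipated per erased bit
    E_min = k_B * T * math.log(2)
    print(f"{E_min:.2e} J per erased bit")   # about 2.87e-21 J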

The Information Flood

  • Exponential growth — the total amount of stored information has grown super-exponentially since the mid-20th century, driven by digitisation, the internet, and sensor proliferation
  • Signal versus noise — more information does not automatically mean more knowledge; filtering, compression, and search algorithms become essential when data outstrips human cognitive bandwidth
  • Randomness and incompressibility — Gleick explores Kolmogorov complexity and algorithmic information theory, in which a truly random string carries maximal information because it cannot be compressed, a counter-intuitive result that deepens the question of what information really is (see the sketch below)
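
Kolmogorov complexity itself is uncomputable, but an ordinary compressor makes the intuition concrete. A minimal Python sketch using zlib (exact output sizes will vary slightly by library version):

    import os
    import zlib

    repetitive = b"ab" * 5000          # 10,000 bytes of obvious structure
    random_bytes = os.urandom(10000)   # 10,000 bytes with no structure to exploit

    print(len(zlib.compress(repetitive)))     # compresses to a few dozen bytes
    print(len(zlib.compress(random_bytes)))   # stays at roughly 10,000 bytes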

Personal Reflection

[To be added]

Related Books

  • Chaos — Gleick’s earlier book; together they trace a path from deterministic chaos to information entropy
  • The Gene — DNA as an information system; genetics as one instance of Gleick’s broader argument
  • The Science of Can and Can’t — Marletto redefines information as a physical, counterfactual property, a radical extension of the theme
