# Information

January 1st, 2005 | **Filed under:** Backgrounds

A few years ago discussions of information theory usually started with Claude Shannon:

From Wikipedia:

"Claude Elwood Shannon (April 30, 1916 – February 24, 2001) has been called "the father of information theory"…. He innovated the concept of implementing Boolean algebra with electronic relays and switches in his 1937 MIT master's thesis, A Symbolic Analysis of Relay and Switching Circuits, and, with it, essentially founded practical digital circuit design. Professor Howard Gardner, of Harvard University, would praise it as "possibly the most important, and also the most famous, master's thesis of the century", and in 1940 the thesis earned its author the Alfred Nobel American Institute of American Engineers Award. After working in Cold Spring Harbor, under a geneticist, Shannon worked on his PhD in 1940 at MIT. His PhD thesis is titled "An Algebra for Theoretical Genetics."… In 1948 he published A Mathematical Theory of Communication (ISBN 0252725484). This work focuses on the problem of how to regain at a target point the information a sender has transmitted. Shannon developed information entropy as a measure for redundancy. His later book with Warren Weaver, The Mathematical Theory of Communication, Univ of Illinois Press, is brief and surprisingly accessible to the non-specialist…."

For the actual paper, "A Mathematical Theory of Communication" by Claude E. Shannon, go to:

http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html

The key ideas that Shannon introduces include:

1. Information can be measured

2. Information is related to entropy

3. The conveying of information involves a source, a channel, and a receiver

4. Noise is fundamental to the transmission of information
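The first two ideas can be made concrete in a few lines of Python. This is a generic illustration of Shannon's entropy formula, not code from his paper: the entropy H = −Σ p·log₂(p) of a probability distribution measures the average number of bits of information per symbol.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss...
print(entropy([0.5, 0.5]))    # 1.0
# ...while a biased coin carries less: its outcomes are partly redundant
print(entropy([0.9, 0.1]))    # about 0.469
# A certain outcome carries no information at all
print(entropy([1.0]))         # 0.0
```

The biased-coin case is exactly Shannon's link between information and redundancy: the less surprising the message, the less information it conveys.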

Information Today

Today, one might more likely turn first to Wikipedia, and find that:

From Wikipedia: "Information is a term with many meanings depending on context, but is as a rule closely related to such concepts as meaning, knowledge, instruction, communication, representation, and mental stimulus.

"Although many people speak of the advent of the 'information age,' the 'information society,' and information technologies, and even though information science and computer science are often in the spotlight, the word 'information' is often used without careful consideration of the various meanings it has come to acquire.

"The following is a list of the most important meanings, roughly in order of narrowest to broadest.

1. Information as a message

2. Information as a pattern

3. Information as sensory input

4. Information as an influence which leads to a transformation"

In other words, we now understand information to cover a very wide range of concepts. Perhaps the most important element in understanding information is context. Suppose you wander into the wrong classroom, where the instructor is talking about Hilbert space and putting equations on the blackboard, and you know nothing about advanced mathematics. Your first thought might be that no information is conveyed. But that is not correct. You now know that the instructor is alive, that they speak English, and that you are probably in a math class. All of that is already a lot of information, and most of it has to do with context.

Getting Started With Information Theory

So how do we get up to speed on information theory? I would work through the following:

1. Written in 1966, Singh’s Great Ideas in Information Theory, Language and Cybernetics is out of date, but still an excellent introduction to the field.

2. Next I would read chapter 2 of Brown’s The Quest for a Quantum Computer for recent developments in information theory. If you want to go deeper, read the entire book. It is one of the most important overviews of contemporary information theory you will find.

3. Then read David Deutsch's Fabric of Reality. It is one of the most important books of our time.

4. Then I would play around: perhaps re-read Douglas R. Hofstadter's Gödel, Escher, Bach; Robert Wright's Three Scientists and Their Gods, a lively introduction to the ideas of Edward Fredkin, Edward O. Wilson, and Kenneth Boulding; Wheeler's Geons, Black Holes, and Quantum Foam; and B. Roy Frieden's Physics from Fisher Information (you can read the introduction on Amazon; the rest requires graduate-level math, so skip it unless you can handle it, and also peruse the criticism of Frieden on the Web). And of course Claude E. Shannon's 1948 paper, which created the field of information theory.

The Universe As Information

Much of physics and cosmology now treats information as ranking with matter and energy as a fundamental property of the universe. With this ranking comes the notion that information can be transformed (including to and from matter and energy) but cannot be destroyed. So what happens if you toss an encyclopedia into a black hole? Does the information in the encyclopedia get destroyed? In dealing with this question, Hawking and others have come up with the notion that the information content of a black hole is determined by its surface area, more specifically by the number of Planck areas (the smallest possible area in our quantum-granular world) on the surface of the black hole.
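To get a feel for the numbers, here is a back-of-the-envelope sketch in Python of the Bekenstein-Hawking surface-area idea for a solar-mass black hole. The constants are standard SI values; the conversion from Planck areas to bits (one bit per 4·ln 2 Planck areas) follows the usual entropy formula S = A/4 in Planck units, which the text above only alludes to.

```python
import math

# Physical constants (SI units)
G = 6.674e-11      # gravitational constant
c = 2.998e8        # speed of light
hbar = 1.055e-34   # reduced Planck constant
M_sun = 1.989e30   # solar mass, kg

# Planck length and Planck area: the quantum "pixels" of the horizon
l_p = math.sqrt(hbar * G / c**3)
planck_area = l_p**2

# Schwarzschild radius and horizon area for a solar-mass black hole
r_s = 2 * G * M_sun / c**2        # roughly 3 km
A = 4 * math.pi * r_s**2

# Bekenstein-Hawking entropy: S = A / (4 l_p^2) in nats,
# so the information content in bits is A / (4 l_p^2 ln 2)
planck_areas = A / planck_area
entropy_bits = planck_areas / (4 * math.log(2))

print(f"Planck areas on horizon: {planck_areas:.2e}")
print(f"Information content: {entropy_bits:.2e} bits")  # on the order of 10^77
```

The striking point is that the answer scales with the horizon's area, not its volume, which is what motivates the holographic speculation in the next paragraph.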

This leads to speculation that the universe is constituted of 2D membranes (i.e. information), and that our 3D world is a holographic projection from such a membrane.

If information is the fundamental constituent of reality, there might be implications for how we understand every aspect of reality, from physics to architecture.

Here are some approaches to the universe as information:

Programming the Universe: A Quantum Computer Scientist Takes On the Cosmos

by Seth Lloyd

Understanding the universe as information processing.

Edward Fredkin

Fredkin was an early pioneer of digital physics. His main contributions include his work on reversible computation (which offers a new solution to the Maxwell's Demon paradox and theoretically uses no energy) and cellular automata. Fredkin maintains that the universe is a computer: not that the universe can be better understood through the metaphor of computing, but that it literally is a computer. You can read more on his web site:

Digital Philosophy

"Digital Philosophy (DP) is a new way of thinking about the fundamental workings of processes in nature. DP is an atomic theory carried to a logical extreme where all quantities in nature are finite and discrete. This means that, theoretically, any quantity can be represented exactly by an integer. Further, DP implies that nature harbors no infinities, infinitesimals, continuities, or locally determined random variables. This paper explores Digital Philosophy by examining the consequences of these premises…."

Physics From Fisher Information

B. Roy Frieden was working on enhancing satellite photos when he began to wonder: what is the theoretical limit on the information that can be extracted from a fuzzy photo? That led him to Fisher information, a branch of statistical theory, and then to the notion that all of physics could be rederived from Fisher information.
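Fisher information itself is easy to demonstrate, independently of Frieden's program. The sketch below (a generic statistics example, not from Frieden's book) estimates the Fisher information a Gaussian sample carries about its mean by Monte Carlo, and compares it with the textbook answer 1/σ²; the sharper the likelihood peaks around the true parameter, the larger the information.

```python
import math
import random

def fisher_information_gaussian_mean(sigma, n_samples=200_000, seed=0):
    """Monte Carlo estimate of E[(d/dmu log p(x|mu))^2] at mu = 0.

    For a Gaussian with known variance sigma^2, the exact Fisher
    information about the mean is 1 / sigma^2.
    """
    rng = random.Random(seed)
    mu, total = 0.0, 0.0
    for _ in range(n_samples):
        x = rng.gauss(mu, sigma)
        score = (x - mu) / sigma**2   # derivative of the log-density w.r.t. mu
        total += score**2
    return total / n_samples

sigma = 2.0
estimate = fisher_information_gaussian_mean(sigma)
print(f"Monte Carlo estimate: {estimate:.4f}")   # close to 1/sigma^2 = 0.25
print(f"Exact value:          {1 / sigma**2:.4f}")
```

A blurrier measurement (larger σ) yields less Fisher information, which is precisely the fuzzy-photo intuition that started Frieden down this road.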

Physics from Fisher Information: A Unification

by B. Roy Frieden

From Amazon: "This book defines and develops a unifying principle of physics, that of 'extreme physical information.' Fisher information is a simple concept little known to physicists. The book develops statistical and physical properties of Fisher information. This information is a physical measure of disorder, sharing with entropy the property of monotonic change with time. The information concept is applied 'phenomenally' to derive most known physics, from statistical mechanics and thermodynamics to quantum mechanics, the Einstein field equations, and quantum gravity…."

Reviews of Physics from Fisher Information: A Unification by Frieden:

"…This is a compilation of Roy Frieden's work in major physics journals over the last decade deriving the basic laws of physics – relativistic quantum mechanics, electromagnetism, gravitation, statistical thermodynamics – from a quantity (used by mathematical statisticians and by hardly anyone else) called Fisher Information. He derives the Klein-Gordon equation, Schroedinger wave equations, Maxwell's equations, the DeWitt-Wheeler law of quantum gravity, and various statistical thermodynamics laws…."

A negative review: http://cscs.umich.edu/~crshalizi/reviews/physics-from-fisher-info/

Stephen Wolfram

In brief, Stephen Wolfram contends that Newton made a mistake when he sought to understand the universe through mathematics, that is, through numbers. Instead, Wolfram argues, we should understand the universe through rules like those in computer programs, particularly cellular automata programs.

For more on Wolfram, go to: http://www.stephenwolfram.com/

A New Kind of Science

by Stephen Wolfram

From Amazon, from Library Journal: "Galileo proclaimed that nature is written in the language of mathematics, but Wolfram would argue that it is written in the language of programs and, remarkably, simple ones at that. A scientific prodigy who earned a doctorate from Caltech at age 20, Wolfram became a Nobel-caliber researcher in the emerging field of complexity shortly thereafter only to abscond from academe and establish his own software company (which published this book). In secrecy, for over ten years, he experimented with computer graphics called cellular automata, which produce shaded images on grid patterns according to programmatic rules (973 images are reproduced here). Wolfram went on to discover that the same vastly complex images could be produced by even very simple sets of rules and argues here that dynamic and complex systems throughout nature are triggered by simple programs. Mathematical science can describe and in some cases predict phenomena but cannot truly explain why what happens happens. Underscoring his point that simplicity begets complexity, Wolfram wrote this book in mostly nontechnical language. Any informed, motivated reader can, with some effort, follow from chapter to chapter, but the work as a whole and its implications are probably understood fully by the author alone. Had this been written by a lesser scientist, many academics might have dismissed it as the work of a crank. Given its source, though, it will merit discussion for years to come. Essential for all academic libraries. [This tome is a surprise best seller on Amazon. Ed.] Gregg Sapp, Science Lib., SUNY at Albany. Copyright 2002 Cahners Business Information, Inc."

Some negative comments on Wolfram's book have surfaced, claiming that much of what he says is not new, and that he does not adequately credit others. The bottom line is that you should read the book. It is a fantastic education in all of contemporary science, physics, computation, and information theory, and it introduces numerous new ideas.

More on Cellular Automata

From: http://www.brunel.ac.uk/depts/AI/alife/al-ca.htm

"Cellular automata are discrete dynamical systems whose behaviour is completely specified in terms of a local relation. A cellular automaton can be thought of as a stylised universe. Space is represented by a uniform grid, with each cell containing a few bits of data; time advances in discrete steps and the laws of the 'universe' are expressed in, say, a small look-up table, through which at each step each cell computes its new state from that of its close neighbours. Thus, the system's laws are local and uniform…. The first cellular automaton was conceived by von Neumann in the late forties…."
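The quoted description maps directly onto a few lines of code. This minimal sketch implements a one-dimensional "elementary" cellular automaton in Wolfram's rule-numbering scheme: a uniform grid of cells, discrete time steps, and a small look-up table (the bits of the rule number) through which each cell computes its new state from its close neighbours.

```python
def step(cells, rule=110):
    """Advance the grid one time step under the given Wolfram rule number.

    The rule number's binary digits ARE the look-up table: bit k gives
    the new state for the 3-cell neighbourhood whose bits encode k.
    """
    n = len(cells)
    nxt = []
    for i in range(n):
        # Periodic boundary: the grid wraps around at the edges
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (center << 1) | right
        nxt.append((rule >> neighbourhood) & 1)  # look up the new state
    return nxt

# Start from a single live cell and watch complex structure emerge
cells = [0] * 31
cells[15] = 1
for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

The system's laws here are exactly "local and uniform" as the quote says: every cell consults the same eight-entry table, and only its immediate neighbours.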

Black Holes and Information

"In 1997, the three cosmologists [Stephen Hawking, Kip Thorne, and John Preskill] made a famous bet as to whether information that enters a black hole ceases to exist — that is, whether the interior of a black hole is changed at all by the characteristics of particles that enter it. Hawking's research suggested that the particles have no effect whatsoever. But his theory violated the laws of quantum mechanics and created a contradiction known as the 'information paradox.'"

From: http://researchnews.osu.edu/archive/fuzzball.htm

See Stephen Hawking's web site at http://www.hawking.org.uk/home/hindex.html

For the latest, see:

NewScientist.com, July 14, 2004

"After nearly 30 years of arguing that a black hole destroys everything that falls into it, Stephen Hawking is saying he was wrong. It seems that black holes may after all allow information within them to escape.

It might solve one of the long-standing puzzles in modern physics, known as the black hole information paradox. In 1976, he calculated that once a black hole forms, it starts losing mass by radiating energy. This 'Hawking radiation' contains no information about the matter inside the black hole and once the black hole evaporates, all information is lost.

But this conflicts with the laws of quantum physics, which say that such information can never be completely wiped out. Hawking's argument was that the intense gravitational fields of black holes somehow unravel the laws of quantum physics…."

For full article:

http://www.newscientist.com/news/print.jsp?id=ns99996151

The Truth Is Still Out There

In an op-ed piece in The New York Times on August 3, 2004, Paul Ginsparg, professor of physics and information science at Cornell University, described the background of the issues:

"… Near the end of a small meeting I attended in 1993, the question of 'What happens to information that falls into a black hole?' arose, and a democratic method was chosen to address it. The vote proceeded more or less along party lines, with the general relativists firm in their adherence to causality, and the quantum field theorists equally adamant in their faith in unitarity. Of the 77 participants, 25 voted for the category 'It's lost'; and 39, a slight majority, voted for 'It comes out' (that it re-emerges). Seven voted that the black hole would not evaporate entirely, and the remaining six voted for an unspecified 'Something else.' …"
