January 1st, 2006 | Filed under: Backgrounds | No Comments »
The Analytical Engine
From “computers” in the Encyclopædia Britannica online:
“While working on the Difference Engine, Babbage began to imagine ways to improve it. Chiefly he thought about generalizing its operation so that it could perform other kinds of calculations. By the time the funding had run out in 1833, he had conceived of something far more revolutionary: a general-purpose computing machine called the Analytical Engine. The Analytical Engine was to be a general-purpose, fully program-controlled, automatic mechanical digital computer. It would be able to perform any calculation set before it…. Augusta Ada King, the countess of Lovelace, was the daughter of the poet Lord George Gordon Byron and the mathematically inclined Anne Milbanke. One of her tutors was Augustus De Morgan, a famous mathematician and logician. … Lady Lovelace attended Babbage’s soirees and became fascinated with his Difference Engine. … She went on to become the world’s only expert on the process of sequencing instructions on the punched cards that the Analytical Engine used—that is, she became the world’s first computer programmer.”
From Wikipedia: “In current usage, a computer is a device which is used to process information according to a well-defined procedure.
The word was originally used to describe people who were employed to do arithmetic calculations, with or without mechanical aids. The famed Gottfried Wilhelm von Leibniz himself complained of the time he expended in performing calculations. Starting in the 1950s computing machine was used to refer to the machines themselves; finally, the shorter word computer took over the term computing machine. Originally, computing was almost exclusively related to arithmetical problems, but modern computers are used for many tasks unrelated to mathematics, as their cost has declined, their performance has increased and their size is reduced.
However, the above definition includes many special-purpose devices that can compute only one or a limited range of functions. When considering modern computers, their most notable characteristic that distinguishes them from earlier computing devices is that, given the right programming, any computer can emulate the behaviour of any other (limited only by storage capacity and execution speed), and, indeed, it is believed that current machines can emulate any future computing devices we invent (though undoubtedly more slowly). In some sense, then, this threshold capability is a useful test for identifying “general-purpose” computers from earlier special-purpose devices. This “general-purpose” definition can be formalised into a requirement that a certain machine must be able to emulate the behaviour of a universal Turing machine. Machines meeting this definition are referred to as Turing-complete. While such machines are physically impossible as they require unlimited storage and zero crashing probability, the attribute Turing-complete is sometimes also used in a lax sense for machines that would be universal if they had more (infinite) storage and were absolutely reliable. The first such machine appeared in 1941: the program-controlled Z3 of Konrad Zuse (but its Turing-completeness was shown only much later, namely, in 1998). Other machines followed in a flurry of developments around the world. See the history of computing article for more details of this period.”
From Wikipedia: “The Turing machine is an abstract model of computer execution and storage introduced in 1936 by Alan Turing to give a mathematically precise definition of algorithm or ‘mechanical procedure’. As such it is still widely used in theoretical computer science, especially in complexity theory and the theory of computation. The thesis that states that Turing machines indeed capture the informal notion of effective or mechanical method in logic and mathematics is known as the Church-Turing thesis.
The concept of the Turing machine is based on the idea of a person executing a well-defined procedure by changing the contents of an infinite amount of ordered paper sheets that can contain one of a finite set of symbols. The person needs to remember one of a finite set of states and the procedure is formulated in very basic steps in the form of “If your state is 42 and the symbol you see is a ‘0’ then replace this with a ‘1’, remember the state 17, and go to the following sheet.”
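The quoted rule format translates almost directly into code. The sketch below is my own minimal illustration, not anything from Wikipedia or Turing: rules map a (state, symbol) pair to what to write, which state to remember, and which way to move along the "sheets." The bit-flipping rule table is an invented example.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Minimal Turing machine sketch. `rules` maps (state, symbol) to
    (symbol_to_write, next_state, head_move), with head_move in {-1, 0, +1}.
    Halts when the state is 'halt' or no rule applies."""
    cells = dict(enumerate(tape))  # sparse tape, extendable in both directions
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        if (state, symbol) not in rules:
            break
        write, state, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    # Read the written span back out (sketch: assumes a contiguous span)
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Rules in the spirit of the quoted description: "if your state is 'start'
# and you see a '0', replace it with a '1', stay in state 'start', and move
# to the following sheet"
flip_rules = {
    ("start", "0"): ("1", "start", +1),
    ("start", "1"): ("0", "start", +1),
    ("start", "_"): ("_", "halt", 0),
}
print(run_turing_machine(flip_rules, "0110"))  # → 1001
```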
Turing machines shouldn’t be confused with the Turing test, Turing’s attempt to capture the notion of artificial intelligence.
A Turing machine that is able to simulate any other Turing machine is called a universal Turing machine or simply a universal machine as Turing described it in 1947:
It can be shown that a single special machine of that type can be made to do the work of all. It could in fact be made to work as a model of any other machine. The special machine may be called the universal machine.”
What Are the Implications of the Universal Turing Machine?
The Fabric of Reality: The Science of Parallel Universes-And Its Implications
by David Deutsch
From Amazon: “In The Fabric of Reality, Deutsch traces what he considers the four main strands of scientific explanation: quantum theory, evolution, computation, and the theory of knowledge.”
Computationalism: New Directions
by Matthias Scheutz (Editor)
From the dust jacket: “Classical computationalism—the view that mental states are computational states—has come under attack in recent years. Critics claim that in defining computation solely in abstract, syntactic terms, computationalism neglects the real-time, embodied, real-world constraints with which cognitive systems must cope. Instead of abandoning computationalism altogether, however, some researchers are reconsidering it, recognizing that real-world computers, like minds, must deal with issues of embodiment, interaction, physical implementation, and semantics. This book lays the foundation for a successor notion of computationalism. It covers a broad intellectual range, discussing historic developments of the notions of computation and mechanism in the computationalist model, the role of Turing machines and computational practice in artificial intelligence research, different views of computation and their role in the computational theory of mind, the nature of intentionality, and the origin of language.”
Turing’s Man: Western Culture in the Computer Age
by J. David Bolter
From Amazon, by Ivan Askwith: “Although Turing’s Man is a little bit dated — it was published in 1984, before the internet had even taken on a significant presence in modern life — it suggests and foreshadows a number of themes which have become more prominent since the text was printed. Beginning with an overview and survey of technological evolution, from the Ancient World right up through the present, Bolter does a fine job of articulating the complex process through which technology changes and is changed by the society into which it is introduced.”
Age of Spiritual Machines
by Ray Kurzweil
From Barnes & Noble: “After establishing that technology is growing exponentially, Kurzweil forecasts that computers will exceed the memory capacity and computing speed of the human brain by 2020, with the other attributes of human intelligence not far behind.”
Machines Who Think: A Personal Inquiry into the History and Prospects of Artificial Intelligence
by Pamela McCorduck
From Amazon, from Scientific American: “The enormous, if stealthy, influence of AI bears out many of the wonders foretold 25 years ago in Machines Who Think, Pamela McCorduck’s groundbreaking survey of the history and prospects of the field. A novelist at the time (she has since gone on to write and consult widely on the intellectual impact of computing), McCorduck got to the founders of the field while they were still feeling their way into a new science.”
January 1st, 2005 | Filed under: Backgrounds | No Comments »
We are today in the midst of a technological upheaval greater than any we have seen in human history, including developments in information, computers, AI, virtual reality, materials science, nanotechnology, biotechnology, genetic engineering, brain and consciousness research, robotics, etc. To put it simply, we are now beginning to be able to control matter (or should we say reality) on the most fundamental particle levels. And, if Hugh Everett, Bryce DeWitt, and David Deutsch are right, our ability to manipulate quantum phenomena gives us windows into parallel universes.
These technology fields are so vast that there is no way even to outline them, so for now we will list some major sources of information on cutting edge technologies, and in the future we will provide links to material that addresses the implications of these technologies for architecture.
Much of MIT’s course material online.
From the Web Site: “A free and open educational resource for faculty, students, and self-learners around the world. OCW supports MIT’s mission to advance knowledge and education, and serve the world in the 21st century. It is true to MIT’s values of excellence, innovation, and leadership. MIT OCW:
Is a publication of MIT course materials
Does not require any registration
Is not a degree-granting or certificate-granting activity
Does not provide access to MIT faculty”
The most comprehensive source for both background and daily news breaks in the technology fields that will be changing our lives. Check in daily for tech news and subscribe to their email newsletter.
From the Web Site: “KurzweilAI.net features the big thoughts of today’s big thinkers examining the confluence of accelerating revolutions that are shaping our future world, and the inside story on new technological and social realities from the pioneers actively working in these arenas. We are witnessing intersecting revolutions in a plethora of fields: biotechnology, nanotechnology, molecular electronics, computation, artificial intelligence, pattern recognition, virtual reality, human brain reverse engineering, brain augmentation, robotics, and many others. The leading visionaries represented on this site examine these transforming trends and their profound impact on economics, the arts, politics, government, warfare, medicine, health, education, disabilities, social mores, and sexuality.”
This is where Eric Drexler and Ralph Merkle, the pioneers of nanotechnology, hang out.
“Foresight Institute’s goal is to guide emerging technologies to improve the human condition. Foresight focuses its efforts upon nanotechnology, the coming ability to build products—of any size—with atomic precision.”
There are many sites devoted to quantum computing, but for now, we will recommend Deutsch’s:
John Brockman’s site. “To arrive at the edge of the world’s knowledge, seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves.”
January 1st, 2005 | Filed under: Backgrounds | 5 Comments »
We need to distinguish between “post humanism” and “post human.” Post humanism refers to a change in culture, along the lines of the change from the Middle Ages to the Renaissance, or from the Baroque to the Enlightenment. Post human, on the other hand, refers to the possibility that, due to new technologies, we are on the verge of becoming no longer human, that is to say, a different species.
Humanism might be described as the notion that the human being is the central entity in the cosmos, standing above God and nature. Humanism can thus be contrasted to the notion that God is the central entity in the cosmos, and to what might be called an “Eastern” view that God, humans, and nature are a unity.
We usually refer to ancient Greece, the Renaissance, and the periods from the Enlightenment through Modernism as humanist cultural periods. It is interesting to note that Frank Lloyd Wright, whose Organic approach parallels the Eastern notion of unity, specifically listed Greece (post and lintel), the Renaissance (“the Renaissance was the setting of the sun that all of Europe mistook for dawn”), and some European Modernism as architectures that he disliked. Thus we might say that Wright is not a humanist. Post Humanism, then, would be whatever is going on now after humanism, but of course that is a negative definition.
The humanism of the period from the Enlightenment through Modernism is closely associated with reason, which is to say with science and rationalism. The science that so successfully led to an understanding of and control over nature, was extended to humans as well in the “social sciences.”
A notion of human being that rejects these rationalist notions might be called post humanist.
Post humanist thinking is not fully formed, and has yet to distinguish itself from the long tradition of questioning rationalism in Vico, Goethe, Jung, Joyce, Surrealism, Dada, etc., but that is another discussion.
Post human, the notion that we are on the verge of changing (or have actually changed) as a species and may soon be no longer human, itself has two variants.
The first holds that contemporary technologies, particularly electronic, digital, and communications technologies, have so altered our physical and cultural environments (and, from a McLuhanist point of view, our sense perception structures) that we are justified in positing a species change.
To sustain this argument, one would have to show that the impact of contemporary technologies is qualitatively different from that of others, such as agriculture and industrialization. It remains to be seen if that can be shown.
The second variant holds that the kinds of biotech and electronic technologies we have today are on the verge of actually changing us on the species level. Here we may well be on the verge of something totally new. Among the things we see under development are:
Genetic engineering, in which we can alter genes in sperm and ova before fertilization, affecting future individuals
Genetic therapy, in which we can introduce new genes into even adult individuals through virus vectors
Cloning, in which we can select a natural or altered human cell and coax it to develop into becoming a (post) human
Chip implantation, in which chips planted into the brain or nervous system can interact with the nervous system and communicate in unique ways with the environment
Extreme life extension, in which we can approach practical immortality
The development of artificial intelligence
Nanotechnology, which may make it possible to integrate tiny machines into our bodies.
Many of those addressing the implications of these issues are extreme technological optimists, and many of them are not literate about the human and cultural implications of these developments. Two terms common in circles addressing these issues are transhumanism and singularity:
This is the term used by some for technologically enhanced humans. The term was coined by techies who do not know that you are supposed to coin terms prefixed by “post.”
Here are some places where these issues are addressed.
From the Web Site: “What does ‘posthuman’ mean? It describes a sentient being that started out as a human or as a mind with a human way of thinking – and then by use of technology changes into someone who is no longer human. Such posthuman beings do not exist currently, therefore any more detailed description of what they would look like or how they would think and behave is pure speculation.”
An excerpt from:
How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics
by N. Katherine Hayles
From the Web Site: “You are alone in the room, except for two computer terminals flickering in the dim light. You use the terminals to communicate with two entities in another room, whom you cannot see. Relying solely on their responses to your questions, you must decide which is the man, which the woman. Or, in another version of the famous “imitation game” proposed by Alan Turing in his classic 1950 paper ‘Computing Machinery and Intelligence,’ you use the responses to decide which is the human, which the machine. One of the entities wants to help you guess correctly. His/her/its best strategy, Turing suggested, may be to answer your questions truthfully. The other entity wants to mislead you. He/she/it will try to reproduce through the words that appear on your terminal the characteristics of the other entity. Your job is to pose questions that can distinguish verbal performance from embodied reality. If you cannot tell the intelligent machine from the intelligent human, your failure proves, Turing argued, that machines can think.”
World Transhumanist Association
“Transhumanism is an interdisciplinary approach to understanding and evaluating the possibilities for overcoming biological limitations through technological progress. Transhumanists seek to expand technological opportunities for people to live longer and healthier lives and to enhance their intellectual, physical, and emotional capacities. The World Transhumanist Association is a nonprofit membership organization which works to promote discussion of the possibilities for radical improvement of human capacities using genetic, cybernetic and nano technologies.”
From the Web Site: “Connecting people to the future so that they can create it, we explore and advocate the use of science and technology for furthering human progress…. Betterhumans doesn’t just cover science and technology. Rather, we explore and advocate the use of science and technology for advancing humanity and continuing human progress. Our philosophy revolves around our goal of helping people understand, anticipate and create the future. We’re dedicated to having the best information, analysis and opinion on the impact of advancing science and technology.”
From the Web Site: “Extropy Institute sees enormous potential in advanced decision making and strategic thinking, coupled with emerging sciences and technologies, to quicken humanity’s resolution of these fundamental problems. We aim to gradually but firmly change the rules of the game called “being human”. In pursuit of our mission, Extropy Institute assembles individuals from diverse domains of expertise. We gather these ambitious, daring minds to combine creative and executive approaches to expose the fundamental roots of our problems. We see this advanced, multi-faceted solution-seeking as the best way to create a radically better future. We need not remain slaves to our cultural and evolutionary history. For centuries, cultures around the world saw human slavery as part of the natural order, until they were shown irrefutably otherwise. Likewise today, many of us passively accept or stridently defend the inevitability of human stupidity, malice, conflict, aging, and death. We invite you to participate in our mission to connect and cultivate the ingenious and intrepid shapers of the future.”
January 1st, 2005 | Filed under: Backgrounds | 1 Comment »
Quantum theory, dealing with the strange behavior of subatomic particles and the role of the observer, came into focus by the late 1920s in the Copenhagen Interpretation. Until recently, it remained in the domain of subatomic particles. In the early 1970s interest in Bell’s Theorem of 1964 began to spread, and in 1982 Alain Aspect produced experimental confirmation that observation of a particle can instantaneously influence a remote particle, a phenomenon called “entanglement.” Even more disturbingly, we can photograph a particle here today, put the photo in a drawer, look at it six months from now, and influence a particle across the universe back at the time of the taking of the photograph. On a quantum level, neither space nor time exists as we have understood them.
Why is quantum theory important to architecture? Most directly, architecture exists in reality, and reality as we understand it today must at least in part be described by quantum theory. In addition, architecture is generated by the structures of consciousness of the people of its culture, and the structures of consciousness of people today, in our quantum culture, have changed from what they were just a few years ago.
There are numerous excellent general introductions to quantum theory. Here are three of them:
Thirty Years that Shook Physics: The Story of Quantum Theory
by George Gamow
An excellent, charming introduction and history by someone who was there. It was written in 1966, so it does not address recent issues, but the basics remain the basics.
Quantum Reality: Beyond the New Physics
by Nick Herbert
An excellent introduction that pauses to explain the science and mathematics behind each of the steps. Its main focus is on the implications of quantum theory for our understanding of reality. It also is a good introduction to Bell’s Theorem.
Entanglement: The Unlikely Story of How Scientists, Mathematicians, and Philosophers Proved Einstein’s Spookiest Theory
by Amir D. Aczel
The focus of this book is Bell’s Theorem and its origin in the EPR Paradox. Not only is it excellent on both, it also provides a comprehensive introduction to quantum theory and brings us up to date on the latest (2003) experimental confirmations of Bell’s Theorem.
Key texts in quantum theory
There are three major mathematical ways to deal with quantum physics: matrices (Heisenberg), waves (Schrödinger), and Hilbert Space (von Neumann). There are numerous texts in quantum theory, but here are key ones for each of these three approaches. You can find discussions of each on Amazon.
Physical Principles of the Quantum Theory
by Werner Heisenberg
Collected Papers on Wave Mechanics
by Erwin Schrödinger
Mathematical Foundations of Quantum Mechanics
by John von Neumann
Although quantum theory was firmly established by the late 1920s and highly developed by the 1950s, many fields, including cosmology, could avoid quantum theory by claiming that it applied only on the micro scale, not the macro scale, and certainly not on the scale of galaxies. In the 1980s, Hawking’s work with black holes established quantum theory as fundamental to cosmology, and more recently string theory and M-brane theory see our universe as an undulating sheet, brushing against other universes.
Programming the Universe: A Quantum Computer Scientist Takes On the Cosmos
by Seth Lloyd
Understanding the universe as information processing.
The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory
by Brian Greene
Superstring theory as the means to bring together relativity and quantum theory. This book forms the basis of a recent Nova television series on superstring theory.
The Illustrated Brief History of Time
Updated and Expanded Edition
By Stephen William Hawking:
A Brief History of Time was one of the most successful examples of popular science writing ever, presenting the current state of thinking in cosmology, including the origins and ends of the universe, black holes, gravity, time travel, etc. There are now several editions, some illustrated. The illustrations are essential for the visually minded.
The Universe in a Nutshell
by Stephen Hawking
Basically an abridged Brief History of Time focusing on recent developments.
The Social Sciences
The social sciences model themselves on the physical sciences, but have only recently begun to absorb Maxwell’s field theories. Recently some social scientists have begun to see human beings and society as quantum phenomena.
Quantum Mind and Social Science
By Alexander Wendt
“This book project explores the implications for social science of thinking about human beings and society as quantum mechanical phenomena. In the past there has been some very limited discussion of this question, but only as an intriguing analogy and thus it had essentially no impact. My suggestion is that man (sic) and society really are quantum phenomena….”
Stuart Hameroff has created a huge site presenting the notion that consciousness is a quantum phenomenon. If you print out all of it, including its thirteen-part lecture series, you will have approximately three inches of paper.
“Perhaps the most perplexing problem in science, the nature of consciousness reflects on our very existence and relation to reality. Most approaches to the problem of consciousness see the brain as a computer, with neurons and synapses acting as switches or “bits”. In this view consciousness is thought to “emerge” as a novel property of complex computation. However this approach fails to adequately deal with enigmatic features of consciousness and more radical approaches may be necessary…”
A quantum computer could theoretically be more powerful than the entire universe would be if every particle in it were a computer. David Deutsch contends that the only possible explanation for this is that quantum computers harness the power of their infinite siblings in infinite parallel universes.
“The discovery that quantum physics allows fundamentally new modes of information processing has required the existing theories of computation, information and cryptography to be superseded by their quantum generalizations. The Centre for Quantum Computation conducts theoretical and experimental research into all aspects of quantum information processing, and into the implications of the quantum theory of computation for physics itself.”
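One concrete taste of these “fundamentally new modes of information processing” is Deutsch’s own 1985 algorithm: deciding whether a one-bit function f is constant or balanced with a single query to f, where a classical computer needs two. The sketch below is my own classical simulation of the two-qubit state vector (function names and structure are my illustration, not the Centre’s code); it cannot, of course, deliver a quantum speedup, but it shows the interference that makes the algorithm work.

```python
import math

S = 1 / math.sqrt(2)  # Hadamard amplitude, 1/sqrt(2)

def hadamard(state, qubit, n):
    """Apply a Hadamard gate to `qubit` (0 = most significant) of an
    n-qubit state vector stored as a flat list of amplitudes."""
    mask = 1 << (n - 1 - qubit)
    new = [0.0] * len(state)
    for i, amp in enumerate(state):
        i0, i1 = i & ~mask, i | mask          # basis states with bit 0 / bit 1
        new[i0] += S * amp                    # |0> -> (|0>+|1>)/sqrt(2)
        new[i1] += (-S if i & mask else S) * amp  # |1> -> (|0>-|1>)/sqrt(2)
    return new

def deutsch(f):
    """Deutsch's algorithm: one oracle call decides whether
    f: {0,1} -> {0,1} is constant or balanced."""
    state = [0.0] * 4
    state[0b01] = 1.0                # start in |0>|1>
    state = hadamard(state, 0, 2)
    state = hadamard(state, 1, 2)
    # Oracle U_f: |x>|y> -> |x>|y XOR f(x)>
    oracled = [0.0] * 4
    for i, amp in enumerate(state):
        x, y = i >> 1, i & 1
        oracled[(x << 1) | (y ^ f(x))] += amp
    state = hadamard(oracled, 0, 2)
    # The first qubit measures 0 with certainty iff f is constant
    prob0 = state[0] ** 2 + state[1] ** 2
    return "constant" if prob0 > 0.5 else "balanced"

print(deutsch(lambda x: 0))  # constant
print(deutsch(lambda x: x))  # balanced
```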
Quest for the Quantum Computer
by Julian Brown
From Amazon: “Just how smart can computers get? Science journalist Julian Brown takes a hard look at the spooky world of quantum computation in Minds, Machines, and the Multiverse—and his report is optimistic. Based in large part on the groundbreaking work of David Deutsch…”
David Deutsch is a pioneer of quantum computing.
Quantum theory, along with information theory, relativity, DNA, materials sciences, nanotechnology, etc. brings us a radically new reality.
The Fabric of Reality: The Science of Parallel Universes-And Its Implications
by David Deutsch
This just might be the single most important book of the 21st century (although it was published in 1997). In it Deutsch, a pioneer in quantum computing at Oxford University, presents a fundamentally new view of reality that takes seriously four fundamental ideas of science that are fully accepted, but whose implications are widely ignored. These are: quantum theory, evolution, computation, and the theory of knowledge.
“It’s a Much Bigger Thing Than It Looks”
A Talk with David Deutsch on Edge.org
“However useful the theory [of quantum computation] as such is today and however spectacular the practical applications may be in the distant future, the really important thing is the philosophical implications — epistemological and metaphysical — and the implications for theoretical physics itself. One of the most important implications from my point of view is one that we get before we even build the first qubit [quantum bit]. The very structure of the theory already forces upon us a view of physical reality as a multiverse. Whether you call this the multiverse or ‘parallel universes’ or ‘parallel histories’, or ‘many histories’, or ‘many minds’ — there are now half a dozen or more variants of this idea — what the theory of quantum computation does is force us to revise our explanatory theories of the world, to recognize that it is a much bigger thing than it looks. I’m trying to say this in a way that is independent of ‘interpretation’: it’s a much bigger thing than it looks.”
“Quantum Constructor Theory”
A Talk with David Deutsch on Edge.org
“We build computers and skyscrapers and space ships, and we clone animals, and so on. At root you can regard all of these too as computations, … a quantum constructor theory is needed.
EDGE: Why specifically a quantum constructor theory?
DEUTSCH: Because quantum theory is our basic theory of the physical world. All construction is quantum construction.”
A Talk with Neil Gershenfeld on Edge.org
â€œWe’ve already had a digital revolution; we don’t need to keep having it. The next big thing in computers will be literally outside the box, as we bring the programmability of the digital world to the rest of the world. With the benefit of hindsight, there’s a tremendous historical parallel between the transition from mainframes to PCs and now from machine tools to personal fabrication. By personal fabrication I mean not just making mechanical structures, but fully functioning systems including sensing, logic, actuation, and displays.â€?
January 1st, 2005 | Filed under: Backgrounds | No Comments »
A few years ago discussions of information theory usually started with Claude Shannon:
“Claude Elwood Shannon (April 30, 1916 – February 24, 2001) has been called “the father of information theory”…. He innovated the concept of implementing Boolean algebra with electronic relays and switches in his 1937 MIT master’s thesis, A Symbolic Analysis of Relay and Switching Circuits, and, with it, essentially founded practical digital circuit design. Professor Howard Gardner, of Harvard University, would praise it as “possibly the most important, and also the most famous, master’s thesis of the century”, and in 1940 the thesis earned its author the Alfred Noble American Institute of American Engineers Award. After working at Cold Spring Harbor under a geneticist, Shannon worked on his PhD in 1940 at MIT. His PhD thesis is titled “An Algebra for Theoretical Genetics.”… In 1948 he published A Mathematical Theory of Communication (ISBN 0252725484). This work focuses on the problem of how to regain at a target point the information a sender has transmitted. Shannon developed information entropy as a measure for redundancy. His later book with Warren Weaver, The Mathematical Theory of Communication, Univ of Illinois Press, is brief and surprisingly accessible to the non-specialist….”
For the actual paper, “A Mathematical Theory of Communication” by Claude E. Shannon, go to:
The key ideas that Shannon introduces include:
1. Information can be measured
2. Information is related to entropy
3. The conveying of information involves a source, a channel, and a receiver
4. Noise is fundamental to the transmission of information
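The first two ideas, that information can be measured and that the measure is related to entropy, have a compact formula: Shannon’s entropy H = −Σ pᵢ log₂ pᵢ, in bits per symbol. Here is a minimal sketch (my own illustration; the function name is an invention, not Shannon’s notation):

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Shannon entropy of a message in bits per symbol:
    H = -sum(p_i * log2(p_i)) over the symbol frequencies p_i."""
    counts = Counter(message)
    total = len(message)
    return sum(-(n / total) * math.log2(n / total) for n in counts.values())

# A repetitive message carries less information per symbol than a varied one:
print(shannon_entropy("aaaa"))  # 0.0  (no uncertainty, no information)
print(shannon_entropy("abab"))  # 1.0  (one bit per symbol)
print(shannon_entropy("abcd"))  # 2.0  (two bits per symbol)
```

The entropy is highest when all symbols are equally likely, which is why redundancy (idea 2) and compressibility go hand in hand.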
Today, one might more likely turn first to Wikipedia, and find that:
From Wikipedia: “Information is a term with many meanings depending on context, but is as a rule closely related to such concepts as meaning, knowledge, instruction, communication, representation, and mental stimulus.
“Although many people speak of the advent of the ‘information age,’ the ‘information society,’ and information technologies, and even though information science and computer science are often in the spotlight, the word “information” is often used without careful consideration of the various meanings it has come to acquire.
“The following is a list of the most important meanings, roughly in order of narrowest to broadest.
1. Information as a message
2. Information as a pattern
3. Information as sensory input
4. Information as an influence which leads to a transformation”
In other words, we now understand Information to cover a very wide range of concepts. Perhaps most important in understanding information is context. Suppose you wander into the wrong classroom and the instructor is talking about Hilbert Space and putting equations on the blackboard, and you know nothing about advanced mathematics. Your first thought might be that no information is conveyed. But actually that is not correct. You now know that the instructor is alive, that they speak English, and that you are probably in a math class. All of which is already a lot of information, and most of which has to do with context.
Getting Started with Information Theory
So how do we get up to speed on information theory? I would work through the following:
1. Written in 1966, Singh’s Great Ideas in Information Theory, Language and Cybernetics is out of date, but still an excellent introduction to the field.
2. Next I would read chapter 2 of Brown’s The Quest for a Quantum Computer for recent developments in information theory. If you want to go deeper, read the entire book. It is one of the most important overviews of contemporary information theory you will find.
3. Then read David Deutsch's The Fabric of Reality. It is one of the most important books of our time.
4. Then I would play around. Perhaps re-read Douglas R. Hofstadter's Gödel, Escher, Bach; Robert Wright's Three Scientists and Their Gods, a lively introduction to the ideas of Edward Fredkin, Edward O. Wilson, and Kenneth Boulding; Wheeler's Geons, Black Holes, and Quantum Foam; and B. Roy Frieden's Physics from Fisher Information (you can get the introduction on Amazon; the rest requires graduate math, so skip it unless you can handle it, and also peruse the criticism of Frieden on the Web). And of course Claude E. Shannon's 1948 paper, which created the field of information theory.
The Universe As Information
Much of physics and cosmology now treats information as ranking with matter and energy as a fundamental property of the universe. With this ranking comes the notion that information can be transformed (including to and from matter and energy) but cannot be destroyed. So what happens if you toss an encyclopedia into a black hole? Does the information in the encyclopedia get destroyed? In dealing with this question, Hawking and others have come up with the notion that the information content of a black hole is determined by its surface area, more specifically by the number of Planck areas (the smallest possible area in our quantum granular world) on the surface of the black hole.
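To get a feel for this area-based accounting, here is a back-of-the-envelope sketch (my own illustration, using the standard Bekenstein-Hawking formula; the constant values and helper name are not from any source quoted here) of how many bits a solar-mass black hole's horizon could hold:

```python
import math

# Physical constants (SI units, approximate)
G    = 6.674e-11    # gravitational constant
c    = 2.998e8      # speed of light
hbar = 1.055e-34    # reduced Planck constant

def black_hole_info_bits(mass_kg):
    """Bekenstein-Hawking information capacity of a black hole, in bits.

    Entropy (in nats) is one quarter of the horizon area measured in
    Planck areas; dividing by ln 2 converts nats to bits.
    """
    r = 2 * G * mass_kg / c**2        # Schwarzschild radius
    area = 4 * math.pi * r**2         # horizon area
    planck_area = hbar * G / c**3     # (Planck length) squared
    return area / (4 * planck_area * math.log(2))

bits = black_hole_info_bits(1.989e30)  # one solar mass
print(f"{bits:.2e}")                   # about 1.5e77 bits
```

An encyclopedia's worth of bits is utterly negligible against that number, which is part of why the question of whether those bits survive is so delicate.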
This leads to speculation that the universe is constituted of 2D membranes (i.e. information), and that our 3D world is a holographic projection from such a membrane.
If information is the fundamental constituent of reality, there might be implications for how we understand every aspect of reality, from physics to architecture.
Here are some approaches to the universe as information:
Programming the Universe: A Quantum Computer Scientist Takes On the Cosmos
by Seth Lloyd
Understanding the universe as information processing.
Fredkin was an early pioneer of digital physics. His main contributions include his work on reversible computation (which offers a new solution to the Maxwell's Demon paradox and theoretically uses no energy) and cellular automata. Fredkin maintains that the universe is a computer. Not that the universe can be better understood through the metaphor of computing, but that it is a computer. You can read more on his web site:
"Digital Philosophy (DP) is a new way of thinking about the fundamental workings of processes in nature. DP is an atomic theory carried to a logical extreme where all quantities in nature are finite and discrete. This means that, theoretically, any quantity can be represented exactly by an integer. Further, DP implies that nature harbors no infinities, infinitesimals, continuities, or locally determined random variables. This paper explores Digital Philosophy by examining the consequences of these premises…."
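Fredkin's reversible computation can be illustrated with his best-known construct, the Fredkin gate (a controlled swap). This sketch (my own, not from Fredkin's site) shows its two key properties: it loses no information, since applying it twice restores any input, and it is computationally universal, since with a constant input it computes AND:

```python
def fredkin(control, a, b):
    """Fredkin (controlled-swap) gate: swap a and b when control is 1."""
    if control:
        a, b = b, a
    return control, a, b

# Reversible: applying the gate twice restores every possible input.
for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*bits)) == bits

# Universal: with the third input fixed to 0, the third output is x AND y.
print(fredkin(1, 1, 0)[2])  # 1
print(fredkin(1, 0, 0)[2])  # 0
```

Because no input pattern is ever mapped to the same output as another, no information is erased, which is what lets the gate, in principle, dissipate no energy.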
Physics From Fisher Information
B. Roy Frieden was working on enhancing satellite photos when he began to wonder: what is the theoretical limit of the information that can be extracted from a fuzzy photo? That led him to Fisher information, a branch of statistical theory, and then to the notion that all of physics could be redone from Fisher information.
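To get a feel for what Fisher information measures, here is a small sketch (my own illustration, not Frieden's) that estimates it numerically as the variance of the score function, for a Gaussian whose mean we are trying to pin down; the sharper the distribution, the more information each observation carries about the mean:

```python
import random

def fisher_information_gaussian_mean(mu, sigma, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the Fisher information about the mean mu.

    Fisher information is the variance of the score, d/dmu log p(x; mu).
    For a Gaussian N(mu, sigma^2) the score is (x - mu) / sigma^2,
    so the analytic answer is 1 / sigma^2.
    """
    rng = random.Random(seed)
    scores = [(rng.gauss(mu, sigma) - mu) / sigma**2 for _ in range(n_samples)]
    mean = sum(scores) / n_samples
    return sum((s - mean) ** 2 for s in scores) / n_samples

est = fisher_information_gaussian_mean(0.0, 2.0)
print(est)  # close to the analytic value 1 / 2.0**2 = 0.25
```

This inverse dependence on the spread is exactly the intuition behind Frieden's starting question: a fuzzier photo (larger sigma) yields less extractable information per measurement.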
Physics from Fisher Information: A Unification
by B. Roy Frieden
From Amazon: "This book defines and develops a unifying principle of physics, that of 'extreme physical information.' Fisher information is a simple concept little known to physicists. The book develops statistical and physical properties of Fisher information. This information is a physical measure of disorder, sharing with entropy the property of monotonic change with time. The information concept is applied 'phenomenally' to derive most known physics, from statistical mechanics and thermodynamics to quantum mechanics, the Einstein field equations, and quantum gravity…."
Reviews of Physics from Fisher Information: A Unification by Frieden:
"…This is a compilation of Roy Frieden's work in major physics journals over the last decade deriving the basic laws of physics – relativistic quantum mechanics, electromagnetism, gravitation, statistical thermodynamics – from a quantity (used by mathematical statisticians and by hardly anyone else) called Fisher Information. He derives the Klein-Gordon equation, Schroedinger wave equations, Maxwell's equations, DeWitt-Wheeler law of quantum gravity, and various statistical thermodynamics laws…."
A negative review: http://cscs.umich.edu/~crshalizi/reviews/physics-from-fisher-info/
In brief, Stephen Wolfram contends that Newton made a mistake when he sought to understand the universe through mathematics, that is, through numbers. Instead, Wolfram argues, we should understand the universe through rules like those in computer programs, particularly cellular automata programs.
For more on Wolfram, go to: http://www.stephenwolfram.com/
A New Kind of Science
by Stephen Wolfram
From Amazon, from Library Journal: "Galileo proclaimed that nature is written in the language of mathematics, but Wolfram would argue that it is written in the language of programs and, remarkably, simple ones at that. A scientific prodigy who earned a doctorate from Caltech at age 20, Wolfram became a Nobel-caliber researcher in the emerging field of complexity shortly thereafter only to abscond from academe and establish his own software company (which published this book). In secrecy, for over ten years, he experimented with computer graphics called cellular automata, which produce shaded images on grid patterns according to programmatic rules (973 images are reproduced here). Wolfram went on to discover that the same vastly complex images could be produced by even very simple sets of rules and argues here that dynamic and complex systems throughout nature are triggered by simple programs. Mathematical science can describe and in some cases predict phenomena but cannot truly explain why what happens happens. Underscoring his point that simplicity begets complexity, Wolfram wrote this book in mostly nontechnical language. Any informed, motivated reader can, with some effort, follow from chapter to chapter, but the work as a whole and its implications are probably understood fully by the author alone. Had this been written by a lesser scientist, many academics might have dismissed it as the work of a crank. Given its source, though, it will merit discussion for years to come. Essential for all academic libraries. [This tome is a surprise best seller on Amazon. Ed.] – Gregg Sapp, Science Lib., SUNY at Albany. Copyright 2002 Cahners Business Information, Inc."
Some negative comments on Wolfram's book have surfaced, claiming that much of what he says is not new, and that he does not adequately credit others. The bottom line is that you should read the book. It is a fantastic education in all of contemporary science, physics, computation, and information theory, and it introduces numerous new ideas.
More on Cellular Automata
"Cellular automata are discrete dynamical systems whose behaviour is completely specified in terms of a local relation. A cellular automaton can be thought of as a stylised universe. Space is represented by a uniform grid, with each cell containing a few bits of data; time advances in discrete steps and the laws of the 'universe' are expressed in, say, a small look-up table, through which at each step each cell computes its new state from that of its close neighbours. Thus, the system's laws are local and uniform…. The first cellular automaton was conceived by Von Neumann in the late forties…."
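That look-up-table description translates almost directly into code. Here is a minimal sketch (my own, assuming Wolfram's rule-numbering convention and a wrap-around grid) of a one-dimensional elementary cellular automaton:

```python
def step(cells, rule=110):
    """One update of an elementary cellular automaton.

    cells: list of 0/1 values; the grid wraps around (periodic boundary).
    rule:  Wolfram rule number 0-255, unpacked into an 8-entry look-up table
           indexed by the (left, self, right) neighbourhood.
    """
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

# Run rule 110 from a single live cell and print a few generations.
cells = [0] * 31
cells[15] = 1
for _ in range(8):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Every cell consults only its immediate neighbours and the same fixed table, which is exactly the "local and uniform" law the quotation describes; rule 110 is the famous example whose output is complex enough to be computationally universal.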
Black Holes and Information
"In 1997, the three cosmologists made a famous bet as to whether information that enters a black hole ceases to exist — that is, whether the interior of a black hole is changed at all by the characteristics of particles that enter it. Hawking's research suggested that the particles have no effect whatsoever. But his theory violated the laws of quantum mechanics and created a contradiction known as the 'information paradox.'"
See Stephen Hawking's web site at http://www.hawking.org.uk/home/hindex.html
For the latest, see:
NewScientist.com, July 14, 2004
â€œAfter nearly 30 years of arguing that a black hole destroys everything that falls into it, Stephen Hawking is saying he was wrong. It seems that black holes may after all allow information within them to escape.
It might solve one of the long-standing puzzles in modern physics, known as the black hole information paradox. In 1976, he calculated that once a black hole forms, it starts losing mass by radiating energy. This 'Hawking radiation' contains no information about the matter inside the black hole, and once the black hole evaporates, all information is lost.
But this conflicts with the laws of quantum physics, which say that such information can never be completely wiped out. Hawking's argument was that the intense gravitational fields of black holes somehow unravel the laws of quantum physics…."
For full article:
The Truth Is Still Out There
In an op-ed piece in The New York Times on August 3, 2004, Paul Ginsparg, professor of physics and information science at Cornell University, described the background of the issues:
"… Near the end of a small meeting I attended in 1993, the question of 'What happens to information that falls into a black hole?' arose, and a democratic method was chosen to address it. The vote proceeded more or less along party lines, with the general relativists firm in their adherence to causality, and the quantum field theorists equally adamant in their faith in unitarity. Of the 77 participants, 25 voted for the category 'It's lost,' and 39, a slight majority, voted for 'It comes out' (that it re-emerges). Seven voted that the black hole would not evaporate entirely, and the remaining six voted for an unspecified 'Something else.' …"