A New Kind of Science

A New Kind of Science is a book by Stephen Wolfram, published in 2002. It contains an empirical and systematic study of computational systems such as cellular automata. Wolfram calls these systems simple programs and argues that the scientific philosophy and methods appropriate for the study of simple programs are relevant to other fields of science.

Computation and its implications

The thesis of A New Kind of Science is twofold: that the nature of computation must be explored experimentally, and that the results of these experiments have great relevance to understanding the natural world, which is assumed to be digital. Since its crystallization in the 1930s, computation has been primarily approached from two traditions: engineering, which seeks to build practical systems using computations; and mathematics, which seeks to prove theorems about computation (although already in the 1970s computing as a discipline was described as lying at the intersection of mathematical, engineering, and empirical/scientific traditions).

Wolfram describes himself as introducing a third major tradition: the systematic, empirical investigation of computational systems for their own sake. This is where the "New" and "Science" parts of the book's title originate. In proceeding with a scientific investigation of computational systems, however, Wolfram eventually concluded that an entirely new method was needed, because in his view traditional mathematics fails to describe the complexity seen in these systems meaningfully. Through a combination of experiment and theoretical positioning, the book introduces a method that Wolfram argues is the most realistic way to make scientific progress with computational systems. This is the sense in which A New Kind of Science is a "kind" of science, and it is why Wolfram holds that its principles are potentially applicable in a wide range of fields.

Simple programs

The basic subject of Wolfram's "new kind of science" is the study of simple abstract rules—essentially, elementary computer programs. In almost any class of computational system, one very quickly finds instances of great complexity among its simplest cases. This seems to be true regardless of the components of the system and the details of its setup. Systems explored in the book include cellular automata in one, two, and three dimensions; mobile automata; Turing machines in one and two dimensions; several varieties of substitution and network systems; primitive recursive functions; nested recursive functions; combinators; tag systems; register machines; reversal-addition; and a number of other systems. For a program to qualify as simple, there are several benchmarks:
  1. Its operation can be completely explained by a simple graphical illustration.
  2. It can be completely explained in a few sentences of human language.
  3. It can be implemented in a computer language using just a few lines of code (see the sketch after this list).
  4. The number of its possible variations is small enough so that all of them can be computed.
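
As a rough illustration of these criteria, the following sketch (written for this article, not taken from the book) implements an elementary one-dimensional cellular automaton in a few lines of Python. The choice of rule 30 as the example, the lattice width, the number of steps, and the periodic boundary are all just illustrative parameters; the point is only that such a program is short and that its whole space of 256 rule variations can be enumerated.

```python
# Minimal sketch of an elementary (one-dimensional, two-color) cellular automaton.
# The rule number's 8 bits give the new color for each of the 8 possible
# (left, center, right) neighborhoods, so there are exactly 2**8 = 256 such rules.

def step(cells, rule):
    """Apply one update of the given elementary CA rule to a row of 0/1 cells."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Evolve rule 30 from a single black cell and print the resulting pattern.
row = [0] * 31
row[15] = 1
for _ in range(16):
    print("".join(".#"[c] for c in row))
    row = step(row, 30)
```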


Generally, simple programs tend to have a very simple abstract framework. Simple cellular automata, Turing machines, and combinators are examples of such frameworks, while more complex cellular automata do not necessarily qualify as simple programs. It is also possible to invent new frameworks, particularly to capture the operation of natural systems. The remarkable feature of simple programs is that a significant percentage of them are capable of producing great complexity. Simply enumerating all possible variations of almost any class of programs quickly leads one to examples that do unexpected and interesting things. This leads to the question: if the program is so simple, where does the complexity come from? In a sense, there is not enough room in the program's definition to directly encode all the things the program can do. Therefore, simple programs can be seen as a minimal example of emergence. A logical deduction from this phenomenon is that if the details of the program's rules have little direct relationship to its behavior, then it is very difficult to directly engineer a simple program to perform a specific behavior. An alternative approach is to try to engineer a simple overall computational framework, and then do a brute-force search through all of the possible components for the best match.
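
A minimal sketch of such a brute-force search is given below, under the assumption that the framework being searched is the space of 256 elementary cellular automaton rules; the selection criterion used here is invented purely for the example and stands in for whatever "best match" test one actually cares about.

```python
# Hedged sketch of a brute-force search over a space of simple programs:
# enumerate every elementary CA rule and test each one against a target property.
# The property (the pattern grows to cover at least half the row) is an
# arbitrary stand-in for the behavior one is trying to engineer.

def step(cells, rule):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def evolve(rule, width=63, steps=60):
    row = [0] * width
    row[width // 2] = 1
    for _ in range(steps):
        row = step(row, rule)
    return row

matches = [rule for rule in range(256) if sum(evolve(rule)) >= 32]
print(len(matches), "of 256 rules satisfy the example criterion:", matches[:10], "...")
```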

Simple programs are capable of a remarkable range of behavior. Some have been proven to be universal computers. Others exhibit properties familiar from traditional science, such as thermodynamic behavior, continuum behavior, conserved quantities, percolation, sensitive dependence on initial conditions, and others. They have been used as models of traffic, material fracture, crystal growth, biological growth, and various sociological, geological, and ecological phenomena. Another feature of simple programs is that making them more complicated seems to have little effect on their overall complexity. A New Kind of Science argues that this is evidence that simple programs are enough to capture the essence of almost any complex system.

Mapping and mining the computational universe

In order to study simple rules and their often complex behavior, Wolfram believes it is necessary to systematically explore all of these computational systems and document what they do. He believes this study should become a new branch of science, like physics or chemistry. The basic goal of this field is to understand and characterize the computational universe using experimental methods.

The proposed new branch of scientific exploration admits many different forms of scientific production. For instance, qualitative classifications like those found in biology are often the results of initial forays into the computational jungle. On the other hand, explicit proofs that certain systems compute this or that function are also admissible. There are also some forms of production that are in some ways unique to this field of study, for instance the discovery of computational mechanisms that emerge in different systems but in bizarrely different forms.

Another kind of production involves the creation of programs for the analysis of computational systems—for in the NKS framework, these themselves should be simple programs, and subject to the same goals and methodology. An extension of this idea is that the human mind is itself a computational system, and hence providing it with raw data in as effective a way as possible is crucial to research. Wolfram believes that programs and their analysis should be visualized as directly as possible, and exhaustively examined by the thousands or more. Since this new field concerns abstract rules, it can in principle address issues relevant to other fields of science. However, in general Wolfram's idea is that novel ideas and mechanisms can be discovered in the computational universe—where they can be witnessed in their clearest forms—and then other fields can pick and choose among these discoveries for those they find relevant.
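
A hedged sketch of this working style (again using elementary cellular automata as a stand-in, with an arbitrary choice of rules, width, and step count) is to render the space-time evolution of each rule directly as a picture and page through many rules at once; here a handful of rules are drawn as text, but the same loop could cover thousands.

```python
# Illustrative sketch: visualize several elementary CA rules as text "pictures",
# the kind of direct, exhaustive visual inspection NKS advocates.

def step(cells, rule):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def picture(rule, width=41, steps=20):
    row = [0] * width
    row[width // 2] = 1
    lines = []
    for _ in range(steps):
        lines.append("".join(" #"[c] for c in row))
        row = step(row, rule)
    return "\n".join(lines)

for rule in (30, 90, 110, 254):   # any subset of the 256 rules could go here
    print(f"rule {rule}")
    print(picture(rule))
    print()
```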

Systematic abstract science

While Wolfram promotes simple programs as a scientific discipline, he also insists that its methodology will revolutionize essentially every field of science. The basis for his claim is that the study of simple programs is the most minimal possible form of science, which is equally grounded in both abstraction and empirical experimentation. Every aspect of the methodology advocated in NKS is optimized to make experimentation as direct, easy, and meaningful as possible—while maximizing the chances that the experiment will do something unexpected. Just as NKS allows computational mechanisms to be studied in their cleanest forms, Wolfram believes the process of doing NKS captures the essence of the process of doing science—and allows that process's strengths and shortcomings to be directly revealed.

Wolfram believes that the computational realities of the universe make science hard for fundamental reasons. But he also argues that by understanding the importance of these realities, we can learn to leverage them in our favor. For instance, instead of reverse engineering our theories from observation, we can simply enumerate systems and then try to match them to the behaviors we observe. A major theme of NKS-style research is investigating the structure of the possibility space. Wolfram feels that science is far too ad hoc, in part because the models used are too complicated and/or unnecessarily organized around the limited primitives of traditional mathematics. Wolfram advocates using models whose variations are enumerable and whose consequences are straightforward to compute and analyze.

Philosophical underpinnings

Wolfram believes that one of his achievements is not just exclaiming "computation is important!", but providing a coherent system of ideas that justifies computation as an organizing principle of science. For instance, Wolfram's concept of computational irreducibility—that some complex computations cannot be short-cut or "reduced"—is ultimately the reason why computational models of nature must be considered in addition to traditional mathematical models. Likewise, his idea of intrinsic randomness generation—that natural systems can generate their own randomness, rather than relying on chaos theory or stochastic perturbations—implies that explicit computational models may in some cases provide more accurate and richer descriptions of random-looking systems.

Based on his experimental results, Wolfram has developed the Principle of Computational Equivalence (see below), which asserts that almost all processes that are not obviously simple are of equivalent sophistication. From this seemingly vague single principle Wolfram draws a broad array of concrete deductions that reinforce many aspects of his theory. Possibly the most important among these is an explanation as to why we experience randomness and complexity: often, the systems we analyze are just as sophisticated as we are. Thus, complexity is not a special quality of systems, like for instance the concept of "heat", but simply a label for all systems whose computations are sophisticated. Understanding this makes the "normal science" of the NKS paradigm possible.

At the deepest level, Wolfram believes that like many of the most important scientific ideas, the Principle allows science to be more general by pointing out new ways in which humans are not special. In recent times, it has been thought that the complexity of human intelligence makes us special—but the Principle asserts otherwise. In a sense, many of Wolfram's ideas are based on understanding the scientific process—including the human mind—as operating within the same universe it studies, rather than somehow being outside it.

Principle of computational equivalence

The principle states that systems found in the natural world can perform computations up to a maximal ("universal") level of computational power, and that most systems can attain this level. Such systems can, in principle, compute the same things as a computer; computation is therefore simply a question of translating inputs and outputs from one system to another. Consequently, most systems are computationally equivalent. Proposed examples of such systems are the workings of the human brain and the evolution of weather systems.

Applications and results

There are a vast number of specific results and ideas in the NKS book, and they can be organized into several themes. One common theme of examples and applications is demonstrating how little it takes to achieve interesting behavior, and how the proper methodology can discover these cases.

First, there are perhaps several dozen cases where the NKS book introduces the simplest known system in some class that has a particular characteristic. Some examples include the first primitive recursive function that results in complexity, the smallest universal Turing machine, and the shortest axiom for propositional calculus. In a similar vein, Wolfram also demonstrates a large number of minimal examples of how simple programs exhibit phenomena familiar from traditional science, such as phase transitions, conserved quantities, continuum behavior, and thermodynamics. Simple computational models of natural systems like shell growth, fluid turbulence, and phyllotaxis are a final category of applications that fall in this theme.

Another common theme is taking facts about the computational universe as a whole and using them to reason about fields in a holistic way. For instance, Wolfram discusses how facts about the computational universe inform evolutionary theory, SETI, free will, computational complexity theory, and philosophical fields like ontology, epistemology, and even postmodernism.

Wolfram suggests that the theory of computational irreducibility may provide a resolution to the existence of free will in a nominally deterministic universe. He posits that the computational process in the brain of the being with free will is actually complex enough so that it cannot be captured in a simpler computation, due to the principle of computational irreducibility. Thus, while the process is indeed deterministic, there is no better way to determine the being's will than to essentially run the experiment and let the being exercise it.

The book also contains a vast number of individual results—both experimental and analytic—about what a particular automaton computes, or what its characteristics are, using some methods of analysis.

One specific new technical result in the book is a proof of the Turing completeness of the Rule 110 cellular automaton. Rule 110 can be simulated by very small Turing machines, and such a 2-state, 5-symbol universal Turing machine is given. Wolfram also conjectures that a particular 2-state, 3-symbol Turing machine is universal. In 2007, as part of commemorating the fifth anniversary of the book, a $25,000 prize was offered for a proof of the (2, 3) machine's universality.
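
For readers unfamiliar with the notation, an s-state, k-symbol Turing machine is specified by a table mapping each (state, symbol) pair to a (symbol to write, head move, next state) triple. The sketch below is a generic simulator; the example table is the standard 2-state, 2-symbol "busy beaver" machine, used here only as a stand-in to show what such a table looks like, and is not the 2-state, 5-symbol or conjectured 2-state, 3-symbol machine described in the book.

```python
# Hedged sketch of a Turing-machine simulator. The rule table below is the
# well-known 2-state, 2-symbol "busy beaver": (state, symbol read) maps to
# (symbol to write, head move, next state). It is NOT one of the machines
# from the NKS book; it is only an illustration of the format.
from collections import defaultdict

RULES = {
    ("A", 0): (1, +1, "B"),
    ("A", 1): (1, -1, "B"),
    ("B", 0): (1, -1, "A"),
    ("B", 1): (1, +1, "HALT"),
}

def run(rules, max_steps=100):
    tape = defaultdict(int)          # unbounded tape, blank symbol 0
    state, head, steps = "A", 0, 0
    while state != "HALT" and steps < max_steps:
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
        steps += 1
    return state, steps, sum(tape.values())

state, steps, ones = run(RULES)
print(f"state={state} after {steps} steps, {ones} ones on the tape")
```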

NKS Summer School

Every year, Wolfram and his group of instructors organize a summer school. The first four summer schools, from 2003 to 2006, were held at Brown University. The summer school was later hosted by the University of Vermont in Burlington, with the exception of 2009, when it was held at the Istituto di Scienza e Tecnologie dell’Informazione of the CNR in Pisa, Italy. After seven consecutive summer schools, more than 200 people have participated, some of whom continued developing their three-week research projects as their master's or Ph.D. theses. Some of the research done in the summer school has yielded important published results.

Reception

A New Kind of Science received extensive media publicity for a scientific book, generating scores of articles in such publications as The New York Times, Newsweek, Wired, and The Economist. It was a best-seller and won numerous awards. NKS was reviewed in a large range of scientific journals, and several themes emerged in these reviews. Many reviewers enjoyed the quality of the book's production and the clear way Wolfram presented many ideas. Even those reviewers who engaged in other criticisms found aspects of the book to be interesting and thought-provoking. On the other hand, many reviewers criticized Wolfram for his lack of modesty, poor editing, lack of mathematical rigor, and the lack of immediate utility of his ideas. Concerning the ultimate importance of the book, a common attitude was that of either skepticism or "wait and see". Many reviewers and the media focused on the use of simple programs (cellular automata in particular) to model nature, rather than the more fundamental idea of systematically exploring the universe of simple programs.

Scientific philosophy

A key tenet of NKS is that the simpler the system, the more likely a version of it will recur in a wide variety of more complicated contexts. Therefore, NKS argues that systematically exploring the space of simple programs will lead to a base of reusable knowledge. However, many scientists believe that of all possible parameters, only some actually occur in the universe; that, for instance, of all possible variations of an equation, most will be essentially meaningless. NKS has also been criticized for asserting that the behavior of simple systems is somehow representative of all systems.

Methodology

A common criticism of NKS is that it does not follow established scientific methodology. NKS does not establish rigorous mathematical definitions, nor does it attempt to prove theorems. Along these lines, NKS has also been criticized for being heavily visual, with much information conveyed by pictures that do not have formal meaning. It has also been criticized for not using modern research in the field of complexity, particularly the works that have studied complexity from a rigorous mathematical perspective.

Critics also note that none of the book's contents were published in peer-reviewed journals, the standard method for distributing new results, and complain that it insufficiently credits the other scientists whose work it builds on. Wolfram relegates all discussion of other people to his lengthy endnotes, and thus no one is directly credited in the text. His critics argue that even the endnotes are misleading, glossing over many relevant discoveries and thus making Wolfram's work seem more novel than it is.

Utility

NKS has been criticized for not providing specific results that would be immediately applicable to ongoing scientific research. There has also been criticism, implicit and explicit, that the study of simple programs has little connection to the physical universe, and hence is of limited value. Steven Weinberg has pointed out that no real-world system has been explained using Wolfram's methods in a satisfactory fashion.

Principle of computational equivalence

The PCE has been criticized for being vague and unmathematical and for not making directly verifiable predictions; Wolfram's group has responded that it is put forward as a principle, not as a law, theorem, or formula. The principle has also been criticized for being contrary to the spirit of research in mathematical logic and computational complexity theory, which seek to make fine-grained distinctions between levels of computational sophistication, and others suggest it is little more than a rechristening of the Church–Turing thesis. However, whereas the Church–Turing thesis imposes an upper limit on computational power, Wolfram's PCE suggests the nonexistence of intermediate degrees of computation: a computational system is sent either to the upper (universal) level or to the lowest degree. Klaus Sutner has framed this in terms of physics-like computation as a zero-one law, arguing that in practice constructing actual computers with intermediate degrees is highly artificial and has never been done, thereby endorsing the intuition captured in Wolfram's PCE.

The fundamental theory (NKS Chapter 9)

Wolfram's speculations about a direction towards a fundamental theory of physics have been criticized as vague and obsolete. Scott Aaronson, Assistant Professor of Electrical Engineering and Computer Science at MIT, also claims that Wolfram's methods cannot be compatible with both special relativity and Bell's theorem violations, and hence conflict with the observed results of Bell test experiments.

In a 2002 review of NKS, the Nobel laureate and elementary particle physicist Steven Weinberg wrote, "Wolfram himself is a lapsed elementary particle physicist, and I suppose he can't resist trying to apply his experience with digital computer programs to the laws of nature. This has led him to the view (also considered in a 1981 paper by Richard Feynman) that nature is discrete rather than continuous. He suggests that space consists of a set of isolated points, like cells in a cellular automaton, and that even time flows in discrete steps. Following an idea of Edward Fredkin, he concludes that the universe itself would then be an automaton, like a giant computer. It's possible, but I can't see any motivation for these speculations, except that this is the sort of system that Wolfram and others have become used to in their work on computers. So might a carpenter, looking at the moon, suppose that it is made of wood."

According to NKS Chapter 9, special relativity theory and quantum field theory are merely approximations to a digital network with inaccessible signal propagation below the Planck scale. NKS Chapter 9 and M-theory both attempt to unify general relativity theory and quantum field theory. M-theory postulates that there is a minimum physical wavelength and that vibrating string-like entities can model all of physics. NKS Chapter 9 postulates that there is a finite automaton that builds time, space, and energy from an informational substrate below the Planck scale. According to Wolfram, infinities and infinitesimals do not occur in nature, except perhaps for time as a potential infinity. In particular, there is a maximum physical wavelength in addition to the minimum physical wavelength postulated by M-theory.

In the NKS theory, the basic physical realities of time, space, and energy are merely approximations that arise from a few simple rules that operate with hidden determinism below the Planck scale. According to Wolfram, "building on the discovery that even simple programs can yield highly complex behavior, A New Kind of Science shows that with appropriate kinds of rules, simple programs can give rise to behavior that reproduces a remarkable range of known features of our universe — leading to the bold assertion that there could be a simple short program that represents a truly fundamental model of the universe, and which if run for long enough would reproduce the behavior of our world in every detail."

Natural selection

Wolfram's claim that natural selection is not the fundamental cause of complexity in biology has led some to state that Wolfram does not understand the theory of evolution. However, some experts have acknowledged that natural selection leaves many unanswered questions, which information theory might be able to explain. In this context, Wolfram's work is similar to that of D'Arcy Thompson. D'Arcy Thompson's work, however, is mathematical in nature, while Wolfram's is rule-based (computational). Whereas D'Arcy Thompson showed that nature makes certain mathematical choices without specifying the actual process involved, Wolfram's work suggests that nature makes these choices because it is mining what he calls the computational universe, from which it picks a computer program.

Originality and self-image

NKS has been heavily criticized as not being original or important enough to justify its title and claims, mostly by people who take the book to be about simple systems generating complex behavior. However, even though the fact that simple systems can produce complicated behavior is an important part of the book, its main contribution is presented as the new methodology of mining the computational universe. Edward Fredkin and Konrad Zuse pioneered the idea of a computable universe; Zuse's 1969 book Rechnender Raum (Calculating Space) proposed that the world might be like a cellular automaton, an idea Fredkin later developed further using a toy model called Salt. It has been claimed that NKS tries to take these ideas as its own, a suggestion made mainly by people who think that Wolfram's main thesis is that the universe is a cellular automaton, despite the fact that the discrete model of the universe Wolfram actually proposes is a trivalent network. Wolfram himself considers a cellular automaton model unsuitable for describing the quantum and relativistic properties of nature, as explained in the NKS book.

Jürgen Schmidhuber has also charged that his work on Turing machine-computable physics was stolen without attribution, namely his idea of enumerating possible Turing-computable universes.

Additionally, the core idea that very simple rules often generate great complexity is already established in science, particularly in chaos theory and complex systems research, and some researchers consider this territory well understood. The authoritative manner in which NKS presents a vast number of examples and arguments has been criticized as leading the reader to believe that each of these ideas was original to Wolfram. The notes section at the end of the book does acknowledge many of the discoveries made by other scientists, citing their names together with historical facts, although not in the form of a traditional bibliography. This is generally considered insufficient in scientific literature, however: endnotes are normally reserved for indirectly related material, and lay readers typically ignore them, leaving the impression that the author performed all the work.

In particular, one of the most substantial new technical results presented in the book, that the Rule 110 cellular automaton is Turing complete, was not proven by Wolfram but by his research assistant, Matthew Cook. This is not especially surprising since, as Wolfram himself explains, the book was a project carried out by a group of research assistants led by Wolfram, which means he did not perform every single experiment or contribution directly, as he also acknowledges in the book. The research assistants were, however, paid for this work as employees of Wolfram's company, not of a university.

Some have argued that the use of computer simulation is already ubiquitous, and that rather than starting a paradigm shift, NKS merely adds justification to a paradigm shift that is already under way; Wolfram's NKS might then be seen as the book that explicitly describes this shift.

See also

  • Scientific reductionism
  • Calculating Space
  • Fredkin finite nature hypothesis
  • Marcus Hutter's "Universal Artificial Intelligence" algorithm
