History of randomness
In ancient history, the concepts of chance and randomness were intertwined with that of fate. Many ancient peoples threw dice to determine fate, and this later evolved into games of chance. At the same time, most ancient cultures used various methods of divination to attempt to circumvent randomness and fate.
The Chinese were perhaps the earliest people to formalize odds and chance 3,000 years ago. The Greek philosophers discussed randomness at length, but only in non-quantitative forms. It was only in the sixteenth century that Italian mathematicians began to formalize the odds associated with various games of chance. The invention of modern calculus had a positive impact on the formal study of randomness. In the 19th century the concept of entropy was introduced in physics.
The early part of the twentieth century saw a rapid growth in the formal analysis of randomness, and mathematical foundations for probability were introduced, leading to its axiomatization in 1933. At the same time, the advent of quantum mechanics changed the scientific perspective on determinacy. In the mid-to-late 20th century, ideas of algorithmic information theory introduced new dimensions to the field via the concept of algorithmic randomness.
Although randomness had often been viewed as an obstacle and a nuisance for many centuries, in the twentieth century computer scientists began to realize that the deliberate introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases, such randomized algorithms are able to outperform the best deterministic methods.
Antiquity to the Middle Ages
In ancient history, the concepts of chance and randomness were intertwined with that of fate. Pre-Christian people along the Mediterranean threw dice to determine fate, and this later evolved into games of chance. There is also evidence of games of chance played by ancient Egyptians, Hindus and Chinese, dating back to 2100 BC. The Chinese used dice before the Europeans, and have a long history of playing games of chance.
Over 3,000 years ago, problems concerned with the tossing of several coins were considered in the I Ching, one of the oldest Chinese mathematical texts, which probably dates to 1150 BC. The two principal elements yin and yang were combined in the I Ching in various forms to produce heads-and-tails permutations of the type HH, TH, HT, etc., and the Chinese seem to have been aware of Pascal's triangle long before the Europeans formalized it in the 17th century. However, Western philosophy focused on the non-mathematical aspects of chance and randomness until the 16th century.
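The combinatorics hinted at here can be made concrete. The following Python sketch (added for illustration, not part of the original account) enumerates all heads/tails sequences for a given number of tosses and groups them by the number of heads; the resulting counts are exactly the rows of Pascal's triangle.

```python
from itertools import product
from math import comb

def toss_counts(n):
    """Count the n-toss H/T sequences by their number of heads."""
    counts = [0] * (n + 1)
    for outcome in product("HT", repeat=n):
        counts[outcome.count("H")] += 1
    return counts

for n in range(1, 5):
    counts = toss_counts(n)
    # Each list of counts is row n of Pascal's triangle: C(n,0) ... C(n,n).
    assert counts == [comb(n, k) for k in range(n + 1)]
    print(n, counts)
```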
The development of the concept of chance throughout history has been very gradual. Historians have wondered why progress in the field of randomness was so slow, given that humans have encountered chance since antiquity. Deborah Bennett suggests that ordinary people face an inherent difficulty in understanding randomness, although the concept is often taken as being obvious and self-evident. She cites studies by Kahneman and Tversky; these concluded that statistical principles are not learned from everyday experience because people do not attend to the detail necessary to gain such knowledge.
The Greek philosophers were the earliest Western thinkers to address chance and randomness. Around 400 BC, Democritus presented a view of the world as governed by the unambiguous laws of order and considered randomness as a subjective concept that only originated from the inability of humans to understand the nature of events. He used the example of two men who would send their servants to bring water at the same time to cause them to meet. The servants, unaware of the plan, would view the meeting as random.
Aristotle saw chance and necessity as opposite forces. He argued that nature had rich and constant patterns that could not be the result of chance alone, but that these patterns never displayed the machine-like uniformity of necessary determinism. He viewed randomness as a genuine and widespread part of the world, but as subordinate to necessity and order. Aristotle classified events into three types: certain events that happen necessarily; probable events that happen in most cases; and unknowable events that happen by pure chance. He considered the outcome of games of chance as unknowable.
Around 300 BC Epicurus proposed the concept that randomness exists by itself, independent of human knowledge. He believed that in the atomic world, atoms would swerve at random along their paths, bringing about randomness at higher levels.
For several centuries thereafter, the idea of chance continued to be intertwined with fate. Divination was practiced in many cultures, using diverse methods. The Chinese analyzed the cracks in turtle shells, while the Germans, who according to Tacitus had the highest regard for lots and omens, utilized strips of bark. In the Roman Empire, chance was personified by the goddess Fortuna. The Romans would partake in games of chance to simulate what Fortuna would have decided. In 49 BC, Julius Caesar allegedly made his fateful decision to cross the Rubicon after throwing dice.
Aristotle's classification of events into three classes (certain, probable and unknowable) was adopted by Roman philosophers, but they had to reconcile it with deterministic Christian teachings in which even events unknowable to man were considered to be predetermined by God. Around 960, Bishop Wibold of Cambrai correctly enumerated the 56 different outcomes (without permutations) of playing with three dice. No reference to playing cards has been found in Europe before 1350. The Church preached against card playing, and card games spread much more slowly than games based on dice. The Christian Church specifically forbade divination; and wherever Christianity went, divination lost most of its old-time power.
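Wibold's figure is easy to verify: if throws that differ only in the order of the dice are treated as identical, there are exactly 56 outcomes. A minimal Python check (illustrative, added here):

```python
from itertools import combinations_with_replacement, product

# Outcomes of three dice when the order of the dice is ignored.
unordered = list(combinations_with_replacement(range(1, 7), 3))
print(len(unordered))  # 56, as Wibold enumerated

# For comparison: ordered outcomes, counting permutations separately.
print(len(list(product(range(1, 7), repeat=3))))  # 216
```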
Over the centuries, many Christian scholars wrestled with the conflict between the belief in free will and its implied randomness, and the idea that God knows everything that happens. Saints Augustine and Aquinas tried to reach an accommodation between foreknowledge and free will, but Martin Luther argued against randomness and took the position that God's omniscience renders human actions unavoidable and determined. In the 13th century, Thomas Aquinas viewed randomness not as the result of a single cause, but of several causes coming together by chance. While he believed in the existence of randomness, he rejected it as an explanation of the end-directedness of nature, for he saw too many patterns in nature to have been obtained by chance.
The Greeks and Romans had not noticed the magnitudes of the relative frequencies of the games of chance. For centuries, chance was discussed in Europe with no mathematical foundation, and it was only in the 16th century that Italian mathematicians began to discuss the outcomes of games of chance as ratios. In his 1565 Liber de Ludo Aleae (a gambler's manual published after his death), Gerolamo Cardano wrote one of the first formal tracts to analyze the odds of winning at various games.
17th–19th centuries
Around 1620 Galileo wrote a paper called On a discovery concerning dice that used an early probabilistic model to address specific questions. In 1654, prompted by Chevalier de Méré's interest in gambling, Blaise Pascal corresponded with Pierre de Fermat, and much of the groundwork for probability theory was laid. Pascal's Wager was noted for its early use of the concept of infinity, and for the first formal use of decision theory. The work of Pascal and Fermat influenced Leibniz's work on the infinitesimal calculus, which in turn provided further momentum for the formal analysis of probability and randomness.
The first known suggestion for viewing randomness in terms of complexity was made by Leibniz in an obscure 17th-century document discovered after his death. Leibniz asked how one could know whether a set of points on a piece of paper had been selected at random (e.g. by splattering ink) or not. Given that for any finite set of points there is always a mathematical equation that can describe them (e.g. by Lagrange interpolation), the question turns on the way the points are expressed mathematically. Leibniz viewed the points as random if the function describing them had to be extremely complex. Three centuries later, the same concept was formalized as algorithmic randomness by A. N. Kolmogorov and Gregory Chaitin, who measured the randomness of a finite string by the minimal length of a computer program needed to produce it.
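Leibniz's premise, that any finite set of points admits an exact describing equation, can be demonstrated directly. The sketch below (an added illustration using NumPy; the points are hypothetical) fits a polynomial of degree n-1 exactly through n arbitrary "ink splatter" points, showing that mere describability cannot separate random from regular points; only the complexity of the description can.

```python
import numpy as np

# Five arbitrary "ink splatter" points (hypothetical example data).
xs = np.array([0.0, 1.3, 2.1, 3.7, 5.0])
ys = np.array([2.0, -1.5, 0.3, 4.2, -0.7])

# A degree-4 polynomial passes exactly through any 5 points with distinct x.
coeffs = np.polyfit(xs, ys, deg=len(xs) - 1)
fitted = np.polyval(coeffs, xs)

# The polynomial reproduces every point (up to floating-point error), so the
# interesting question is how complicated the describing function has to be.
print(np.allclose(fitted, ys))  # True
print(np.round(coeffs, 4))
```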
Abraham de Moivre's The Doctrine of Chances, the first textbook on probability theory, was published in 1718, and the field continued to grow thereafter. The frequency theory approach to probability was first developed by Robert Ellis and John Venn late in the 19th century.
While the mathematical elite was making progress in understanding randomness from the 17th to the 19th century, the public at large continued to rely on practices such as fortune telling in the hope of taming chance. Fortunes were told in a multitude of ways both in the Orient (where fortune telling was later termed an addiction) and in Europe by gypsies and others. English practices such as the reading of eggs dropped in a glass were exported to Puritan communities in North America.
The term entropy, which is now a key element in the study of randomness, was coined by Rudolf Clausius in 1865 as he studied heat engines in the context of the second law of thermodynamics. Clausius was the first to state "entropy always increases".
From the time of Newton until about 1890, it was generally believed that if one knows the initial state of a system with great accuracy, and if all the forces acting on the system can be formulated with equal accuracy, it would be possible, in principle, to make predictions of the state of the universe for an infinitely long time. The limits to such predictions in physical systems became clear as early as 1893, when Henri Poincaré showed that in the three-body problem in astronomy, small changes to the initial state could result in large changes in trajectories during the numerical integration of the equations.
During the 19th century, as probability theory was formalized and better understood, the attitude towards "randomness as nuisance" began to be questioned. Goethe wrote:
The tissue of the world
is built from necessities and randomness;
the intellect of men places itself between both
and can control them;
it considers the necessity
and the reason of its existence;
it knows how randomness can be
managed, controlled, and used.
The words of Goethe proved prophetic when, in the 20th century, randomized algorithms were discovered to be powerful tools. By the end of the 19th century, Newton's model of a mechanical universe was fading away as the statistical view of the collision of molecules in gases was studied by Maxwell and Boltzmann. Boltzmann's equation S = k log_e W (inscribed on his tombstone) first related entropy with logarithms.
20th century
During the 20th century, the five main interpretations of probability theory (classical, logical, frequency, propensity and subjective) became better understood and were discussed, compared and contrasted. A significant number of application areas were developed in this century, from finance to physics. In 1900 Louis Bachelier applied Brownian motion to evaluate stock options, effectively launching the fields of financial mathematics and stochastic processes.
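Bachelier's model treated prices as what is now called Brownian motion. The sketch below is an added illustration with arbitrary parameters, not Bachelier's own formulation: it simulates a few arithmetic Brownian-motion price paths of the form S_t = S_0 + sigma * W_t, where W_t is a Wiener process.

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_paths(s0=100.0, sigma=2.0, n_steps=250, n_paths=5, dt=1.0 / 250):
    """Simulate arithmetic Brownian-motion price paths S_t = S_0 + sigma * W_t."""
    # Increments of a Wiener process: independent N(0, dt) steps.
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    W = np.cumsum(dW, axis=1)
    return s0 + sigma * W

paths = brownian_paths()
print(paths[:, -1])  # terminal price of each simulated path
```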
Émile Borel was one of the first mathematicians to formally address randomness, in 1909, and introduced normal numbers. In 1919 Richard von Mises gave the first definition of algorithmic randomness via the impossibility of a gambling system. He advanced the frequency theory of randomness in terms of what he called the collective, i.e. a random sequence. Von Mises regarded the randomness of a collective as an empirical law, established by experience. He related the "disorder" or randomness of a collective to the lack of success of attempted gambling systems. This approach led him to suggest a definition of randomness that was later refined and made mathematically rigorous by Alonzo Church using computable functions in 1940. Von Mises likened the principle of the impossibility of a gambling system to the principle of the conservation of energy, a law that cannot be proven but has held true in repeated experiments.
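Borel's normal numbers make the frequency view concrete: in a number normal in base 10, every digit appears with limiting frequency 1/10. The sketch below (added for illustration) tallies digit frequencies in an initial segment of the Champernowne constant 0.123456789101112..., which is known to be normal in base 10; convergence is slow, but the frequencies already cluster near 0.1.

```python
from collections import Counter

# Initial segment of the Champernowne constant 0.123456789101112...,
# formed by concatenating the positive integers.
digits = "".join(str(n) for n in range(1, 10**6))

counts = Counter(digits)
total = len(digits)
for d in "0123456789":
    # Digits 1-9 are already close to 0.1 at this cutoff; 0 converges more
    # slowly because leading zeros are never written.
    print(d, round(counts[d] / total, 4))
```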
Von Mises never totally formalized his rules for sub-sequence selection, but in his 1940 paper "On the concept of a random sequence", Alonzo Church suggested that the functions used for place selections in the formalism of von Mises be recursive functions rather than arbitrary functions of the initial segments of the sequence, appealing to the Church–Turing thesis on effectiveness.
The advent of quantum mechanics in the early 20th century and the formulation of the Heisenberg uncertainty principle in 1927 marked the end of the Newtonian mindset among physicists regarding the determinacy of nature. In quantum mechanics, there is not even a way to consider all observable elements in a system as random variables at once, since many observables do not commute.
By the early 1940s, the frequency theory approach to probability was well accepted within the Vienna Circle, but in the 1950s Karl Popper proposed the propensity theory. Given that the frequency approach cannot deal with "a single toss" of a coin and can only address large ensembles or collectives, single-case probabilities were treated as propensities or chances. The concept of propensity was also driven by the desire to handle single-case probability settings in quantum mechanics, e.g. the probability of decay of a specific atom at a specific moment. In more general terms, the frequency approach cannot deal with the probability of the death of a specific person, since death cannot be repeated multiple times for that person. Popper echoed the same sentiment as Aristotle in viewing randomness as subordinate to order when he wrote that "the concept of chance is not opposed to the concept of law" in nature, provided one considers the laws of chance.
Claude Shannon's development of information theory in 1948 gave rise to the entropy view of randomness. In this view, randomness is the opposite of determinism in a stochastic process. Hence if a stochastic system has entropy zero it has no randomness, and any increase in entropy increases randomness. Shannon's formulation reduces to Boltzmann's entropy when all probabilities are equal. Entropy is now widely used in diverse fields of science, from thermodynamics to quantum chemistry.
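In Shannon's formulation the entropy of a discrete distribution p_1, ..., p_n is H = -sum_i p_i log2 p_i, which is zero for a deterministic outcome and reaches its maximum log2 n when all outcomes are equally likely, the case that matches Boltzmann's counting. A small illustrative sketch:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = -sum(p * log2(p)), with 0 * log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))        # 0.0: no randomness at all
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))   # ~0.469 bits: a biased coin
print(shannon_entropy([0.25] * 4))   # 2.0 bits = log2(4), the uniform maximum
```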
Martingales for the study of chance and betting strategies were introduced by Lévy in the 1930s and were formalized by Doob in the 1950s. The application of the random walk hypothesis in financial theory was first proposed by Maurice Kendall in 1953. It was later promoted by Eugene Fama and Burton Malkiel.
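A martingale formalizes a fair game: no betting strategy based on past outcomes changes the expected fortune. The Monte Carlo sketch below (an added illustration with arbitrary parameters) plays the classic double-after-a-loss strategy against a fair coin and shows that the average final fortune stays close to the starting capital, in line with the impossibility of a gambling system.

```python
import random

random.seed(1)

def play(rounds=20, start=1000.0):
    """Bet on a fair coin, doubling the stake after every loss."""
    fortune, stake = start, 1.0
    for _ in range(rounds):
        if random.random() < 0.5:   # win: collect the stake, reset it
            fortune += stake
            stake = 1.0
        else:                       # lose: pay the stake, double it
            fortune -= stake
            stake *= 2.0
    return fortune

trials = [play() for _ in range(100_000)]
# For a fair coin the game is a martingale: each bet has zero expected gain,
# so the average final fortune stays near the starting 1000.
print(sum(trials) / len(trials))
```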
Random strings were first studied in the 1960s by A. N. Kolmogorov (who had provided the first axiomatic definition of probability theory in 1933), Chaitin and Martin-Löf. The algorithmic randomness of a string was defined as the minimum size of a program (e.g. in bits) executed on a universal computer that yields the string. Chaitin's Omega number later related randomness and the halting probability for programs.
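Kolmogorov complexity itself is not computable, but compression provides a rough, computable stand-in: a highly regular string compresses to far fewer bytes than a typical random one. A hedged sketch using Python's zlib (an approximation of the idea, not the formal definition):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length in bytes after zlib compression; a crude proxy for description length."""
    return len(zlib.compress(data, level=9))

regular = b"01" * 5000        # 10,000 bytes of an obvious pattern
random_ = os.urandom(10_000)  # 10,000 bytes from the OS entropy source

print(compressed_size(regular))  # small: the pattern has a short description
print(compressed_size(random_))  # near 10,000: incompressible in this proxy sense
```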
In 1964, Benoît Mandelbrot suggested that most statistical models approached only a first stage of dealing with indeterminism, and that they ignored many aspects of real-world turbulence. In his 1997 book Fractals and Scaling in Finance he defined seven states of randomness ranging from "mild to wild", with traditional randomness being at the mild end of the scale.
Despite mathematical advances, reliance on other methods of dealing with chance, such as fortune telling and astrology, continued in the 20th century. The government of Myanmar reportedly shaped 20th-century economic policy based on fortune telling and planned the move of the capital of the country based on the advice of astrologers. White House Chief of Staff Donald Regan criticized the involvement of astrologer Joan Quigley in decisions made during Ronald Reagan's presidency in the 1980s. Quigley claims to have been the White House astrologer for seven years.
During the 20th century, limits in dealing with randomness were better understood. The best-known example of both theoretical and operational limits on predictability is weather forecasting, simply because models have been used in the field since the 1950s. Predictions of weather and climate are necessarily uncertain. Observations of weather and climate are uncertain and incomplete, and the models into which the data are fed are uncertain. In 1961, Edward Lorenz noticed that a very small change to the initial data submitted to a computer program for weather simulation could result in a completely different weather scenario. This later became known as the butterfly effect, often paraphrased as the question: "Does the flap of a butterfly's wings in Brazil set off a tornado in Texas?". A key example of serious practical limits on predictability is in geology, where the ability to predict earthquakes either on an individual or on a statistical basis remains a remote prospect.
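Lorenz's observation is easy to reproduce: integrating the Lorenz equations from two initial states that differ by one part in a million soon produces completely different trajectories. A simple explicit-Euler sketch (added for illustration; a production solver would use an adaptive integrator):

```python
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit Euler step of the Lorenz system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-6, 1.0, 1.0)  # perturbed by one part in a million

for step in range(1, 40001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 10000 == 0:
        # The separation grows roughly exponentially until the trajectories
        # become effectively unrelated (the "butterfly effect").
        sep = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        print(round(step * 0.001, 1), sep)
```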
In the late 1970s and early 1980s, computer scientists began to realize that the deliberate introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases, such randomized algorithms outperform the best deterministic methods.
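A classic illustration is Freivalds' algorithm for verifying a matrix product: checking whether A·B = C by recomputing the product deterministically costs on the order of n^3 operations, while multiplying both sides by a random 0/1 vector costs O(n^2) per trial and misses an incorrect C with probability at most 1/2 per trial (so at most 2^-k after k trials). A minimal sketch (added for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

def freivalds(A, B, C, trials=20):
    """Probabilistically verify A @ B == C using random 0/1 vectors.
    Each trial costs O(n^2); a wrong C is missed with probability <= 2**-trials."""
    n = A.shape[0]
    for _ in range(trials):
        r = rng.integers(0, 2, size=(n, 1))
        # Compare A(Br) with Cr: three matrix-vector products, never A @ B itself.
        if not np.array_equal(A @ (B @ r), C @ r):
            return False
    return True

n = 200
A = rng.integers(0, 10, size=(n, n))
B = rng.integers(0, 10, size=(n, n))
C = A @ B
print(freivalds(A, B, C))   # True: correct product accepted
C[0, 0] += 1
print(freivalds(A, B, C))   # False (with overwhelming probability)
```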