Outline of probability
Probability is the likelihood or chance that something is the case or will happen. Probability theory is used extensively in statistics, mathematics, science, and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems.

The following outline is provided as an overview and guide to the variety of topics included within the subject of probability.
Events
- Events in probability theory
- Elementary events, sample spaces, Venn diagrams
- Mutual exclusivity
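For reference, mutual exclusivity is exactly the condition under which probabilities add:

```latex
A \cap B = \varnothing
\quad\Longrightarrow\quad
P(A \cup B) = P(A) + P(B).
```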
Conditional probability
- The law of total probability
- Likelihood
- Bayes' theorem
- Bayesian probability
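The items above fit together in two standard identities, recorded here for reference (with B1, B2, ... a partition of the sample space):

```latex
P(A) = \sum_i P(A \mid B_i)\, P(B_i),
\qquad
P(B \mid A) = \frac{P(A \mid B)\, P(B)}{P(A)} \quad (P(A) > 0).
```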
Measure-theoretic probability
- Sample spaces, σ-algebras and probability measures
- Probability spaces
- "Almost surely"
Discrete and continuous random variables
- Discrete random variables: Probability mass functions
- Continuous random variables: Probability density functions
- Normalizing constants
- Cumulative distribution functions
- Joint*, marginal and conditional distributions
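For reference, the cumulative distribution function ties the discrete and continuous cases above together:

```latex
F_X(x) = P(X \le x) =
\begin{cases}
\sum_{k \le x} p_X(k) & \text{(discrete, with pmf } p_X\text{)} \\[4pt]
\int_{-\infty}^{x} f_X(t)\, dt & \text{(continuous, with pdf } f_X\text{)}
\end{cases}
```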
Expectation
- Expectation (or mean), variance and covariance
- Jensen's inequality
- General moments about the mean
- Correlated and uncorrelated random variables
- Conditional expectation: law of total expectation, law of total variance
- Fatou's lemma and the monotone and dominated convergence theorems
- Markov's inequality and Chebyshev's inequality
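Several of the quantities and inequalities listed above reduce to the following standard formulas, recorded here for reference (σ² denotes Var(X) in Chebyshev's inequality):

```latex
\operatorname{Var}(X) = E\bigl[(X - E X)^2\bigr] = E[X^2] - (E X)^2, \qquad
\operatorname{Cov}(X, Y) = E[XY] - E[X]\,E[Y], \\
E\bigl[E[X \mid Y]\bigr] = E[X] \ \text{(law of total expectation)}, \qquad
P\bigl(|X - E X| \ge k\sigma\bigr) \le \frac{1}{k^2} \ \text{(Chebyshev)}.
```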
Independence
- Independent random variables
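For reference, independence of two random variables amounts to factorization of joint probabilities:

```latex
X \text{ and } Y \text{ independent}
\iff
P(X \in A,\; Y \in B) = P(X \in A)\, P(Y \in B)
\quad \text{for all measurable } A, B.
```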
Some common distributions
- Discrete: constant (see also degenerate distribution), Bernoulli and binomial, negative binomial, (discrete) uniform, geometric, Poisson, and hypergeometric.
- Continuous: (continuous) uniform, exponential, gamma, beta, normal (or Gaussian) and multivariate normal, χ-squared (or chi-squared), F-distribution, Student's t-distribution, and Cauchy.
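As a minimal, non-authoritative sketch of how several of these distributions can be sampled in practice, assuming NumPy is available (the seed and all parameter values are arbitrary illustrations):

```python
import numpy as np

# Illustrative sampling from a few of the distributions listed above.
# All parameter values are arbitrary examples, not canonical choices.
rng = np.random.default_rng(seed=0)

n = 10_000
samples = {
    "binomial":    rng.binomial(n=10, p=0.3, size=n),       # discrete
    "geometric":   rng.geometric(p=0.2, size=n),             # discrete
    "poisson":     rng.poisson(lam=4.0, size=n),              # discrete
    "uniform":     rng.uniform(low=0.0, high=1.0, size=n),    # continuous
    "exponential": rng.exponential(scale=2.0, size=n),        # continuous
    "normal":      rng.normal(loc=0.0, scale=1.0, size=n),    # continuous
}

# Empirical means should be close to the theoretical ones
# (e.g. 10 * 0.3 = 3 for the binomial, 1 / 0.2 = 5 for the geometric).
for name, x in samples.items():
    print(f"{name:12s} mean = {x.mean():.3f}")
```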
Some other distributions
- Cantor
- Fisher-Tippett (or Gumbel)
- Pareto
- Benford's law
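For reference, Benford's law assigns the leading digit d the probability:

```latex
P(d) = \log_{10}\!\left(1 + \frac{1}{d}\right), \qquad d \in \{1, 2, \dots, 9\}.
```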
Functions of random variables
- Sums of random variables*
- General functions of random variables*
- Borel's paradox
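For the "sums of random variables" item above, the density of the sum of two independent continuous random variables is the convolution of their densities:

```latex
f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx
\qquad (X, Y \text{ independent}).
```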
Common generating functions
- Probability-generating functions
- Moment-generating functions
- Laplace transforms and Laplace–Stieltjes transforms
- Characteristic functions
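For reference, three of the generating functions above are defined by the expectations below (the probability-generating function applies to non-negative integer-valued X; the moment-generating function only where the expectation is finite):

```latex
G_X(s) = E\bigl[s^X\bigr], \qquad
M_X(t) = E\bigl[e^{tX}\bigr], \qquad
\varphi_X(t) = E\bigl[e^{itX}\bigr].
```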
Applications
- A proof of the central limit theorem*
- Random sums of random variables*
Convergence of random variables
(Related topics: convergence; see Modes of convergence (annotated index))
Modes of convergence
- Convergence in distribution and convergence in probability
- Convergence in mean, mean square and rth mean
- Almost sure convergence
- Skorokhod's representation theorem*
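For reference, the modes of convergence listed above are usually written as follows:

```latex
X_n \xrightarrow{d} X:\ F_{X_n}(x) \to F_X(x) \text{ at continuity points of } F_X, \\
X_n \xrightarrow{P} X:\ P\bigl(|X_n - X| > \varepsilon\bigr) \to 0 \text{ for every } \varepsilon > 0, \\
X_n \xrightarrow{L^r} X:\ E\bigl[|X_n - X|^r\bigr] \to 0, \\
X_n \xrightarrow{\text{a.s.}} X:\ P\bigl(\lim_{n \to \infty} X_n = X\bigr) = 1.
```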
Applications
- Central limit theorem and laws of large numbers
- Illustration of the central limit theorem and a 'concrete' illustration
- Berry–Esséen theorem
- Law of the iterated logarithm
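A minimal simulation sketch of the "illustration of the central limit theorem" idea above, assuming NumPy is available (the number of summands and the seed are arbitrary choices): standardized sums of i.i.d. uniform variables are compared with the normal limit.

```python
import numpy as np

# Illustration of the central limit theorem: sums of i.i.d. Uniform(0, 1)
# variables, standardized, approach a standard normal distribution.
rng = np.random.default_rng(seed=1)

n_terms = 30        # number of summands per sum
n_sums = 100_000    # number of independent sums

u = rng.uniform(size=(n_sums, n_terms))
sums = u.sum(axis=1)

# Uniform(0, 1) has mean 1/2 and variance 1/12.
z = (sums - n_terms * 0.5) / np.sqrt(n_terms / 12.0)

# If the normal approximation is good, about 68.3% of the standardized
# sums fall within one standard deviation of zero.
print("P(|Z| <= 1) ~", np.mean(np.abs(z) <= 1.0))
```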
Some common stochastic processes
- Random walk
- Poisson process
- Compound Poisson process
- Wiener process
- Geometric Brownian motion
- Fractional Brownian motion
- Brownian bridge
- Ornstein–Uhlenbeck process
- Gamma process
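A minimal sketch, assuming NumPy, of two of the processes listed above: a simple symmetric random walk and a discretized Wiener process path (step counts, horizon, and seed are arbitrary illustrations).

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Simple symmetric random walk: +/-1 steps with equal probability.
steps = rng.choice([-1, 1], size=1_000)
walk = np.cumsum(steps)

# Approximate Wiener process path on [0, T]: cumulative sums of
# independent N(0, dt) increments (a simple discretization).
n, T = 1_000, 1.0
dt = T / n
increments = rng.normal(loc=0.0, scale=np.sqrt(dt), size=n)
wiener_path = np.cumsum(increments)

print("random walk endpoint:", walk[-1])
print("Wiener path endpoint:", wiener_path[-1])  # ~ N(0, 1) in distribution
```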
Markov processes
- Markov property
- Branching process
- Galton–Watson process
- Markov chain
- Examples of Markov chains
- Population processes
- Applications to queueing theory
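A minimal sketch of a Markov chain simulation, assuming NumPy; the two-state "sunny"/"rainy" chain and its transition probabilities are invented for illustration and are not part of the outline.

```python
import numpy as np

# Hypothetical two-state Markov chain simulated from its transition matrix.
rng = np.random.default_rng(seed=3)

states = ["sunny", "rainy"]
P = np.array([[0.9, 0.1],    # P(next state | current = sunny)
              [0.5, 0.5]])   # P(next state | current = rainy)

n_steps = 100_000
current = 0
counts = np.zeros(2)
for _ in range(n_steps):
    # Markov property: the next state depends only on the current state.
    current = rng.choice(2, p=P[current])
    counts[current] += 1

# Long-run fractions of time spent in each state should approach the
# stationary distribution, which is (5/6, 1/6) for this transition matrix.
print(dict(zip(states, counts / n_steps)))
```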
Stochastic differential equations
- Stochastic calculus
- Diffusions
- Brownian motion
- Wiener equation
- Wiener process
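A minimal sketch, assuming NumPy, of simulating a diffusion with an Euler–Maruyama discretization; geometric Brownian motion is used as the example, and the drift, volatility, and horizon are arbitrary illustrative values.

```python
import numpy as np

# Euler-Maruyama discretization of geometric Brownian motion:
#   dX_t = mu * X_t dt + sigma * X_t dW_t
rng = np.random.default_rng(seed=4)

mu, sigma = 0.05, 0.2
T, n = 1.0, 1_000
dt = T / n

x = np.empty(n + 1)
x[0] = 1.0
for k in range(n):
    dW = rng.normal(loc=0.0, scale=np.sqrt(dt))  # Wiener increment over dt
    x[k + 1] = x[k] + mu * x[k] * dt + sigma * x[k] * dW

print("simulated X_T:", x[-1])
# For comparison, E[X_T] = X_0 * exp(mu * T) = exp(0.05) here.
```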
Time series
- Moving-average* and autoregressive* processes
- Correlation function and autocorrelation
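A minimal sketch, assuming NumPy, of an autoregressive AR(1) process together with its sample autocorrelation (the coefficient phi and the seed are arbitrary illustrations).

```python
import numpy as np

# Hypothetical AR(1) process x_t = phi * x_{t-1} + eps_t with N(0, 1) noise.
rng = np.random.default_rng(seed=5)

phi, n = 0.8, 50_000
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

def autocorr(series, lag):
    """Sample autocorrelation at the given lag."""
    s = series - series.mean()
    return np.dot(s[:-lag], s[lag:]) / np.dot(s, s)

# For a stationary AR(1) process, the autocorrelation at lag k is phi**k.
for lag in (1, 2, 3):
    print(lag, autocorr(x, lag), phi ** lag)
```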
Martingales
- Martingale central limit theorem*
- Azuma's inequality
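For reference, a discrete-time martingale satisfies E[X_{n+1} | X_1, ..., X_n] = X_n, and Azuma's inequality bounds its deviations when the increments are bounded:

```latex
|X_k - X_{k-1}| \le c_k \ \text{for all } k
\quad\Longrightarrow\quad
P\bigl(X_n - X_0 \ge t\bigr) \le \exp\!\left(\frac{-t^2}{2 \sum_{k=1}^{n} c_k^2}\right).
```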
See also
- Catalog of articles in probability theory
- Glossary of probability and statistics
- Notation in probability and statistics
- List of mathematical probabilists
- List of probability distributions
- List of probability topics
- List of scientific journals in probability
- Timeline of probability and statistics
- Topic outline of statistics