Outline of probability

Probability is the likelihood or chance that something is the case or will happen. Probability theory is used extensively in statistics, mathematics, science and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems.

The following outline is provided as an overview and guide to the variety of topics included within the subject of probability.

Events

  • Events in probability theory
  • Elementary events, sample spaces, Venn diagrams
  • Mutual exclusivity


Conditional probability

  • The law of total probability
  • Likelihood
  • Bayes' theorem
  • Bayesian probability
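
As a concrete illustration of Bayes' theorem together with the law of total probability, here is a minimal Python sketch of the classic diagnostic-test calculation (the prevalence and accuracy numbers are illustrative assumptions, not taken from this outline):

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(D | positive test) via Bayes' theorem.

    The denominator P(positive) is expanded using the law of total
    probability: P(+) = P(+|D)P(D) + P(+|not D)P(not D).
    """
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# Illustrative numbers: 1% prevalence, 99% sensitivity, 5% false positives.
posterior = bayes_posterior(0.01, 0.99, 0.05)
print(round(posterior, 4))  # 0.1667: a single positive test is far from conclusive
```

The counterintuitively low posterior is the standard base-rate effect: with a rare condition, most positive tests come from the large healthy population.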


Measure-theoretic probability

  • Sample spaces, σ-algebras and probability measures
  • Probability spaces
  • "Almost surely"

Discrete and continuous random variables

  • Discrete random variables: Probability mass functions
  • Continuous random variables: Probability density functions
  • Normalizing constants
  • Cumulative distribution functions
  • Joint*, marginal and conditional distributions
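
The discrete case can be made concrete with a binomial example; a sketch of a probability mass function and the cumulative distribution function built from it:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): the probability mass function."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def binom_cdf(k, n, p):
    """P(X <= k): the cumulative distribution function, a running sum of the pmf."""
    return sum(binom_pmf(j, n, p) for j in range(k + 1))

# The mass function sums to 1 over the support (its normalizing property),
# and the CDF reaches 1 at the top of the support.
assert abs(sum(binom_pmf(k, 10, 0.3) for k in range(11)) - 1.0) < 1e-12
assert abs(binom_cdf(10, 10, 0.3) - 1.0) < 1e-12
```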

Expectation

  • Expectation (or mean), variance and covariance
    • Jensen's inequality
  • General moments about the mean
  • Correlated and uncorrelated random variables
  • Conditional expectation:
    • law of total expectation, law of total variance
  • Fatou's lemma and the monotone and dominated convergence theorems
  • Markov's inequality and Chebyshev's inequality
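
Several of the quantities above can be checked directly on a finite distribution; a sketch using a fair die, ending with a numerical check of Chebyshev's inequality:

```python
import math

def expectation(pmf):
    """E[X] = sum of x * P(X = x) over a finite pmf given as {value: prob}."""
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """Var(X) = E[(X - E[X])^2]."""
    mu = expectation(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}        # fair six-sided die
mu, sigma = expectation(die), math.sqrt(variance(die))  # 3.5 and sqrt(35/12)

# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k^2 for any k > 0.
k = 2
tail = sum(p for x, p in die.items() if abs(x - mu) >= k * sigma)
assert tail <= 1 / k**2                      # here the tail is actually 0
```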


Independence

  • Independent random variables

Some common distributions

  • Discrete:
    • constant (see also degenerate distribution),
    • Bernoulli and binomial,
    • negative binomial,
    • (discrete) uniform,
    • geometric,
    • Poisson, and
    • hypergeometric.
  • Continuous:
    • (continuous) uniform,
    • exponential,
    • gamma,
    • beta,
    • normal (or Gaussian) and multivariate normal,
    • χ-squared (or chi-squared),
    • F-distribution,
    • Student's t-distribution, and
    • Cauchy.
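
Several of these continuous families can be sampled directly with Python's standard random module; a seeded sketch comparing sample means against the theoretical ones (seed and tolerances are arbitrary choices):

```python
import random
import statistics

rng = random.Random(42)          # fixed seed for reproducibility
n = 100_000

exp_draws = [rng.expovariate(2.0) for _ in range(n)]    # Exponential(rate 2), mean 1/2
norm_draws = [rng.gauss(5.0, 2.0) for _ in range(n)]    # Normal(mean 5, sd 2)
unif_draws = [rng.uniform(0.0, 1.0) for _ in range(n)]  # Uniform on [0, 1], mean 1/2

# Sample means should sit within a few standard errors of the true means.
assert abs(statistics.fmean(exp_draws) - 0.5) < 0.02
assert abs(statistics.fmean(norm_draws) - 5.0) < 0.05
assert abs(statistics.fmean(unif_draws) - 0.5) < 0.01
```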

Some other distributions

  • Cantor
  • Fisher-Tippett (or Gumbel)
  • Pareto
  • Benford's law


Functions of random variables

  • Sums of random variables*
  • General functions of random variables*
  • Borel's paradox
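
Sums of independent discrete random variables can be handled by convolving their mass functions; a sketch for the total of two fair dice:

```python
def pmf_of_sum(pmf_a, pmf_b):
    """PMF of X + Y for independent X and Y: the discrete convolution."""
    out = {}
    for x, px in pmf_a.items():
        for y, py in pmf_b.items():
            out[x + y] = out.get(x + y, 0.0) + px * py
    return out

die = {x: 1 / 6 for x in range(1, 7)}
two_dice = pmf_of_sum(die, die)

assert abs(sum(two_dice.values()) - 1.0) < 1e-12   # still a valid pmf
assert max(two_dice, key=two_dice.get) == 7        # seven is the likeliest total
```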


Common generating functions

  • Probability-generating functions
  • Moment-generating functions
  • Laplace transforms and Laplace-Stieltjes transforms
  • Characteristic functions
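
A probability-generating function can be evaluated directly for any finite pmf, and its derivative at 1 recovers the mean; a minimal sketch using a fair die:

```python
def pgf(pmf, s):
    """Probability-generating function G(s) = E[s^X] for a finite pmf."""
    return sum(p * s**x for x, p in pmf.items())

def pgf_derivative(pmf, s):
    """G'(s) = E[X * s^(X-1)]; at s = 1 this recovers E[X]."""
    return sum(p * x * s ** (x - 1) for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}
assert abs(pgf(die, 1.0) - 1.0) < 1e-12              # G(1) = 1 always
assert abs(pgf_derivative(die, 1.0) - 3.5) < 1e-12   # G'(1) = E[X] = 3.5
```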

Applications

  • A proof of the central limit theorem*
  • Random sums of random variables*

Convergence of random variables

(Related topics: convergence)

Modes of convergence

  • Convergence in distribution and convergence in probability,
  • Convergence in mean, mean square and rth mean
  • Almost sure convergence
  • Skorokhod's representation theorem*

Applications

  • Central limit theorem and Laws of large numbers
    • Illustration of the central limit theorem and a 'concrete' illustration
    • Berry-Esséen theorem
  • Law of the iterated logarithm
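
The central limit theorem is easy to probe by simulation; a seeded sketch that standardizes sums of uniform draws and checks the one-sigma mass of a standard normal (the sample sizes are arbitrary choices):

```python
import random

rng = random.Random(7)
n, trials = 30, 20_000

# Standardized sums of 30 Uniform(0, 1) draws: such a sum has
# mean n/2 and variance n/12.
mu, sigma = n / 2, (n / 12) ** 0.5
z = [(sum(rng.random() for _ in range(n)) - mu) / sigma for _ in range(trials)]

# For a standard normal, about 68.3% of the mass lies within one sigma.
frac = sum(abs(v) <= 1 for v in z) / trials
assert abs(frac - 0.683) < 0.02
```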


Some common stochastic processes

  • Random walk
  • Poisson process
  • Compound Poisson process
  • Wiener process
  • Geometric Brownian motion
  • Fractional Brownian motion
  • Brownian bridge
  • Ornstein-Uhlenbeck process
  • Gamma process
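
The first of these processes takes only a few lines to simulate; a sketch of a symmetric simple random walk with a sanity check on the path:

```python
import random

rng = random.Random(0)

def simple_random_walk(steps, rng):
    """Symmetric simple random walk: +/-1 steps, started at 0."""
    position, path = 0, [0]
    for _ in range(steps):
        position += rng.choice((-1, 1))
        path.append(position)
    return path

path = simple_random_walk(1000, rng)
# After k steps of +/-1 the position is bounded by k and shares its parity.
assert all(abs(p) <= k and (p - k) % 2 == 0 for k, p in enumerate(path))
```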

Markov processes

  • Markov property
  • Branching process
    • Galton–Watson process
  • Markov chain
    • Examples of Markov chains
  • Population processes
  • Applications to queueing theory
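
A two-state Markov chain shows the memoryless property in miniature; a seeded sketch with illustrative transition probabilities, checking the long-run occupation fraction against the stationary distribution:

```python
import random

rng = random.Random(123)

# Illustrative two-state weather chain:
# P(sunny -> sunny) = 0.9, P(rainy -> sunny) = 0.5.
P = {"sunny": {"sunny": 0.9, "rainy": 0.1},
     "rainy": {"sunny": 0.5, "rainy": 0.5}}

def step(state):
    """One transition: the next state depends only on the current one."""
    r, acc = rng.random(), 0.0
    for nxt, p in P[state].items():
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against float rounding at the top of the CDF

state, sunny_days, n = "sunny", 0, 100_000
for _ in range(n):
    state = step(state)
    sunny_days += state == "sunny"

# The stationary distribution solves pi = pi P: here pi(sunny) = 5/6.
assert abs(sunny_days / n - 5 / 6) < 0.02
```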


Stochastic differential equations

  • Stochastic calculus
  • Diffusions
    • Brownian motion
    • Wiener equation
    • Wiener process


Time series

  • Moving-average* and autoregressive* processes
  • Correlation function and autocorrelation
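
An autoregressive process of order one is a compact example of both topics; a seeded sketch that simulates AR(1) and verifies that its lag-1 autocorrelation matches the autoregressive coefficient (phi, sigma and the sample size are illustrative choices):

```python
import random
import statistics

rng = random.Random(9)

def ar1(phi, sigma, n, rng):
    """AR(1): X_t = phi * X_{t-1} + eps_t, with eps_t ~ N(0, sigma^2)."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma)
        out.append(x)
    return out

series = ar1(0.8, 1.0, 200_000, rng)

# For a stationary AR(1), the lag-1 autocorrelation equals phi.
mu = statistics.fmean(series)
num = sum((a - mu) * (b - mu) for a, b in zip(series, series[1:]))
den = sum((a - mu) ** 2 for a in series)
assert abs(num / den - 0.8) < 0.02
```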


Martingales

  • Martingale central limit theorem*
  • Azuma's inequality

See also

The source of this article is wikipedia, the free encyclopedia.  The text of this article is licensed under the GFDL.
 