Law of total cumulance
In probability theory and mathematical statistics, the law of total cumulance is a generalization to cumulants of the law of total probability, the law of total expectation, and the law of total variance. It has applications in the analysis of time series. It was introduced by David Brillinger.

It is most transparent when stated in its most general form, for joint cumulants, rather than for cumulants of a specified order for just one random variable. In general, we have

κ(X1, ..., Xn) = Σ_π κ( κ(Xi : i ∈ B | Y) : B ∈ π )

where
  • κ(X1, ..., Xn) is the joint cumulant of n random variables X1, ..., Xn, and

  • the sum is over all partitions π of the set { 1, ..., n } of indices, and

  • "B ∈ π" means B runs through the whole list of "blocks" of the partition π, and

  • κ(Xi : i ∈ B | Y) is a conditional cumulant given the value of the random variable Y. It is therefore a random variable in its own right—a function of the random variable Y.
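The sum over partitions is the combinatorial heart of the formula. As a concrete illustration, here is a minimal Python sketch that enumerates all partitions of a small index set; the helper name set_partitions is ours, written for this example (libraries such as sympy ship equivalents). For { 1, 2, 3, 4 } it produces the 15 partitions that appear in the fourth-order expansion below.

```python
# A minimal sketch: enumerate all partitions of an index set.

def set_partitions(items):
    """Yield every partition of `items` as a list of blocks (lists)."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for partition in set_partitions(rest):
        # Put `first` into each existing block in turn ...
        for i in range(len(partition)):
            yield partition[:i] + [[first] + partition[i]] + partition[i + 1:]
        # ... or give `first` a block of its own.
        yield [[first]] + partition

print(len(list(set_partitions([1, 2, 3, 4]))))  # 15, the Bell number B_4
```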

The special case of just one random variable and n = 2 or 3

Only when n = 2 or n = 3 is the nth cumulant the same as the nth central moment. The case n = 2 is well known (see law of total variance). Below is the case n = 3. The notation μ3 means the third central moment:

μ3(X) = E(μ3(X | Y)) + μ3(E(X | Y)) + 3 cov(E(X | Y), var(X | Y)).
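As a sanity check, the identity can be verified by simulation. The sketch below assumes a toy model of our choosing: Y ~ Exponential(1) and X | Y ~ Normal(mean Y, variance Y), so that E(X | Y) = var(X | Y) = Y and the conditional law is symmetric, making μ3(X | Y) = 0; the right-hand side is then μ3(Y) + 3 var(Y) = 2 + 3 = 5.

```python
# Monte Carlo check of the n = 3 identity under an assumed toy model.
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
Y = rng.exponential(1.0, n)
X = Y + np.sqrt(Y) * rng.standard_normal(n)

lhs = np.mean((X - X.mean())**3)  # mu3(X), estimated from the sample

# Right-hand side, term by term, using exact moments of Exponential(1):
#   E(mu3(X | Y))              = 0
#   mu3(E(X | Y)) = mu3(Y)     = 2
#   3 cov(E(X|Y), var(X|Y))    = 3 var(Y) = 3
rhs = 0 + 2 + 3
print(lhs, rhs)  # the estimate should be close to 5
```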

General 4th-order joint cumulants

For general 4th-order cumulants, the rule gives a sum of 15 terms, one for each partition of { 1, 2, 3, 4 }, as follows:

κ(X1, X2, X3, X4)
  = E(κ(X1, X2, X3, X4 | Y))

  + κ(κ(X1, X2, X3 | Y), E(X4 | Y)) + κ(κ(X1, X2, X4 | Y), E(X3 | Y))
  + κ(κ(X1, X3, X4 | Y), E(X2 | Y)) + κ(κ(X2, X3, X4 | Y), E(X1 | Y))

  + κ(κ(X1, X2 | Y), κ(X3, X4 | Y)) + κ(κ(X1, X3 | Y), κ(X2, X4 | Y)) + κ(κ(X1, X4 | Y), κ(X2, X3 | Y))

  + κ(κ(X1, X2 | Y), E(X3 | Y), E(X4 | Y)) + κ(κ(X1, X3 | Y), E(X2 | Y), E(X4 | Y))
  + κ(κ(X1, X4 | Y), E(X2 | Y), E(X3 | Y)) + κ(κ(X2, X3 | Y), E(X1 | Y), E(X4 | Y))
  + κ(κ(X2, X4 | Y), E(X1 | Y), E(X3 | Y)) + κ(κ(X3, X4 | Y), E(X1 | Y), E(X2 | Y))

  + κ(E(X1 | Y), E(X2 | Y), E(X3 | Y), E(X4 | Y)).
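The 15 terms correspond to the 15 partitions of { 1, 2, 3, 4 }: one with a single block, four of shape 3 + 1, three of shape 2 + 2, six of shape 2 + 1 + 1, and one of shape 1 + 1 + 1 + 1. A short Python check of this count, using sympy's set-partition enumerator (any equivalent enumerator would do):

```python
# Count the partitions of {1, 2, 3, 4} grouped by block-size shape.
from collections import Counter
from sympy.utilities.iterables import multiset_partitions

shapes = Counter(
    tuple(sorted(map(len, p), reverse=True))
    for p in multiset_partitions([1, 2, 3, 4])
)
print(shapes)
# shape -> count: (4,): 1, (3, 1): 4, (2, 2): 3, (2, 1, 1): 6, (1, 1, 1, 1): 1
```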
Cumulants of compound Poisson random variables

Suppose Y has a Poisson distribution with expected value 1, and X is the sum of Y independent copies of W:

X = W1 + W2 + ... + WY.
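A small simulation sketch of this construction, taking W ~ Exponential(1) as an illustrative choice (any distribution with finite moments would do):

```python
# Draw Y ~ Poisson(1), then sum Y fresh Exponential(1) copies of W.
import numpy as np

rng = np.random.default_rng(1)
n = 10**5
Y = rng.poisson(1.0, n)
X = np.array([rng.exponential(1.0, y).sum() for y in Y])

# E(X) = E(Y) E(W) = 1, and (previewing the result derived below)
# var(X) = kappa_2(X) = E(W^2) = 2 for Exponential(1).
print(X.mean(), X.var())  # close to 1 and 2
```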
All of the cumulants of the Poisson distribution are equal to each other, and so in this case they are all equal to 1. Also recall that if random variables W1, ..., Wm are independent, then the nth cumulant is additive:

κn(W1 + ... + Wm) = κn(W1) + ... + κn(Wm).
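For n = 2 this is the familiar additivity of variances. A quick Monte Carlo illustration for n = 3, where the cumulant equals the third central moment, using two independent Exponential(1) variables, each with κ3 = 2:

```python
# Third cumulants add for independent variables; the sum should have
# kappa_3 = 2 + 2 = 4.
import numpy as np

rng = np.random.default_rng(2)
u = rng.exponential(1.0, 10**6)
v = rng.exponential(1.0, 10**6)

def k3(x):
    """Third cumulant = third central moment, estimated from a sample."""
    return np.mean((x - x.mean())**3)

print(k3(u + v), k3(u) + k3(v))  # both close to 4
```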
We will find the 4th cumulant of X. Conditionally on Y, X is the sum of Y independent copies of W, so by additivity κn(X | Y) = Y κn(W). Applying the general law with all four arguments equal to X, and grouping the 15 partitions by shape, we have:

κ4(X) = κ(X, X, X, X)

  = κ(κ4(X | Y)) + 4 κ(κ3(X | Y), κ1(X | Y)) + 3 κ(κ2(X | Y), κ2(X | Y))
    + 6 κ(κ2(X | Y), κ1(X | Y), κ1(X | Y)) + κ(κ1(X | Y), κ1(X | Y), κ1(X | Y), κ1(X | Y))

  = κ(Y κ4(W)) + 4 κ(Y κ3(W), Y κ1(W)) + 3 κ(Y κ2(W), Y κ2(W))
    + 6 κ(Y κ2(W), Y κ1(W), Y κ1(W)) + κ(Y κ1(W), Y κ1(W), Y κ1(W), Y κ1(W))

  = κ4(W) κ1(Y) + 4 κ3(W) κ1(W) κ2(Y) + 3 κ2(W)² κ2(Y) + 6 κ2(W) κ1(W)² κ3(Y) + κ1(W)⁴ κ4(Y)

  = κ4(W) + 4 κ3(W) κ1(W) + 3 κ2(W)² + 6 κ2(W) κ1(W)² + κ1(W)⁴

(the punch line: every cumulant of Y equals 1; see the explanation below).
We recognize this last sum as the sum, over all partitions of the set { 1, 2, 3, 4 }, of the product over all blocks of the partition of cumulants of W of order equal to the size of the block. That is precisely the 4th raw moment of W (see cumulant for a more leisurely discussion of this fact). Hence the moments of W are the cumulants of X.
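This can be checked exactly for a concrete W. The sketch below takes W ~ Exponential(1), whose nth cumulant is (n − 1)! and whose nth raw moment is n!:

```python
# Check the punch-line sum against E(W^4) for W ~ Exponential(1).
from math import factorial

k = {n: factorial(n - 1) for n in (1, 2, 3, 4)}  # cumulants of W

kappa4_X = k[4] + 4*k[3]*k[1] + 3*k[2]**2 + 6*k[2]*k[1]**2 + k[1]**4
print(kappa4_X, factorial(4))  # 24 24: kappa_4(X) equals E(W^4)
```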

In this way we see that every moment sequence is also a cumulant sequence (the converse cannot be true, since cumulants of even order ≥ 4 are in some cases negative, and also because the cumulant sequence of the normal distribution is not a moment sequence of any probability distribution).

Conditioning on a Bernoulli random variable

Suppose Y = 1 with probability p and Y = 0 with probability q = 1 − p. Suppose the conditional probability distribution of X given Y is F if Y = 1 and G if Y = 0. Then we have

κn(X) = p κn(F) + q κn(G) + Σ_{π < 1̂} κ_{|π|}(Y) ∏_{B ∈ π} (κ_{|B|}(F) − κ_{|B|}(G))

where π < 1̂ means π is a partition of the set { 1, ..., n } that is finer than the coarsest partition – the sum is over all partitions except that one. For example, if n = 3, then we have

κ3(X) = p κ3(F) + q κ3(G) + 3pq(κ2(F) − κ2(G))(κ1(F) − κ1(G)) + pq(q − p)(κ1(F) − κ1(G))³.
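A direct numeric check of the n = 3 formula, assuming for illustration that F and G are normal distributions (so κ3(F) = κ3(G) = 0); the parameter values below are arbitrary:

```python
# Compare kappa_3 of a two-normal mixture with the formula above.
p, q = 0.3, 0.7
m1, s1 = 2.0, 1.5   # F = Normal(mean m1, sd s1)
m0, s0 = -1.0, 0.5  # G = Normal(mean m0, sd s0)

# Left-hand side: kappa3 of the mixture from its exact raw moments
# M_k = p E(X^k | Y=1) + q E(X^k | Y=0), via kappa3 = M3 - 3 M2 M1 + 2 M1^3.
M1 = p*m1 + q*m0
M2 = p*(m1**2 + s1**2) + q*(m0**2 + s0**2)
M3 = p*(m1**3 + 3*m1*s1**2) + q*(m0**3 + 3*m0*s0**2)
lhs = M3 - 3*M2*M1 + 2*M1**3

# Right-hand side: the formula, with kappa3(F) = kappa3(G) = 0.
d1 = m1 - m0            # kappa1(F) - kappa1(G)
d2 = s1**2 - s0**2      # kappa2(F) - kappa2(G)
rhs = 3*p*q*d2*d1 + p*q*(q - p)*d1**3

print(lhs, rhs)  # both equal 6.048, up to rounding
```

Both sides agree because each is an exact expression for the third cumulant of the two-component mixture.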