Conditional entropy
In information theory, the conditional entropy (or equivocation) quantifies the remaining entropy (i.e. uncertainty) of a random variable Y given that the value of another random variable X is known. It is referred to as the entropy of Y conditional on X, and is written H(Y|X). Like other entropies, the conditional entropy is measured in bits, nats, or bans.
Definition
More precisely, if H(Y|X = x) is the entropy of the variable Y conditional on the variable X taking a certain value x, then H(Y|X) is the result of averaging H(Y|X = x) over all possible values x that X may take. Given discrete random variables X with support 𝒳 and Y with support 𝒴, the conditional entropy of Y given X is defined as:
\begin{align}
H(Y|X) &= \sum_{x \in \mathcal{X}} p(x)\, H(Y|X = x) \\
&= -\sum_{x \in \mathcal{X}} p(x) \sum_{y \in \mathcal{Y}} p(y|x) \log p(y|x) \\
&= -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y|x)
\end{align}
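As a concrete check of the definition, the following minimal sketch in Python evaluates the final double sum directly from a joint probability mass function. The helper name conditional_entropy and the dictionary representation of the pmf are illustrative choices, not part of the original article.

import math

def conditional_entropy(joint, base=2.0):
    """Compute H(Y|X) from a joint pmf given as {(x, y): p(x, y)}.

    base=2 gives bits; use math.e for nats or 10 for bans.
    """
    # Marginal p(x) = sum over y of p(x, y).
    p_x = {}
    for (x, _y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
    # H(Y|X) = -sum over (x, y) of p(x, y) * log p(y|x),
    # where p(y|x) = p(x, y) / p(x). Zero-probability pairs contribute nothing.
    h = 0.0
    for (x, _y), p_xy in joint.items():
        if p_xy > 0.0:
            h -= p_xy * math.log(p_xy / p_x[x], base)
    return h

# X is a fair bit; Y equals 0 whenever X = 0, and is a fair bit when X = 1,
# so H(Y|X) = 0.5 * 0 + 0.5 * 1 = 0.5 bits.
joint = {(0, 0): 0.5, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(joint))  # 0.5

Averaging the per-value entropies H(Y|X = x) weighted by p(x), as in the first line of the definition, gives the same result; the double-sum form simply expands that average.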