Arbitrarily varying channel
An arbitrarily varying channel (AVC) is a communication channel model used in coding theory, and was first introduced by Blackwell, Breiman, and Thomasian. This particular channel has unknown parameters that can change over time, and these changes may not have a uniform pattern during the transmission of a codeword. $n$ uses of this channel can be described using a stochastic matrix $W^n : X^n \times S^n \rightarrow Y^n$, where $X$ is the input alphabet, $Y$ is the output alphabet, and $W^n(y|x, s)$ is the probability, over a given set of states $S$, that the transmitted input $x = (x_1, \ldots, x_n)$ leads to the received output $y = (y_1, \ldots, y_n)$. The state $s_i$ in set $S$ can vary arbitrarily at each time unit $i$. This channel was developed as an alternative to Shannon's binary symmetric channel (BSC), where the entire nature of the channel is known, to be more realistic to actual network channel situations.
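To make the model concrete, here is a minimal sketch of a toy AVC in Python, representing $W(y|x, s)$ as one stochastic matrix per state; the two-state binary channel (crossover probability picked by the state) and all names are invented for illustration, not taken from the original text.

```python
import numpy as np

# Toy AVC: one stochastic matrix W(y|x, s) per state s.
# Rows are inputs x, columns are outputs y; each row sums to 1.
# Hypothetical two-state example: a binary channel whose crossover
# probability is 0.1 in state 0 and 0.4 in state 1.
W = {
    0: np.array([[0.9, 0.1],
                 [0.1, 0.9]]),
    1: np.array([[0.6, 0.4],
                 [0.4, 0.6]]),
}

def avc_prob(y_seq, x_seq, s_seq, W):
    """P(y|x, s) for length-n sequences: given the state sequence the
    channel acts independently at each time unit, so terms multiply."""
    p = 1.0
    for x, s, y in zip(x_seq, s_seq, y_seq):
        p *= W[s][x, y]
    return p

# The state sequence may vary arbitrarily from one time unit to the next:
print(avc_prob(y_seq=[0, 1, 1], x_seq=[0, 1, 1], s_seq=[0, 1, 0], W=W))
```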
Capacity of deterministic AVCs

An AVC's capacity can vary depending on certain parameters.

$R$ is an achievable rate for a deterministic AVC code if it is larger than $0$, and if for every positive $\varepsilon$ and $\delta$, and very large $n$, length-$n$ block codes exist that satisfy the following equations: $\frac{1}{n}\log N > R - \delta$ and $\displaystyle\max_{s \in S^n} \bar{e}(s) \leq \varepsilon$, where $N$ is the number of codewords and $\bar{e}(s)$ is the average probability of error for a state sequence $s$. The largest achievable rate $R$ represents the capacity of the AVC, denoted by $c$.
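As a small worked instance of the rate condition (the numbers are invented): a code with $N = 2^{200}$ codewords of block length $n = 250$ has rate $0.8$ bits per channel use, so it witnesses achievability of any $R < 0.8 + \delta$, provided the error condition also holds for every state sequence.

```python
import math

def code_rate(N, n):
    """(1/n) * log2(N): the rate of a length-n block code with N codewords."""
    return math.log2(N) / n

print(code_rate(N=2**200, n=250))  # -> 0.8
```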
The only useful situations are when the capacity of the AVC is greater than $0$, because then the channel can transmit a guaranteed amount of data without errors. So we start out with a theorem that shows when $c$ is positive in an AVC, and the theorems discussed afterward will narrow down the range of $c$ for different circumstances.
Before stating Theorem 1, a few definitions need to be addressed:

- An AVC is symmetric if $\displaystyle\sum_{s \in S} W(y|x, s)U(s|x') = \sum_{s \in S} W(y|x', s)U(s|x)$ for every $(x, x', y)$, where $x, x' \in X$, $y \in Y$, and $U(s|x)$ is a channel function $U : X \rightarrow S$.
- $X_r$, $S_r$, and $Y_r$ are all random variables in sets $X$, $S$, and $Y$ respectively.
- $P_{X_r}(x)$ is equal to the probability that the random variable $X_r$ is equal to $x$.
- $P_{S_r}(s)$ is equal to the probability that the random variable $S_r$ is equal to $s$.
- $P_{X_r S_r Y_r}$ is the combined probability mass function (pmf) of $X_r$, $S_r$, and $Y_r$. It is defined formally as $P_{X_r S_r Y_r}(x, s, y) = P_{X_r}(x)P_{S_r}(s)W(y|x, s)$.
- $H(X_r)$ is the entropy of $X_r$.
- $H(X_r|Y_r)$ is the conditional entropy of $X_r$ given $Y_r$: the average uncertainty remaining about $X_r$ given all the values $Y_r$ could possibly take.
- $I(X_r \wedge Y_r)$ is the mutual information of $X_r$ and $Y_r$, and is equal to $H(X_r) - H(X_r|Y_r)$.
- $\displaystyle I(P) = \min_{Y_r} I(X_r \wedge Y_r)$, where the minimum is over all random variables $Y_r$ such that $X_r$, $S_r$, and $Y_r$ are distributed in the form of $P_{X_r S_r Y_r}$ (a numeric sketch of this quantity follows this list).
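The quantity $I(P)$ lends itself to numeric approximation for small alphabets. Below is a sketch under the assumption of the two-state toy AVC from earlier: because $P_{X_r S_r Y_r}$ factors as $P(x)P_{S_r}(s)W(y|x,s)$, each admissible $Y_r$ corresponds to a state pmf, so a grid search over state pmfs stands in for the true minimisation. All function names are invented.

```python
import numpy as np

def mutual_information(P, Wc):
    """I(X_r ^ Y_r) in bits for input pmf P and channel matrix Wc."""
    joint = P[:, None] * Wc                   # joint pmf P(x, y)
    Py = joint.sum(axis=0)                    # output marginal
    mask = joint > 0
    indep = (P[:, None] * Py[None, :])[mask]  # product of marginals
    return float((joint[mask] * np.log2(joint[mask] / indep)).sum())

def I_P(P, W_list, grid=200):
    """Approximate I(P) = min over Y_r of I(X_r ^ Y_r), where Y_r is
    induced by some state pmf; two states assumed, grid-searched."""
    return min(
        mutual_information(P, q * W_list[0] + (1 - q) * W_list[1])
        for q in np.linspace(0.0, 1.0, grid + 1))

W_list = [np.array([[0.9, 0.1], [0.1, 0.9]]),
          np.array([[0.6, 0.4], [0.4, 0.6]])]
print(I_P(np.array([0.5, 0.5]), W_list))  # worst-case state mix wins
```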
Theorem 1: $c > 0$ if and only if the AVC is not symmetric. If $c > 0$, then $\displaystyle c = \max_P I(P)$.
Proof of first part for symmetry: If we can prove that $I(P)$ is positive when the AVC is not symmetric, and then prove that $c = \max_P I(P)$, we will be able to prove Theorem 1. Assume $I(P)$ were equal to $0$. From the definition of $I(P)$, this would make $X_r$ and $Y_r$ independent random variables for some $S_r$, because this would mean that neither random variable's entropy would rely on the other random variable's value. By using the equation for $P_{X_r S_r Y_r}$ (and remembering $P_{X_r} = P$), we can get

$$P_{Y_r}(y) = \sum_{x \in X} \sum_{s \in S} P(x) P_{S_r}(s) W(y|x, s).$$

Since $X_r$ and $Y_r$ are independent random variables, the inner sum $\sum_{s \in S} P_{S_r}(s) W(y|x, s) = P_{Y_r}(y \mid X_r = x)$ takes the same value for every $x$ with $P(x) > 0$. Because only $P(x)$ depends on $x$ now,

$$P_{Y_r}(y) = \left[ \sum_{x \in X} P(x) \right] \sum_{s \in S} P_{S_r}(s) W(y|x, s),$$

and because $\sum_{x \in X} P(x) = 1$,

$$P_{Y_r}(y) = \sum_{s \in S} P_{S_r}(s) W(y|x, s).$$

So now we have a probability distribution on $Y_r$ that is independent of $X_r$. The definition of a symmetric AVC can therefore be rewritten as follows:

$$\sum_{s \in S} W(y|x, s) P_{S_r}(s) = \sum_{s \in S} W(y|x', s) P_{S_r}(s);$$

since $U(s|x)$ and $U(s|x')$ were both functions based on $x$, they have been replaced with a function based on $s$ only, $P_{S_r}(s)$. Both sides are now equal to the $P_{Y_r}(y)$ we calculated earlier, so the AVC is indeed symmetric when $I(P)$ is equal to $0$. Therefore $c$ can only be positive if the AVC is not symmetric.
Proof of second part for capacity: See the paper "The capacity of the arbitrarily varying channel revisited: positivity, constraints," referenced below for full proof.
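The symmetry condition itself is easy to test numerically once a candidate channel function $U$ is fixed. A sketch follows; note that deciding whether any symmetrizing $U$ exists is a separate feasibility problem, and this code only verifies the identity for one given $U$ (all names and the example channel are invented).

```python
import numpy as np
from itertools import product

def symmetrizes(W_list, U, tol=1e-9):
    """Check sum_s W(y|x,s)U(s|x') == sum_s W(y|x',s)U(s|x) for every
    (x, x', y). W_list[s] is the |X| x |Y| matrix of state s;
    U[x, s] = U(s|x), with each row of U summing to 1."""
    nX, nY = W_list[0].shape
    nS = len(W_list)
    for x, x2, y in product(range(nX), range(nX), range(nY)):
        lhs = sum(W_list[s][x, y] * U[x2, s] for s in range(nS))
        rhs = sum(W_list[s][x2, y] * U[x, s] for s in range(nS))
        if abs(lhs - rhs) > tol:
            return False
    return True

# Example: the output is x XOR s, so the state can perfectly imitate
# any input. U(s|x) = 1 when s = x symmetrizes this AVC, and by
# Theorem 1 its deterministic capacity is 0.
W_list = [np.array([[1.0, 0.0], [0.0, 1.0]]),
          np.array([[0.0, 1.0], [1.0, 0.0]])]
U = np.array([[1.0, 0.0],
              [0.0, 1.0]])
print(symmetrizes(W_list, U))  # -> True
```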
Capacity of AVCs with input and state constraints

The next theorem will deal with the capacity for AVCs with input and/or state constraints. These constraints help to decrease the very large range of possibilities for transmission and error on an AVC, making it a bit easier to see how the AVC behaves.
Before we go on to Theorem 2, we need a few definitions and lemmas:

For such AVCs, there exist:

- An input constraint $\Gamma$ based on the equation $g(x) = \frac{1}{n}\sum_{i=1}^{n} g(x_i)$, where $x \in X$ and $x = (x_1, \ldots, x_n)$.
- A state constraint $\Lambda$, based on the equation $l(s) = \frac{1}{n}\sum_{i=1}^{n} l(s_i)$, where $s \in S$ and $s = (s_1, \ldots, s_n)$.
- $\Lambda_0(P)$, the smallest average state cost at which the states can symmetrize the AVC against inputs of type $P$ (see the paper "The capacity of the arbitrarily varying channel revisited: positivity, constraints" for its formal definition).
- $I(P, \Lambda)$ is very similar to the $I(P)$ equation mentioned previously, $\displaystyle I(P, \Lambda) = \min_{Y_r} I(X_r \wedge Y_r)$, but now any state $s$ or random variable $S_r$ in the equation must follow the $l(s) \leq \Lambda$ state restriction.
Assume $g(x)$ is a given non-negative-valued function on $X$ and $l(s)$ is a given non-negative-valued function on $S$, and that the minimum value for each is $0$. The literature on this subject never formally describes the exact definitions of $g$ and $l$ for a single letter $x_i$ or $s_i$. The usefulness of the input constraint $\Gamma$ and the state constraint $\Lambda$ will be based on these equations.
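A sketch of how the two constraints act on a particular codeword and state sequence; since the text leaves $g$ and $l$ unspecified, the per-letter costs below (the value of each binary letter) are invented for illustration.

```python
def satisfies_constraints(x_seq, s_seq, g, l, Gamma, Lambda):
    """Average per-letter costs g(x) and l(s) of a codeword and a
    state sequence, compared against the constraints."""
    g_avg = sum(g(xi) for xi in x_seq) / len(x_seq)
    l_avg = sum(l(si) for si in s_seq) / len(s_seq)
    return g_avg <= Gamma, l_avg <= Lambda

# Hypothetical costs: count of 1s (think transmit power per letter for
# the sender, jamming power per letter for the states).
print(satisfies_constraints(x_seq=[0, 1, 1, 0], s_seq=[0, 0, 1, 0],
                            g=lambda x: x, l=lambda s: s,
                            Gamma=0.5, Lambda=0.3))  # -> (True, True)
```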
For AVCs with input and/or state constraints, the rate $R$ is now limited to codewords $x_1, \ldots, x_N$ that satisfy $g(x_i) \leq \Gamma$, and the state $s$ is now limited to all states that satisfy $l(s) \leq \Lambda$. The largest rate is still considered the capacity of the AVC, and is now denoted as $c(\Gamma, \Lambda)$.
Lemma 1: Any codes where $\Lambda$ is greater than $\Lambda_0(P)$ cannot be considered "good" codes, because those kinds of codes have a maximum average probability of error greater than or equal to $\frac{N-1}{N} - \frac{l_{\max}}{n\left(\Lambda - \Lambda_0(P)\right)^2}$, where $l_{\max}$ is the maximum value of $l(s)$. This is not a good maximum average error probability, because it is fairly large: $\frac{N-1}{N}$ is close to $1$, and the subtracted term is very small, since the $\left(\Lambda - \Lambda_0(P)\right)$ value is squared with $\Lambda$ set to be larger than $\Lambda_0(P)$, and the denominator also grows with $n$. Therefore it would be very unlikely to receive a codeword without error. This is why the $\Lambda_0(P) \geq \Lambda + \alpha$ condition is present in Theorem 2.
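Plugging invented numbers into the Lemma 1 bound shows how quickly the error floor approaches $1$ once $\Lambda$ exceeds $\Lambda_0(P)$:

```python
def lemma1_error_floor(N, n, l_max, Lambda, Lambda0):
    """Lower bound (N-1)/N - l_max / (n * (Lambda - Lambda0)^2) on the
    maximum average error probability when Lambda > Lambda0."""
    return (N - 1) / N - l_max / (n * (Lambda - Lambda0) ** 2)

# With 1024 codewords, block length 1000, l_max = 1 and a state-cost
# gap of 0.2, the floor is already about 0.974.
print(lemma1_error_floor(N=1024, n=1000, l_max=1.0,
                         Lambda=0.5, Lambda0=0.3))
```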
Theorem 2: Given a positive $\Lambda$ and arbitrarily small $\alpha > 0$, $\beta > 0$, $\delta > 0$, for any block length $n \geq n_0$ and for any type $P$ with conditions $\Lambda_0(P) \geq \Lambda + \alpha$ and $\displaystyle\min_{x \in X} P(x) \geq \beta$, and where $P_{X_r} = P$, there exists a code with codewords $x_1, \ldots, x_N$, each of type $P$, that satisfies the following equations: $\frac{1}{n}\log N > I(P, \Lambda) - \delta$, $\displaystyle\max_{l(s) \leq \Lambda} \bar{e}(s) \leq \exp(-n\gamma)$, and where positive $n_0$ and $\gamma$ depend only on $\alpha$, $\beta$, $\delta$, and the given AVC.
Proof of Theorem 2: See the paper "The capacity of the arbitrarily varying channel revisited: positivity, constraints," referenced below for full proof.
Capacity of randomized AVCs

The next theorem will be for AVCs with randomized codes. For such AVCs the code is a random variable with values from a family of length-$n$ block codes, and these codes are not allowed to depend on the actual value of the codeword. Because of their random nature, these codes have the same maximum and average error probability for any channel. These types of codes also help to make certain properties of the AVC clearer.
Before we go on to Theorem 3, we need to define a couple of important terms first:

$$W_\zeta(y|x) = \sum_{s \in S} W(y|x, s) P_{S_r}(s)$$

$I(P, \zeta)$ is very similar to the $I(P)$ equation mentioned previously, $\displaystyle I(P, \zeta) = \min_{Y_r} I(X_r \wedge Y_r)$, but now the pmf $P_{S_r}(s)$ is added to the equation, making the minimum of $I(P, \zeta)$ based on a new form of $P_{X_r S_r Y_r}$, where $W_\zeta(y|x)$ replaces $W(y|x, s)$.
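Computing the averaged channel $W_\zeta$ is a one-liner; a sketch reusing the invented two-state AVC from earlier:

```python
import numpy as np

def averaged_channel(W_list, zeta):
    """W_zeta(y|x) = sum_s W(y|x, s) * zeta(s): the AVC averaged over
    a state pmf zeta, per the definition above."""
    return sum(z * Wi for z, Wi in zip(zeta, W_list))

W_list = [np.array([[0.9, 0.1], [0.1, 0.9]]),
          np.array([[0.6, 0.4], [0.4, 0.6]])]
print(averaged_channel(W_list, zeta=[0.5, 0.5]))
# -> [[0.75 0.25]
#     [0.25 0.75]]
```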
Theorem 3: The capacity for randomized codes of the AVC is $\displaystyle c = \max_P \min_\zeta I(P, \zeta)$.
Proof of Theorem 3: See the paper "The Capacities of Certain Channel Classes Under Random Coding," referenced below for full proof.
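Under the same two-state assumptions as the earlier sketches, Theorem 3's max-min can be approximated by a grid search over input and state pmfs. This is an illustration of the formula, not the proof technique, and the names are invented.

```python
import numpy as np

def mutual_information(P, Wc):
    """I(X_r ^ Y_r) in bits for input pmf P and channel matrix Wc."""
    joint = P[:, None] * Wc
    Py = joint.sum(axis=0)
    mask = joint > 0
    indep = (P[:, None] * Py[None, :])[mask]
    return float((joint[mask] * np.log2(joint[mask] / indep)).sum())

def randomized_capacity(W_list, grid=100):
    """Approximate max over P of min over zeta of I(P, zeta) for a
    binary-input, two-state AVC."""
    best = 0.0
    for i in range(grid + 1):
        P = np.array([i / grid, 1 - i / grid])
        worst = min(
            mutual_information(P, q * W_list[0] + (1 - q) * W_list[1])
            for q in np.linspace(0.0, 1.0, grid + 1))
        best = max(best, worst)
    return best

W_list = [np.array([[0.9, 0.1], [0.1, 0.9]]),
          np.array([[0.6, 0.4], [0.4, 0.6]])]
print(randomized_capacity(W_list))  # about 1 - H(0.4), roughly 0.029
```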
See also
- Binary symmetric channel
- Binary erasure channel
- Z-channel (information theory)
- Channel model
- Information theory
- Coding theory