Sequential decoding
Sequential decoding is a limited-memory technique for decoding tree codes. It is mainly used as an approximate decoding algorithm for long constraint-length convolutional codes. This approach may not be as accurate as the Viterbi algorithm but can save a substantial amount of computer memory.
Sequential decoding explores the tree code in such a way as to try to minimise the computational cost and the memory required to store the tree.
There is a range of sequential decoding approaches based on the choice of metric and algorithm. Metrics include:
- Fano metric
- Zigangirov metric
- Gallager metric
Algorithms include:
- Stack algorithm
- Fano algorithm
- Creeper algorithm
Fano metric
Given a partially explored tree (represented by a set of nodes which are the limit of exploration), we would like to know the best node from which to explore further. The Fano metric (named after Robert Fano) allows one to calculate which is the best node to explore further. This metric is optimal given no other constraints (e.g. memory).
For a binary symmetric channel (with error probability $p$) the Fano metric can be derived via Bayes' theorem. We are interested in following the most likely path $P_i$ given an explored state of the tree $X$ and a received sequence $\mathbf{r}$. Using the language of probability and Bayes' theorem, we want to choose the maximum over $P_i$ of:
$$\Pr(P_i \mid X, \mathbf{r}) \propto \Pr(\mathbf{r} \mid P_i, X)\,\Pr(P_i \mid X)$$
We now introduce the following notation:
- $N$ to represent the maximum length of transmission in branches
- $b$ to represent the number of bits on a branch of the code (the denominator of the code rate $R$)
- $d_i$ to represent the number of bit errors on path $P_i$ (the Hamming distance between the branch labels and the received sequence)
- $n_i$ to be the length of $P_i$ in branches.
We express the likelihood $\Pr(\mathbf{r} \mid P_i, X)$ as $p^{d_i} (1-p)^{n_i b - d_i}\, 2^{-(N - n_i) b}$ (by using the binary symmetric channel likelihood for the first $n_i b$ bits followed by a uniform prior over the remaining bits).
We express the prior $\Pr(P_i \mid X)$ in terms of the number of branch choices one has made, $n_i$, and the number of branches from each node, $2^{Rb}$.
Therefore:
$$\Pr(P_i \mid X, \mathbf{r}) \propto p^{d_i} (1-p)^{n_i b - d_i}\, 2^{-(N - n_i) b}\, 2^{-n_i R b}$$
We can equivalently maximise the logarithm of this probability, i.e.
$$d_i \log_2 p + (n_i b - d_i)\log_2(1-p) - (N - n_i)\,b - n_i R b.$$
Dropping the term $-Nb$, which is the same for every path, this can be rewritten as
$$d_i\left(\log_2 p + 1 - R\right) + (n_i b - d_i)\left(\log_2(1-p) + 1 - R\right).$$
This last expression is the Fano metric. The important point to see is that we have two terms here: one based on the number of wrong bits and one based on the number of right bits. We can therefore update the Fano metric simply by adding $\log_2 p + 1 - R$ for each non-matching bit and $\log_2(1-p) + 1 - R$ for each matching bit.
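As an illustration of this bit-by-bit update, the following minimal Python sketch computes the Fano metric of a candidate path over a binary symmetric channel; the function name, arguments and parameter values are assumptions made for the example rather than anything specified above.

```python
import math

def fano_metric(received_bits, path_bits, p, rate):
    """Fano metric of a path over a BSC with crossover probability p.

    Adds log2(p) + 1 - R for each bit that disagrees with the received
    sequence and log2(1 - p) + 1 - R for each bit that agrees.
    (Illustrative helper; the names are not taken from the article.)
    """
    match_gain = math.log2(1 - p) + 1 - rate      # contribution of a matching bit
    mismatch_penalty = math.log2(p) + 1 - rate    # contribution of a non-matching bit
    metric = 0.0
    for r_bit, c_bit in zip(received_bits, path_bits):
        metric += match_gain if r_bit == c_bit else mismatch_penalty
    return metric

# Example: rate-1/2 code (R = 0.5), BSC with p = 0.05
received = [1, 0, 1, 1, 0, 0]
candidate = [1, 0, 1, 0, 0, 0]   # one bit differs from the received sequence
print(fano_metric(received, candidate, p=0.05, rate=0.5))
```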
Computational cutoff rate
For sequential decoding to be a good choice of decoding algorithm, the number of states explored needs to remain small (otherwise an algorithm which deliberately explores all states, e.g. the Viterbi algorithm, may be more suitable). For a particular noise level there is a maximum coding rate $R_0$, called the computational cutoff rate, below which there is a finite backtracking limit. For the binary symmetric channel:
$$R_0 = 1 - \log_2\!\left(1 + 2\sqrt{p(1-p)}\right)$$
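As a quick numerical check (assuming the hard-decision form of $R_0$ given above), the cutoff rate can be evaluated for a given crossover probability; the values chosen below are purely illustrative:

```python
import math

def cutoff_rate_bsc(p):
    """Computational cutoff rate R0 of a binary symmetric channel
    with crossover probability p (hard-decision form)."""
    return 1 - math.log2(1 + 2 * math.sqrt(p * (1 - p)))

# A rate-1/2 code stays below the cutoff rate for p = 0.01 but not for p = 0.1
for p in (0.01, 0.1):
    print(f"p = {p}: R0 = {cutoff_rate_bsc(p):.3f}")
```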
Stack algorithm
The simplest algorithm to describe is the "stack algorithm" in which the best $M$ paths found so far are stored. Sequential decoding may introduce an additional error above Viterbi decoding when the correct path has $M$ or more higher-scoring paths above it; at this point the best path will drop off the stack and no longer be considered.
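The following Python sketch shows the structure of the stack algorithm for a small, hypothetical rate-1/2 convolutional code scored with the Fano metric; the generator polynomials, channel parameters and stack size are illustrative assumptions, not something taken from the article.

```python
import math
import heapq

# Hypothetical rate-1/2 convolutional encoder (generators 7, 5 in octal,
# constraint length 3); everything below is an illustrative sketch.
G = (0b111, 0b101)   # generator polynomials
K = 3                # constraint length

def encode_branch(state, bit):
    """Return (output bits, next state) for one input bit."""
    reg = (bit << (K - 1)) | state
    out = tuple(bin(reg & g).count("1") % 2 for g in G)
    return out, reg >> 1

def fano_branch_metric(received, produced, p, rate):
    """Fano metric contribution of one branch over a BSC."""
    m = 0.0
    for r, c in zip(received, produced):
        m += (math.log2(1 - p) if r == c else math.log2(p)) + 1 - rate
    return m

def stack_decode(received, p=0.05, rate=0.5, stack_size=64):
    """Stack-algorithm sequential decoder sketch.

    `received` is a list of 2-bit tuples, one per tree branch.
    Keeps at most `stack_size` partial paths, always extending the one
    with the highest Fano metric.
    """
    # Each heap entry: (-metric, input bits so far, encoder state)
    stack = [(0.0, (), 0)]
    while stack:
        neg_metric, bits, state = heapq.heappop(stack)
        depth = len(bits)
        if depth == len(received):          # reached the end of the tree
            return list(bits)
        for bit in (0, 1):                  # extend the best path by one branch
            out, nxt = encode_branch(state, bit)
            m = -neg_metric + fano_branch_metric(received[depth], out, p, rate)
            heapq.heappush(stack, (-m, bits + (bit,), nxt))
        if len(stack) > stack_size:         # drop the lowest-scoring paths
            stack = heapq.nsmallest(stack_size, stack)
            heapq.heapify(stack)
    return []

# Example: encode a short message, flip one channel bit, then decode.
message = [1, 0, 1, 1, 0, 0, 0]             # tail zeros flush the encoder
state, channel = 0, []
for b in message:
    out, state = encode_branch(state, b)
    channel.append(list(out))
channel[2][0] ^= 1                           # a single channel error
print(stack_decode([tuple(c) for c in channel]))
```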
Fano algorithm
The famous Fano algorithm (named after Robert Fano) has a very low memory requirement and hence is suited to hardware implementations. This algorithm explores backwards and forwards from a single point on the tree.
External links
- "Correction trees" - simulator of correction process using priority queue to choose maximum metric node (called weight)