LT codes

In computer science, Luby transform codes (LT codes) are the first class of practical fountain codes that are near-optimal erasure correcting codes. They were invented by Michael Luby in 1998 and published in 2002. Like some other fountain codes, LT codes depend on sparse bipartite graphs to trade reception overhead for encoding and decoding speed. The distinguishing characteristic of LT codes is in employing a particularly simple algorithm based on the exclusive-or operation (⊕) to encode and decode the message.

LT codes are rateless because the encoding algorithm can in principle produce an infinite number of message packets (i.e., the percentage of packets that must be received to decode the message can be arbitrarily small). They are erasure correcting codes because they can be used to transmit digital data reliably on an erasure channel.

The next generation beyond LT codes is Raptor codes (see for example IETF RFC 5053), which have linear time encoding and decoding. Raptor codes use two encoding stages, the second of which is an LT encoding.

Why use an LT code?

The traditional scheme for transferring data across an erasure channel depends on continuous two-way communication.
  • The sender encodes and sends a packet of information.
  • The receiver attempts to decode the received packet. If it can be decoded, the receiver sends an acknowledgment back to the transmitter. Otherwise, the receiver asks the transmitter to send the packet again.
  • This two-way process continues until all the packets in the message have been transferred successfully.


Certain networks, such as ones used for cellular wireless broadcasting, do not have a feedback channel. Applications on these networks still require reliability. Fountain codes in general, and LT codes in particular, get around this problem by adopting an essentially one-way communication protocol.
  • The sender encodes and sends packet after packet of information.
  • The receiver evaluates each packet as it is received. If there is an error, the erroneous packet is discarded. Otherwise the packet is saved as a piece of the message.
  • Eventually the receiver has enough valid packets to reconstruct the entire message. When the entire message has been received successfully, the receiver signals that transmission is complete.

LT encoding

The encoding process begins by dividing the uncoded message into n blocks of roughly equal length. Encoded packets are then produced with the help of a pseudorandom number generator.
  • The degree d, 1 ≤ d ≤ n, of the next packet is chosen at random.
  • Exactly d blocks from the message are randomly chosen.
  • If Mi is the ith block of the message, the data portion of the next packet is computed as

    Mi1 ⊕ Mi2 ⊕ … ⊕ Mid,

    where {i1, i2, …, id} are the randomly chosen indices for the d blocks included in this packet.
  • A prefix is appended to the encoded packet, defining how many blocks n are in the message, how many blocks d have been exclusive-ored into the data portion of this packet, and the list of indices {i1, i2, …, id}.
  • Finally, some form of error-detecting code (perhaps as simple as a cyclic redundancy check) is applied to the packet, and the packet is transmitted.


This process continues until the receiver signals that the message has been received and successfully decoded.
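
The loop below is a minimal Python sketch of this encoding process under some simplifying assumptions that are not part of the LT specification: the message is a byte string, the packet "prefix" is modelled as a plain tuple, and a CRC-32 serves as the error-detecting code. The degree distribution is supplied by the caller (see the optimization section below).

    import random
    import zlib

    def lt_encode_packets(message, n, choose_degree):
        # Split the message into n blocks of roughly equal length,
        # padding the last block with zero bytes.
        block_len = -(-len(message) // n)                      # ceiling division
        blocks = [message[i * block_len:(i + 1) * block_len].ljust(block_len, b"\x00")
                  for i in range(n)]
        while True:                                            # rateless: packets forever
            d = choose_degree(n)                               # degree 1 <= d <= n, chosen at random
            indices = sorted(random.sample(range(n), d))       # d distinct block indices
            data = bytes(block_len)                            # all-zero starting block
            for i in indices:
                data = bytes(a ^ b for a, b in zip(data, blocks[i]))   # exclusive-or in block Mi
            crc = zlib.crc32(data)                             # simple error-detecting code
            yield (n, d, indices, crc, data)                   # prefix fields + data portion

    # Example use with a placeholder uniform degree distribution.
    packets = lt_encode_packets(b"hello fountain codes", n=5,
                                choose_degree=lambda n: random.randint(1, n))
    first_packet = next(packets)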

LT decoding

The decoding process uses the "exclusive or" operation to retrieve the encoded message.
  • If the current packet isn't clean, or if it replicates a packet that has already been processed, the current packet is discarded.
  • If the current cleanly received packet is of degree d > 1, it is first processed against all the fully decoded blocks in the message queueing area (as described more fully in the next step), then stored in a buffer area if its reduced degree is greater than 1.
  • When a new, clean packet of degree d = 1 (block Mi) is received (or the degree of the current packet is reduced to 1 by the preceding step), it is moved to the message queueing area, and then matched against all the packets of degree d > 1 residing in the buffer. It is exclusive-ored into the data portion of any buffered packet that was encoded using Mi, the degree of that matching packet is decremented, and the list of indices for that packet is adjusted to reflect the application of Mi.
  • When this process unlocks a block of degree d = 2 in the buffer, that block is reduced to degree 1 and is in its turn moved to the message queueing area, and then processed against the packets remaining in the buffer.
  • When all n blocks of the message have been moved to the message queueing area, the receiver signals the transmitter that the message has been successfully decoded.


This decoding procedure works because A ⊕ A = 0 for any bit string A. After d − 1 distinct blocks have been exclusive-ored into a packet of degree d, the original unencoded content of the unmatched block is all that remains. In symbols we have

    (Mi1 ⊕ Mi2 ⊕ … ⊕ Mid) ⊕ (Mi1 ⊕ … ⊕ Mik−1 ⊕ Mik+1 ⊕ … ⊕ Mid) = Mik,

where Mik is the one block of the packet that had not yet been matched.
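
The following Python sketch shows one way to implement this peeling procedure; it assumes the (n, d, indices, crc, data) packet framing used in the encoding sketch above and is illustrative rather than a reference implementation.

    import zlib

    def lt_decode(packet_stream, n):
        decoded = {}        # message queueing area: block index -> decoded block
        buffered = []       # packets of reduced degree > 1: [set of indices, data]

        def xor(a, b):
            return bytes(x ^ y for x, y in zip(a, b))

        for _, d, indices, crc, data in packet_stream:
            if zlib.crc32(data) != crc:
                continue                                 # discard packets that are not clean
            remaining = set(indices)
            for i in list(remaining):                    # reduce against already-decoded blocks
                if i in decoded:
                    data = xor(data, decoded[i])
                    remaining.discard(i)
            if len(remaining) > 1:
                buffered.append([remaining, data])       # still degree > 1: keep in the buffer
            elif len(remaining) == 1:
                ripple = [(remaining.pop(), data)]       # a degree-1 packet releases a block
                while ripple:
                    idx, block = ripple.pop()
                    if idx in decoded:
                        continue
                    decoded[idx] = block
                    for entry in buffered:               # propagate the new block through the buffer
                        if idx in entry[0]:
                            entry[1] = xor(entry[1], block)
                            entry[0].discard(idx)
                            if len(entry[0]) == 1:       # another block becomes available
                                ripple.append((next(iter(entry[0])), entry[1]))
                    buffered = [e for e in buffered if len(e[0]) > 1]
            if len(decoded) == n:
                return [decoded[i] for i in range(n)]

Feeding the generator from the encoding sketch into lt_decode(packets, n) returns the padded blocks; joining them and removing the zero padding (assuming the original message length is known) reproduces the original message.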

Variations

Several variations of the encoding and decoding processes described above are possible. For instance, instead of prefixing each packet with a list of the actual message block indices {i1, i2, …, id}, the encoder might simply send a short "key" which serves as the seed for the pseudorandom number generator (PRNG) or index table used to construct the list of indices. Since a receiver equipped with the same PRNG or index table can reliably recreate the "random" list of indices from this seed, the decoding process can be completed successfully. Alternatively, by combining a simple LT code of low average degree with a robust error-correcting code, a raptor code can be constructed that will outperform an optimized LT code in practice.
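
As a minimal illustration of the "key" variation, the sketch below seeds a PRNG with the short key so that sender and receiver regenerate the same degree and index list; the use of Python's built-in PRNG and the choose_degree helper are assumptions made for the example, not part of any standard.

    import random

    def degree_and_indices_from_key(key, n, choose_degree):
        rng = random.Random(key)                    # deterministic PRNG seeded by the packet's key
        d = choose_degree(n, rng)                   # degree drawn from the shared degree distribution
        indices = sorted(rng.sample(range(n), d))   # identical index list on sender and receiver
        return d, indices

    # Both ends call this with the same key and obtain the same (d, indices).
    d, indices = degree_and_indices_from_key(0xC0FFEE, n=100,
                                             choose_degree=lambda n, rng: rng.randint(1, n))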

Optimization of LT codes

There is only one parameter that can be used to optimize a straight LT code: the degree distribution function (described as a pseudorandom number generator for the degree d in the LT encoding section above). In practice the other "random" numbers (the list of indices {i1, i2, …, id}) are invariably drawn uniformly at random from {1, 2, …, n}, where n is the number of blocks into which the message has been divided.

Luby himself discussed the "ideal soliton distribution" defined by

    p(1) = 1/n
    p(d) = 1/(d(d − 1))   for d = 2, 3, …, n
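
For concreteness, here is a minimal Python sketch of a sampler for the ideal soliton distribution as defined above (sampling by walking the cumulative distribution); the robust variant discussed next modifies these probabilities and is not shown.

    import random

    def ideal_soliton_degree(n, rng=random):
        # p(1) = 1/n and p(d) = 1/(d(d - 1)) for d = 2, 3, ..., n.
        u = rng.random()                 # uniform draw in [0, 1)
        cumulative = 1.0 / n
        if u < cumulative:
            return 1
        for d in range(2, n + 1):
            cumulative += 1.0 / (d * (d - 1))
            if u < cumulative:
                return d
        return n                         # guard against floating-point round-off
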
This degree distribution theoretically minimizes the expected number of redundant code words that will be sent before the decoding process can be completed. However, the ideal soliton distribution does not work well in practice because any fluctuation around the expected behavior makes it likely that at some step in the decoding process there will be no available packet of (reduced) degree 1, so decoding will fail. Furthermore, some of the original blocks will not be exclusive-ored into any of the transmission packets. Therefore, in practice, a modified distribution, the "robust soliton distribution", is substituted for the ideal distribution. The effect of the modification is, generally, to produce more packets of very small degree (around 1) and fewer packets of degree greater than 1, except for a spike of packets at a fairly large quantity chosen to ensure that all original blocks will be included in some packet.