Timeline of information theory
A timeline of events related to information theory, quantum information theory, data compression, error-correcting codes and related subjects.
  • 1872 – Ludwig Boltzmann presents his H-theorem, and with it the formula Σ pᵢ log pᵢ for the entropy of a single gas particle (an entropy sketch follows the timeline).
  • 1878 – J. Willard Gibbs defines the Gibbs entropy: the probabilities in the entropy formula are now taken as probabilities of the state of the whole system.
  • 1924 – Harry Nyquist discusses quantifying "intelligence" and the speed at which it can be transmitted by a communication system.
  • 1927 – John von Neumann defines the von Neumann entropy, extending the Gibbs entropy to quantum mechanics.
  • 1928 – Ralph Hartley introduces Hartley information as the logarithm of the number of possible messages, with information being communicated when the receiver can distinguish one sequence of symbols from any other (regardless of any associated meaning).
  • 1929 – Leó Szilárd analyses Maxwell's demon, showing how a Szilard engine can sometimes transform information into the extraction of useful work.
  • 1940 – Alan Turing introduces the deciban as a measure of information inferred about the German Enigma machine cypher settings by the Banburismus process.
  • 1944 – Claude Shannon's theory of information is substantially complete.
  • 1947 – Richard W. Hamming invents Hamming codes for error detection and correction; for patent reasons, the result is not published until 1950 (a Hamming(7,4) sketch follows the timeline).
  • 1948 – Claude E. Shannon publishes A Mathematical Theory of Communication.
  • 1949 – Claude E. Shannon publishes Communication in the Presence of Noise – the Nyquist–Shannon sampling theorem and the Shannon–Hartley law.
  • 1949 – Claude E. Shannon's Communication Theory of Secrecy Systems is declassified.
  • 1949 – Robert M. Fano publishes Transmission of Information (M.I.T. Press, Cambridge, Mass.) – Shannon–Fano coding.
  • 1949 – Leon G. Kraft discovers Kraft's inequality, which shows the limits of prefix codes (a worked check follows the timeline).
  • 1949 – Marcel J. E. Golay introduces Golay codes for forward error correction.
  • 1951 – Solomon Kullback and Richard Leibler introduce the Kullback–Leibler divergence (sketched after the timeline).
  • 1951 – David A. Huffman invents Huffman encoding, a method of finding optimal prefix codes for lossless data compression (sketched after the timeline).
  • 1953 – August Albert Sardinas and George W. Patterson devise the Sardinas–Patterson algorithm, a procedure to decide whether a given variable-length code is uniquely decodable.
  • 1954 – Irving S. Reed and D. E. Muller propose Reed–Muller codes.
  • 1955 – Peter Elias introduces convolutional codes.
  • 1957 – Eugene Prange first discusses cyclic redundancy codes
  • 1959 – Alexis Hocquenghem, and independently the next year Raj Chandra Bose and Dwijendra Kumar Ray-Chaudhuri, discover BCH codes.
  • 1960 – Irving S. Reed and Gustave Solomon propose Reed–Solomon codes.
  • 1962 – Robert G. Gallager proposes low-density parity-check codes; they remain unused for 30 years due to technical limitations.
  • 1965 – Dave Forney discusses concatenated codes.
  • 1967 – Andrew Viterbi reveals the Viterbi algorithm, making decoding of convolutional codes practicable.
  • 1968 – Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year.
  • 1968 – Chris Wallace and David M. Boulton publish the first of many papers on Minimum Message Length (MML) statistical and inductive inference.
  • 1970 – Valerii Denisovich Goppa introduces Goppa codes.
  • 1972 – J. Justesen proposes Justesen codes, an improvement of Reed–Solomon codes.
  • 1973 – David Slepian and Jack Wolf discover and prove the Slepian–Wolf coding limits for distributed source coding.
  • 1976 – Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to a raising of analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s.
  • 1976 – R. Pasco and Jorma J. Rissanen develop effective arithmetic coding techniques.
  • 1977 – Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77).
  • 1989 – Phil Katz publishes the .zip format, including DEFLATE (LZ77 + Huffman coding); the container later becomes the most widely used archive format and DEFLATE the most widely used lossless compression algorithm.
  • 1993 – Claude Berrou, Alain Glavieux and Punya Thitimajshima introduce turbo codes.
  • 1994 – Michael Burrows and David Wheeler publish the Burrows–Wheeler transform, later to find use in bzip2 (sketched after the timeline).
  • 1995 – Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem.
  • 2001 – Sam Kwong and Yu Fan Ho propose Statistical Lempel–Ziv, a lossless data compression variant of the Lempel–Ziv method.
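
The short Python sketches below illustrate a few of the constructions named in the timeline. All function names, variable names and example inputs are illustrative assumptions chosen for exposition, not drawn from the original papers.

The entropy formula that opens the timeline, Σ pᵢ log pᵢ (carried with a minus sign in the Gibbs and Shannon conventions), reduces to a one-line computation over a probability distribution:

    import math

    def shannon_entropy(probs, base=2):
        """H = -sum(p * log p) over outcomes with p > 0, in units set by base."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # A fair coin carries exactly 1 bit per toss; a biased coin carries less.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    print(shannon_entropy([0.9, 0.1]))   # ~0.469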
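
Hamming's 1947 codes add parity bits so that any single-bit error can be located and corrected. A sketch of the classic Hamming(7,4) code follows; the bit layout (parity bits at positions 1, 2 and 4) is one common convention, not the only one in use:

    def hamming74_encode(d):
        """Encode 4 data bits as 7 bits with parity at positions 1, 2, 4 (1-indexed)."""
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4          # covers positions 3, 5, 7
        p2 = d1 ^ d3 ^ d4          # covers positions 3, 6, 7
        p3 = d2 ^ d3 ^ d4          # covers positions 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_correct(c):
        """Locate and flip a single erroneous bit, then return the 4 data bits."""
        c = list(c)
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3   # 1-indexed error position, 0 if none
        if syndrome:
            c[syndrome - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]

    word = hamming74_encode([1, 0, 1, 1])
    word[5] ^= 1                          # flip one bit in transit
    print(hamming74_correct(word))        # [1, 0, 1, 1] -- recovered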
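
Kraft's inequality (1949) says that a binary prefix code with codeword lengths ℓ₁, …, ℓₙ exists if and only if Σ 2^(−ℓᵢ) ≤ 1, so checking a proposed set of lengths is immediate:

    def kraft_sum(lengths, alphabet_size=2):
        """Sum of alphabet_size**(-l) over the proposed codeword lengths."""
        return sum(alphabet_size ** -l for l in lengths)

    # Lengths (1, 2, 2) satisfy the inequality: a prefix code such as
    # {0, 10, 11} exists. Lengths (1, 1, 2) sum to 1.25, so none can.
    print(kraft_sum([1, 2, 2]) <= 1)  # True
    print(kraft_sum([1, 1, 2]) <= 1)  # False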
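
The Kullback–Leibler divergence (1951), D(P‖Q) = Σ p log(p/q), measures the inefficiency of coding data from P with a code optimized for Q:

    import math

    def kl_divergence(p, q, base=2):
        """D(P || Q); assumes q > 0 wherever p > 0."""
        return sum(pi * math.log(pi / qi, base) for pi, qi in zip(p, q) if pi > 0)

    # Extra bits per symbol paid for coding a fair coin (P)
    # with a code built for a 90/10 coin (Q):
    print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ~0.737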
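
Huffman's 1951 construction builds an optimal prefix code by repeatedly merging the two least probable symbols. A compact sketch using Python's heapq; the tie-breaking counter is an implementation detail, not part of the algorithm as published:

    import heapq

    def huffman_code(freqs):
        """Map each symbol to a prefix-free bit string, given symbol frequencies."""
        # Each heap entry: (weight, tiebreak, {symbol: code-so-far}).
        heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)
            w2, _, c2 = heapq.heappop(heap)
            # Prefix the lighter subtree's codes with 0, the heavier's with 1.
            merged = {s: "0" + c for s, c in c1.items()}
            merged.update({s: "1" + c for s, c in c2.items()})
            heapq.heappush(heap, (w1 + w2, count, merged))
            count += 1
        return heap[0][2]

    print(huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
    # e.g. {'a': '0', 'c': '100', 'b': '101', 'f': '1100', 'e': '1101', 'd': '111'}
    # (exact codes depend on merge order; the code lengths are what is optimal)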
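
The Burrows–Wheeler transform (1994) permutes a string so that equal characters cluster together, which is what bzip2 then exploits for compression. A naive sketch; production implementations use suffix arrays rather than sorting all rotations, and the NUL sentinel is an illustrative convention:

    def bwt(s, sentinel="\0"):
        """Return the last column of the sorted rotations of s + sentinel."""
        s += sentinel                      # unique terminator so the transform inverts
        rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
        return "".join(rot[-1] for rot in rotations)

    print(bwt("banana"))   # 'annb\x00aa' -- the a's and n's now sit together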

The source of this article is wikipedia, the free encyclopedia.  The text of this article is licensed under the GFDL.
 