List of information theory topics
This is a list of information theory topics, by Wikipedia page.
  • A Mathematical Theory of Communication
    A Mathematical Theory of Communication
    "A Mathematical Theory of Communication" is an influential 1948 article by mathematician Claude E. Shannon. As of November 2011, Google Scholar has listed more than 48,000 unique citations of the article and the later-published book version...

  • algorithmic information theory
    Algorithmic information theory
    Algorithmic information theory is a subfield of information theory and computer science that concerns itself with the relationship between computation and information...

  • arithmetic encoding
  • channel capacity
    Channel capacity
    In electrical engineering, computer science and information theory, channel capacity is the tightest upper bound on the amount of information that can be reliably transmitted over a communications channel...
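
    As a concrete instance, the capacity of a binary symmetric channel has the closed form C = 1 − H(p). A minimal sketch (our example, not from the article):

      import math

      def binary_entropy(p):
          # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0
          if p in (0.0, 1.0):
              return 0.0
          return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

      def bsc_capacity(p):
          # Capacity of a binary symmetric channel with crossover probability p,
          # in bits per channel use: C = 1 - H(p)
          return 1.0 - binary_entropy(p)

      print(bsc_capacity(0.0))   # 1.0  (noiseless channel)
      print(bsc_capacity(0.11))  # ~0.5 (half a bit survives per use)
      print(bsc_capacity(0.5))   # 0.0  (pure noise; nothing gets through)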

  • Communication Theory of Secrecy Systems
    Communication Theory of Secrecy Systems
    "Communication Theory of Secrecy Systems" is a paper published in 1949 by Claude Shannon discussing cryptography from the viewpoint of information theory. It is one of the foundational treatments of modern cryptography...

  • conditional entropy
    Conditional entropy
    In information theory, the conditional entropy quantifies the remaining entropy of a random variable Y given that the value of another random variable X is known. It is referred to as the entropy of Y conditional on X, and is written H(Y|X)...
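
    A small worked sketch of the defining sum H(Y|X) = −Σ p(x,y) log2 p(y|x), using a made-up joint distribution:

      import math

      # Hypothetical joint distribution p(x, y) over weather and umbrella use
      joint = {('rain', 'umbrella'): 0.3, ('rain', 'none'): 0.1,
               ('sun',  'umbrella'): 0.1, ('sun',  'none'): 0.5}

      def conditional_entropy(joint):
          # H(Y|X) = -sum over (x, y) of p(x, y) * log2( p(x, y) / p(x) )
          px = {}
          for (x, _), p in joint.items():
              px[x] = px.get(x, 0.0) + p
          return -sum(p * math.log2(p / px[x])
                      for (x, _), p in joint.items() if p > 0)

      print(conditional_entropy(joint))  # ~0.71 bits of Y left once X is known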

  • conditional quantum entropy
    Conditional quantum entropy
    The conditional quantum entropy is an entropy measure used in quantum information theory. It is a generalization of the conditional entropy of classical information theory...
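
    One hallmark of the quantum case is that S(A|B) = S(AB) − S(B) can be negative, which never happens classically. A sketch with numpy (the Bell-state example is ours, not from the article):

      import numpy as np

      def von_neumann_entropy(rho):
          # S(rho) = -tr(rho log2 rho), computed from the eigenvalues
          evals = np.linalg.eigvalsh(rho)
          evals = evals[evals > 1e-12]
          return float(-np.sum(evals * np.log2(evals)))

      # Maximally entangled Bell state |Phi+> = (|00> + |11>) / sqrt(2)
      psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
      rho_ab = np.outer(psi, psi)

      # Reduced state of B: trace out subsystem A
      rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

      # S(A|B) = S(AB) - S(B) = 0 - 1 for this state
      print(von_neumann_entropy(rho_ab) - von_neumann_entropy(rho_b))  # -1.0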

  • confusion and diffusion
    Confusion and diffusion
    In cryptography, confusion and diffusion are two properties of the operation of a secure cipher which were identified by Claude Shannon in his paper Communication Theory of Secrecy Systems, published in 1949....
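
    Diffusion is often illustrated by the avalanche effect: a tiny input change should flip about half of the output bits. A rough demo using a hash function in place of a cipher (our choice of illustration, not Shannon's):

      import hashlib

      def bits(data):
          return ''.join(f'{byte:08b}' for byte in data)

      h1 = bits(hashlib.sha256(b'attack at dawn').digest())
      h2 = bits(hashlib.sha256(b'attack at dusk').digest())

      # With good diffusion, roughly half of the 256 output bits differ
      print(sum(a != b for a, b in zip(h1, h2)))  # typically close to 128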

  • cross entropy
    Cross entropy
    In information theory, the cross entropy between two probability distributions measures the average number of bits needed to identify an event from a set of possibilities if a coding scheme based on a given probability distribution q is used, rather than the "true" distribution p. The cross entropy...
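
    A minimal sketch of the definition H(p, q) = −Σ p_i log2 q_i, with made-up distributions:

      import math

      def cross_entropy(p, q):
          # Expected code length when events follow p but the code is built for q
          return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

      p = [0.5, 0.25, 0.25]   # the "true" distribution p
      q = [1/3, 1/3, 1/3]     # the assumed distribution q

      print(cross_entropy(p, p))  # 1.5 bits: coding with the right model costs H(p)
      print(cross_entropy(p, q))  # ~1.585 bits: the penalty for the wrong model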

  • data compression
    Data compression
    In computer science and information theory, data compression, source coding or bit-rate reduction is the process of encoding information using fewer bits than the original representation would use....
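
    A quick illustration with Python's standard zlib module (our example input):

      import zlib

      text = b'abc' * 200                        # highly repetitive input
      packed = zlib.compress(text)

      print(len(text), len(packed))              # 600 bytes shrink to a handful
      assert zlib.decompress(packed) == text     # and decode back exactly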

  • entropy encoding
    Entropy encoding
    In information theory, an entropy encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium...
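
    Huffman coding (listed below) is the classic entropy encoding: it assigns short codewords to frequent symbols. A compact sketch with made-up frequencies:

      import heapq
      from itertools import count

      def huffman_codes(freqs):
          # Repeatedly merge the two least frequent nodes; each merge
          # prepends one bit, so rare symbols end up with long codewords.
          tiebreak = count()
          heap = [(w, next(tiebreak), {sym: ''}) for sym, w in freqs.items()]
          heapq.heapify(heap)
          while len(heap) > 1:
              w1, _, c1 = heapq.heappop(heap)
              w2, _, c2 = heapq.heappop(heap)
              merged = {s: '0' + code for s, code in c1.items()}
              merged.update({s: '1' + code for s, code in c2.items()})
              heapq.heappush(heap, (w1 + w2, next(tiebreak), merged))
          return heap[0][2]

      print(huffman_codes({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}))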

  • Fisher information
    Fisher information
    In mathematical statistics and information theory, the Fisher information is the variance of the score. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior...
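
    A numerical sanity check of "variance of the score" for a Bernoulli(p) model, where the closed form is I(p) = 1 / (p(1 − p)) (the Monte Carlo setup is ours):

      import math, random

      def score(x, p):
          # Derivative in p of the log-likelihood of one Bernoulli(p) draw
          return x / p - (1 - x) / (1 - p)

      p = 0.3
      random.seed(0)
      draws = [1 if random.random() < p else 0 for _ in range(200_000)]

      # The score has mean zero, so its variance is just E[score^2]
      empirical = sum(score(x, p) ** 2 for x in draws) / len(draws)
      print(empirical, 1 / (p * (1 - p)))  # both close to 4.76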

  • Hick's law
    Hick's law
    Hick's law, named after British psychologist William Edmund Hick, or the Hick–Hyman law, describes the time it takes for a person to make a decision as a function of the number of possible choices. The Hick–Hyman law assesses cognitive information capacity in choice reaction experiments...
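
    The usual form of the law is T = b · log2(n + 1), where n is the number of choices and b is a constant fitted from data. A sketch (the value of b is illustrative only):

      import math

      def hick_reaction_time(n_choices, b=0.2):
          # Hick's law: decision time grows logarithmically in the choice count
          return b * math.log2(n_choices + 1)

      for n in (1, 3, 7, 15):
          print(n, round(hick_reaction_time(n), 3))  # each step adds the same 0.2 s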

  • Hirschman uncertainty
  • Huffman encoding
  • information bottleneck method
    Information bottleneck method
    The information bottleneck method is a technique introduced by Naftali Tishby et al. for finding the best tradeoff between accuracy and complexity when summarizing a random variable X, given a joint probability distribution between X and an observed relevant variable Y...
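
    The tradeoff is commonly written as a Lagrangian over a compressed representation T (standard notation, not quoted from the article), in LaTeX:

      \min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y)

    Small β favors aggressive compression of X; large β favors keeping information about Y.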

  • information entropy
    Information entropy
    In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits...
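
    A minimal sketch of the defining formula H(X) = −Σ p_i log2 p_i:

      import math

      def shannon_entropy(probs):
          # Expected information content of one outcome, in bits
          return -sum(p * math.log2(p) for p in probs if p > 0)

      print(shannon_entropy([0.5, 0.5]))   # 1.0 bit   (fair coin)
      print(shannon_entropy([0.9, 0.1]))   # ~0.47     (biased coin: less uncertain)
      print(shannon_entropy([0.25] * 4))   # 2.0 bits  (fair four-sided die)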

  • information theoretic security
    Information theoretic security
    A cryptosystem is information-theoretically secure if its security derives purely from information theory. That is, it is secure even when the adversary has unlimited computing power. The adversary simply does not have enough information to break the security...
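
    The one-time pad is the standard example: with a truly random key as long as the message and never reused, the ciphertext reveals nothing about the plaintext. A sketch:

      import secrets

      message = b'meet me at noon'
      key = secrets.token_bytes(len(message))       # random, used only once

      ciphertext = bytes(m ^ k for m, k in zip(message, key))
      recovered  = bytes(c ^ k for c, k in zip(ciphertext, key))

      # Without the key, every plaintext of this length is equally consistent
      # with the ciphertext, so unlimited computing power does not help.
      assert recovered == message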

  • information theory
    Information theory
    Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. It was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and...

  • joint entropy
  • Kullback–Leibler divergence
  • lossless data compression
    Lossless data compression
    Lossless data compression is a class of data compression algorithms that allows the exact original data to be reconstructed from the compressed data. The term lossless is in contrast to lossy data compression, which only allows an approximation of the original data to be reconstructed, in exchange...

  • negentropy
    Negentropy
    The negentropy, also negative entropy or syntropy, of a living system is the entropy that it exports to keep its own entropy low; it lies at the intersection of entropy and life...

  • principle of maximum entropy
    Principle of maximum entropy
    In Bayesian probability, the principle of maximum entropy is a postulate which states that, subject to known constraints, the probability distribution which best represents the current state of knowledge is the one with largest entropy. Let some testable information about a probability distribution...
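
    With no constraint beyond normalization, the maximum entropy distribution is the uniform one. A quick numerical check against random alternatives (our demo):

      import math, random

      def entropy(p):
          return -sum(x * math.log2(x) for x in p if x > 0)

      random.seed(1)
      trials = ([random.random() for _ in range(4)] for _ in range(10_000))
      best = max(entropy([w / sum(ws) for w in ws]) for ws in trials)

      print(best, entropy([0.25] * 4))  # no random draw beats the uniform 2.0 bits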

  • quantum information science
    Quantum information science
    Quantum information science is an area of study based on the idea that information science depends on quantum effects in physics. It includes theoretical issues in computational models as well as more experimental topics in quantum physics including what can and cannot be done with quantum...

  • range encoding
    Range encoding
    Range encoding is a data compression method defined by G. Nigel N. Martin in a 1979 paper. It is a form of arithmetic coding that was historically of interest for avoiding some patents on particular later-developed arithmetic coding techniques...
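
    Range encoding shares its core idea with arithmetic coding: narrow an interval once per symbol, then emit any number inside the final interval. An exact-fraction sketch of that shared idea (not Martin's integer renormalization; the symbol model is made up):

      from fractions import Fraction

      # Each symbol owns a sub-interval [start, start + size) of [0, 1)
      model = {'a': (Fraction(0),     Fraction(6, 10)),
               'b': (Fraction(6, 10), Fraction(3, 10)),
               'c': (Fraction(9, 10), Fraction(1, 10))}

      def encode(message):
          low, width = Fraction(0), Fraction(1)
          for sym in message:
              start, size = model[sym]
              low, width = low + width * start, width * size
          return low  # low always lies inside the final interval

      def decode(value, length):
          out = []
          for _ in range(length):
              for sym, (start, size) in model.items():
                  if start <= value < start + size:
                      out.append(sym)
                      value = (value - start) / size
                      break
          return ''.join(out)

      code = encode('abac')
      print(code, decode(code, 4))  # round-trips to 'abac'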

  • redundancy
    Redundancy (information theory)
    Redundancy in information theory is the number of bits used to transmit a message minus the number of bits of actual information in the message. Informally, it is the amount of wasted "space" used to transmit certain data...
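
    A small worked example under assumed numbers: a four-symbol source sent with a fixed two-bit code:

      import math

      def entropy(probs):
          return -sum(p * math.log2(p) for p in probs if p > 0)

      probs = [0.7, 0.15, 0.1, 0.05]        # hypothetical symbol probabilities
      bits_used = 2.0                       # fixed-length code: log2(4) bits/symbol
      information = entropy(probs)          # ~1.32 bits/symbol actually conveyed

      print(bits_used - information)        # ~0.68 wasted bits per symbol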

  • Rényi entropy
    Rényi entropy
    In information theory, the Rényi entropy, a generalisation of Shannon entropy, is one of a family of functionals for quantifying the diversity, uncertainty or randomness of a system...
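
    A sketch of the family H_α(X) = log2(Σ p_i^α) / (1 − α), which recovers the Shannon entropy as α → 1:

      import math

      def renyi_entropy(probs, alpha):
          # Defined for alpha >= 0, alpha != 1; alpha tunes how much
          # weight the measure puts on the most probable outcomes
          return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

      p = [0.5, 0.25, 0.125, 0.125]
      print(renyi_entropy(p, 0))      # 2.0: Hartley entropy, counts the support
      print(renyi_entropy(p, 2))      # ~1.54: collision entropy
      print(renyi_entropy(p, 0.999))  # ~1.75: approaching the Shannon entropy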

  • self-information
    Self-information
    In information theory, self-information is a measure of the information content associated with the outcome of a random variable. It is expressed in a unit of information, for example bits, nats, or...
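
    A one-line sketch of I(x) = −log2 p(x):

      import math

      def self_information(p):
          # Rarer outcomes carry more information
          return -math.log2(p)

      print(self_information(0.5))   # 1.0 bit     (a fair coin flip)
      print(self_information(1/6))   # ~2.58 bits  (one face of a fair die)
      print(self_information(0.99))  # ~0.014 bits (an almost-certain event)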

  • Shannon limit
  • Shannon's law
    Shannon's law
    Shannon's law may refer to the Shannon–Hartley theorem, a statement defining the theoretical maximum rate at which error-free digits can be transmitted over a bandwidth-limited channel in the presence of noise...
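
    The theorem's formula is C = B · log2(1 + S/N). A sketch with assumed telephone-line numbers:

      import math

      def shannon_hartley(bandwidth_hz, snr_linear):
          # Maximum error-free bit rate of a bandwidth-limited channel
          # with additive white Gaussian noise
          return bandwidth_hz * math.log2(1 + snr_linear)

      snr = 10 ** (30 / 10)              # 30 dB SNR in linear terms (1000x)
      print(shannon_hartley(3000, snr))  # ~29,902 bits per second on 3 kHz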

  • Shannon's theorem