Information diagram
An information diagram is a type of Venn diagram used in information theory to illustrate relationships among Shannon's basic measures of information: entropy, joint entropy, conditional entropy and mutual information. An information diagram is a very useful pedagogical tool for teaching and learning about these basic measures of information, but using such a diagram carries some non-trivial implications. For example, Shannon's entropy in the context of an information diagram must be taken as a signed measure. (See the article Information theory and measure theory for more information.)
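
To make these measures concrete, the following minimal Python sketch (the joint distribution p(x, y) is an arbitrary illustrative choice, not taken from the article) computes them for two binary random variables and checks the identities that can be read directly off a two-variable information diagram, such as I(X;Y) = H(X) + H(Y) - H(X,Y).

    from math import log2

    # Hypothetical joint distribution p(x, y) over two binary variables.
    p_xy = {(0, 0): 0.4, (0, 1): 0.1,
            (1, 0): 0.1, (1, 1): 0.4}

    def entropy(dist):
        """Shannon entropy in bits of a distribution {outcome: probability}."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    # Marginal distributions p(x) and p(y).
    p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
    p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

    H_X  = entropy(p_x)    # H(X): one circle of the diagram
    H_Y  = entropy(p_y)    # H(Y): the other circle
    H_XY = entropy(p_xy)   # H(X,Y): the union of the two circles

    I_XY = H_X + H_Y - H_XY    # mutual information I(X;Y): the overlap
    H_X_given_Y = H_XY - H_Y   # conditional entropy H(X|Y): circle minus overlap

    print(f"H(X) = {H_X:.4f}, H(Y) = {H_Y:.4f}, H(X,Y) = {H_XY:.4f}")
    print(f"I(X;Y) = {I_XY:.4f}, H(X|Y) = {H_X_given_Y:.4f}")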
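
The signed-measure caveat shows up with three variables. In the hypothetical sketch below, X and Y are independent fair bits and Z = X XOR Y; the central region of the three-variable diagram, computed by inclusion-exclusion, then has measure -1 bit, so the "area" assigned to that region is genuinely negative.

    from itertools import product
    from math import log2

    def entropy(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def marginal(dist, idxs):
        """Marginal distribution over the coordinates listed in idxs."""
        out = {}
        for outcome, prob in dist.items():
            key = tuple(outcome[i] for i in idxs)
            out[key] = out.get(key, 0.0) + prob
        return out

    # Joint distribution of (X, Y, Z): X, Y independent fair bits, Z = X XOR Y.
    p = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

    H = lambda *idxs: entropy(marginal(p, idxs))

    # Central region of the diagram by inclusion-exclusion:
    # I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z)
    I_XYZ = (H(0) + H(1) + H(2)
             - H(0, 1) - H(0, 2) - H(1, 2)
             + H(0, 1, 2))

    print(f"I(X;Y;Z) = {I_XYZ:.4f} bits")  # prints -1.0000: a negative region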

Random variables X, Y, and Z are said to form a Markov chain (in that order) if Z and X are independent given Y. In that case Y contains all the information about X that is relevant to Z, so knowing X adds nothing to one's knowledge of Z when Y is given.
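
As a numerical illustration, the sketch below (the channel flip probabilities 0.1 and 0.2 are hypothetical choices) builds the joint distribution of a chain X -> Y -> Z and verifies that the conditional mutual information I(X;Z|Y) is zero, which is the conditional-independence condition just stated; it also exhibits the data-processing inequality I(X;Y) >= I(X;Z).

    from math import log2

    def entropy(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def marginal(dist, idxs):
        out = {}
        for outcome, prob in dist.items():
            key = tuple(outcome[i] for i in idxs)
            out[key] = out.get(key, 0.0) + prob
        return out

    # Markov chain X -> Y -> Z: X is a fair bit, Y flips X with probability 0.1,
    # and Z flips Y with probability 0.2, so p(x, y, z) = p(x) p(y|x) p(z|y).
    flip1, flip2 = 0.1, 0.2
    p = {}
    for x in (0, 1):
        for y in (0, 1):
            for z in (0, 1):
                p_y_given_x = 1 - flip1 if y == x else flip1
                p_z_given_y = 1 - flip2 if z == y else flip2
                p[(x, y, z)] = 0.5 * p_y_given_x * p_z_given_y

    H = lambda *idxs: entropy(marginal(p, idxs))

    # I(X;Z|Y) = H(X,Y) + H(Y,Z) - H(X,Y,Z) - H(Y): zero for a Markov chain.
    I_XZ_given_Y = H(0, 1) + H(1, 2) - H(0, 1, 2) - H(1)
    I_XY = H(0) + H(1) - H(0, 1)   # I(X;Y)
    I_XZ = H(0) + H(2) - H(0, 2)   # I(X;Z), bounded above by I(X;Y)

    print(f"I(X;Z|Y) = {I_XZ_given_Y:.6f}")   # ~0 up to rounding
    print(f"I(X;Y) = {I_XY:.4f} >= I(X;Z) = {I_XZ:.4f}")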