Pinsker's inequality
In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that relates the Kullback-Leibler divergence to the total variation distance. It states that if P and Q are two probability distributions, then

$$\delta(P,Q) \le \sqrt{\tfrac{1}{2} D(P \| Q)}$$

where D(P || Q) is the Kullback-Leibler divergence in nats and

$$\delta(P,Q) = \sup \{\, |P(A) - Q(A)| : A \text{ is a measurable event} \,\}$$

is the total variation distance.
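
As a quick numerical illustration (not part of the original article), the inequality can be checked for two small discrete distributions. For finite alphabets the total variation distance sup_A |P(A) - Q(A)| equals half the L1 distance between the probability vectors; the distributions below are arbitrary example values:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance: sup_A |P(A) - Q(A)| = (1/2) * sum_i |p_i - q_i|."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Two example distributions on a three-letter alphabet (illustrative values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d = kl_divergence(p, q)
delta = total_variation(p, q)

# Pinsker's inequality: delta(P, Q) <= sqrt(D(P || Q) / 2)
assert delta <= math.sqrt(d / 2)
print(f"delta = {delta:.4f}, sqrt(D/2) = {math.sqrt(d / 2):.4f}")
```

Note that the bound holds for every pair of distributions, but it is only informative when D(P || Q) <= 2, since the total variation distance never exceeds 1.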
The source of this article is wikipedia, the free encyclopedia. The text of this article is licensed under the GFDL.