Prediction by partial matching
Prediction by partial matching (PPM) is an adaptive statistical data compression technique based on context modeling and prediction. PPM models use a set of previous symbols in the uncompressed symbol stream to predict the next symbol in the stream.
Predictions are usually reduced to symbol rankings. The number of previous symbols, n, determines the order of the PPM model, which is denoted PPM(n). Unbounded variants, where the context has no length limit, also exist and are denoted PPM*. If no prediction can be made from all n context symbols, a prediction is attempted with n − 1 symbols. This process is repeated until a match is found or no more symbols remain in the context, at which point a fixed prediction is made. This process is complementary to that followed by dynamic Markov compression (DMC), which builds up from a zero-order model.
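The fallback from the order-n context to progressively shorter contexts can be sketched as follows. This is an illustrative toy model, not a reference implementation; the class name and structure are invented for the example, and escape-probability accounting is omitted for clarity:

```python
from collections import defaultdict

class PPMModel:
    """Toy PPM(n) sketch: count which symbols followed each context and
    fall back to a shorter context whenever the current one has seen
    nothing (no escape-probability accounting, for clarity)."""

    def __init__(self, order):
        self.order = order
        # counts[context][symbol] -> times `symbol` followed `context`
        self.counts = defaultdict(lambda: defaultdict(int))

    def update(self, history, symbol):
        # Record the symbol under every context length, longest down to order 0.
        for k in range(min(self.order, len(history)), -1, -1):
            ctx = tuple(history[-k:]) if k else ()
            self.counts[ctx][symbol] += 1

    def predict(self, history):
        # Try the longest available context; shorten by one symbol on each miss.
        for k in range(min(self.order, len(history)), -1, -1):
            ctx = tuple(history[-k:]) if k else ()
            seen = self.counts.get(ctx)
            if seen:
                total = sum(seen.values())
                return {s: c / total for s, c in seen.items()}
        return None  # nothing seen at any order: a fixed prediction applies here
```

Training this on "abracadabra" and querying the context "ab" yields "r" with probability 1, since "r" is the only symbol ever observed after "ab"; an unseen context like "zz" falls all the way back to the order-0 counts.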
Much of the work in optimizing a PPM model is handling inputs that have not already occurred in the input stream. The obvious way to handle them is to create a "never-seen" symbol which triggers the escape sequence. But what probability should be assigned to a symbol that has never been seen? This is called the zero-frequency problem. One variant uses the Laplace estimator, which assigns the "never-seen" symbol a fixed pseudocount of one. A variant called PPMD increments the pseudocount of the "never-seen" symbol every time it is used. (In other words, PPMD estimates the probability of a new symbol as the ratio of the number of unique symbols to the total number of symbols observed.)
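The escape bookkeeping can be illustrated with a PPMC-style rule, a close relative of the PPMD variant described above, in which the escape symbol's pseudocount equals the number of distinct symbols seen in the context. This is a sketch under that assumption, not a reference implementation:

```python
def context_distribution(counts):
    """Probability estimates for one context under a PPMC-style escape
    rule: the escape symbol gets a pseudocount equal to the number of
    distinct symbols seen, so P(escape) = unique / (total + unique) and
    each seen symbol gets count / (total + unique)."""
    unique = len(counts)
    total = sum(counts.values())
    if total == 0:
        return {}, 1.0  # nothing seen in this context: must escape
    denom = total + unique
    probs = {s: c / denom for s, c in counts.items()}
    escape = unique / denom
    return probs, escape
```

For a context that has seen "a" three times and "b" once, this gives P(a) = 3/6, P(b) = 1/6, and P(escape) = 2/6; the symbol and escape probabilities always sum to one.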
PPM compression implementations vary greatly in other details. The actual symbol selection is usually recorded using arithmetic coding, though it is also possible to use Huffman encoding or even some type of dictionary coding technique. The underlying model used in most PPM algorithms can also be extended to predict multiple symbols. It is also possible to use non-Markov modeling to either replace or supplement Markov modeling. The symbol size is usually static, typically a single byte, which makes generic handling of any file format easy.
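To show how a selected symbol is recorded with arithmetic coding, here is a minimal floating-point sketch: each symbol narrows the interval [low, high) by its probability slice. The `model` callable is a hypothetical stand-in for a PPM predictor, and real coders work with integer ranges and emit bits incrementally to avoid the precision loss this toy version suffers on long inputs:

```python
def arithmetic_encode(symbols, model):
    """Minimal arithmetic-coding sketch: narrow [low, high) by each
    symbol's probability slice. `model(prefix)` returns an ordered list
    of (symbol, probability) pairs for the next position."""
    low, high = 0.0, 1.0
    for i, sym in enumerate(symbols):
        dist = model(symbols[:i])
        cum = 0.0
        for s, p in dist:
            if s == sym:
                width = high - low
                low, high = low + cum * width, low + (cum + p) * width
                break
            cum += p
    return (low + high) / 2  # any number in [low, high) identifies the message
```

With a uniform model over two symbols, encoding "ab" narrows the interval to [0.25, 0.5) and returns 0.375; a more skewed (better-predicting) model would leave a wider interval, i.e. fewer bits to transmit.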
Published research on this family of algorithms can be found as far back as the mid-1980s. Software implementations were not popular until the early 1990s because PPM algorithms require a significant amount of RAM. Recent PPM implementations are among the best-performing lossless compression programs for natural language text. Efforts to improve PPM algorithms led to the PAQ series of data compression algorithms.
A PPM algorithm, rather than being used for compression, is used to increase the efficiency of user input in the alternate input method program Dasher.