List of probability topics
This is a list of probability topics, by Wikipedia page.
It overlaps with the (alphabetical) list of statistical topics. There are also the topic outline of probability, the catalog of articles in probability theory, the list of probabilists, and the list of statisticians.

Catalog of articles in probability theory
This page lists articles related to probability theory. In particular, it lists many articles corresponding to specific probability distributions. Such articles are marked there by a code which refers to the number of random variables involved and the type of the distribution. For example ...

General aspects

  • Probability
    Probability
    Probability is ordinarily used to describe an attitude of mind towards some proposition of whose truth we are not certain. The proposition of interest is usually of the form "Will a specific event occur?" The attitude of mind is of the form "How certain are we that the event will occur?" The...

  • Randomness
    Randomness
    Randomness has somewhat differing meanings as used in various fields. It also has common meanings which are connected to the notion of predictability of events....

    • Pseudorandomness
    Pseudorandomness
    A pseudorandom process is a process that appears to be random but is not. Pseudorandom sequences typically exhibit statistical randomness while being generated by an entirely deterministic causal process...

    • Quasirandom
  • Randomization
    Randomization
    Randomization is the process of making something random; this means generating a random permutation of a sequence, or selecting a random sample of a population....

    • Hardware random number generator
    Hardware random number generator
    In computing, a hardware random number generator is an apparatus that generates random numbers from a physical process. Such devices are often based on microscopic phenomena that generate a low-level, statistically random "noise" signal, such as thermal noise or the photoelectric effect or other...
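
The randomization operations named above (generating a random permutation, selecting a random sample) can be sketched with Python's standard library; the seed and the ten-element population are arbitrary choices for the illustration:

```python
import random

rng = random.Random(42)  # seeded so the sketch is reproducible
population = list(range(10))

# Generating a random permutation of a sequence
permutation = population[:]
rng.shuffle(permutation)

# Selecting a random sample of a population (here 3 items, without replacement)
sample = rng.sample(population, k=3)
```

A hardware random number generator would replace the seeded pseudorandom source with physical entropy, but the interface is the same.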

  • Random number generator
  • Random sequence
    Random sequence
    The concept of a random sequence is essential in probability theory and statistics. The concept generally relies on the notion of a sequence of random variables and many statistical discussions begin with the words "let X1,...,Xn be independent random variables...". Yet as D. H. Lehmer stated in...

  • Coin flipping
    Coin flipping
    Coin flipping or coin tossing or heads or tails is the practice of throwing a coin in the air to choose between two alternatives, sometimes to resolve a dispute between two parties...

    • Checking if a coin is biased
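
Checking if a coin is biased can be illustrated by simulation: under the fair-coin hypothesis the number of heads in n flips is Binomial(n, 1/2), with mean n/2 and standard deviation sqrt(n)/2. The bias p = 0.55 and the 3-sigma cutoff below are illustrative choices, not a substitute for a proper hypothesis test:

```python
import random

def flip_count(n, p, seed=0):
    """Simulate n flips of a coin that lands heads with probability p."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n))

n = 10_000
heads = flip_count(n, p=0.55)

# Standardized deviation from fairness under the null hypothesis p = 1/2
z = (heads - n / 2) / ((n ** 0.5) / 2)
biased = abs(z) > 3  # crude 3-sigma rule; a real test would compute a p-value
```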
  • Uncertainty
    Uncertainty
    Uncertainty is a term used in subtly different ways in a number of fields, including physics, philosophy, statistics, economics, finance, insurance, psychology, sociology, engineering, and information science...

  • Statistical dispersion
    Statistical dispersion
    In statistics, statistical dispersion is variability or spread in a variable or a probability distribution...

  • Observational error
    Observational error
    Observational error is the difference between a measured value of a quantity and its true value. In statistics, an error is not a "mistake". Variability is an inherent part of things being measured and of the measurement process....

  • Equiprobable
    Equiprobable
    Equiprobability is a philosophical concept in probability theory that allows one to assign equal probabilities to outcomes when they are judged to be equipossible or to be "equally likely" in some sense...

    • Equipossible
      Equipossible
      Equipossibility is a philosophical concept in possibility theory that is a precursor to the notion of equiprobability in probability theory. It is used to distinguish what can occur in a probability experiment...

  • Average
    Average
    In mathematics, an average, or central tendency, of a data set is a measure of the "middle" value of the data set. Average is one form of central tendency. Not all central tendencies should be considered definitions of average....

  • Probability interpretations
    Probability interpretations
    The word probability has been used in a variety of ways since it was first coined in relation to games of chance. Does probability measure the real, physical tendency of something to occur, or is it just a measure of how strongly one believes it will occur? In answering such questions, we...

  • Markovian
  • Statistical regularity
    Statistical regularity
    Statistical regularity is a notion in statistics and probability theory that random events exhibit regularity when repeated enough times or that enough sufficiently similar random events exhibit regularity...

  • Central tendency
    Central tendency
    In statistics, the term central tendency relates to the way in which quantitative data is clustered around some value. A measure of central tendency is a way of specifying a central value...

  • Bean machine
    Bean machine
    The bean machine, also known as the quincunx or Galton box, is a device invented by Sir Francis Galton to demonstrate the central limit theorem, in particular that the normal distribution approximates the binomial distribution....
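
The bean machine is easy to simulate: each ball bounces left or right at each row of pegs, so its final bin is a Binomial(rows, 1/2) draw, which the normal distribution approximates for many rows. The ball count, row count, and seed below are arbitrary:

```python
import random

def bean_machine(balls, rows, seed=1):
    """Drop `balls` balls through `rows` rows of pegs; return bin counts.
    A ball's bin index is the number of rightward bounces, so the counts
    follow a Binomial(rows, 1/2) distribution (approximately normal)."""
    rng = random.Random(seed)
    bins = [0] * (rows + 1)
    for _ in range(balls):
        position = sum(rng.random() < 0.5 for _ in range(rows))
        bins[position] += 1
    return bins

bins = bean_machine(balls=5000, rows=10)
```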

  • Relative frequency
  • Frequency probability
    Frequency probability
    Frequency probability is the interpretation of probability that defines an event's probability as the limit of its relative frequency in a large number of trials. The development of the frequentist account was motivated by the problems and paradoxes of the previously dominant viewpoint, the...

  • Maximum likelihood
    Maximum likelihood
    In statistics, maximum-likelihood estimation is a method of estimating the parameters of a statistical model. When applied to a data set and given a statistical model, maximum-likelihood estimation provides estimates for the model's parameters....
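
As a minimal sketch of maximum-likelihood estimation, consider i.i.d. Bernoulli data, where the maximizer has a closed form (the sample mean); the data below are made up for the example, and the grid search is only a sanity check on the closed form:

```python
import math

def bernoulli_log_likelihood(p, data):
    """Log-likelihood of i.i.d. Bernoulli observations under parameter p."""
    return sum(math.log(p) if x else math.log(1 - p) for x in data)

data = [1, 0, 1, 1, 0, 1, 1, 1]  # illustrative sample

# For the Bernoulli model the MLE is the sample mean.
p_hat = sum(data) / len(data)

# Sanity check: the closed-form estimate beats every nearby grid candidate.
best = max((p / 100 for p in range(1, 100)),
           key=lambda p: bernoulli_log_likelihood(p, data))
```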

  • Bayesian probability
    Bayesian probability
    Bayesian probability is one of the different interpretations of the concept of probability and belongs to the category of evidential probabilities. The Bayesian interpretation of probability can be seen as an extension of logic that enables reasoning with propositions, whose truth or falsity is...

  • Principle of indifference
    Principle of indifference
    The principle of indifference is a rule for assigning epistemic probabilities. Suppose that there are n > 1 mutually exclusive and collectively exhaustive possibilities....

  • Cox's theorem
    Cox's theorem
    Cox's theorem, named after the physicist Richard Threlkeld Cox, is a derivation of the laws of probability theory from a certain set of postulates. This derivation justifies the so-called "logical" interpretation of probability. As the laws of probability derived by Cox's theorem are applicable to...

  • Principle of maximum entropy
    Principle of maximum entropy
    In Bayesian probability, the principle of maximum entropy is a postulate which states that, subject to known constraints, the probability distribution which best represents the current state of knowledge is the one with largest entropy. Let some testable information about a probability distribution...

  • Information entropy
    Information entropy
    In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits...
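
The Shannon entropy of a discrete distribution is direct to compute; the example distributions below are chosen to show the extremes (a fair coin is maximally uncertain at 1 bit, a sure outcome carries 0 bits):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution (0 * log 0 taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = shannon_entropy([0.5, 0.5])    # 1 bit: maximal uncertainty
biased_coin = shannon_entropy([0.9, 0.1])  # strictly less than 1 bit
certain = shannon_entropy([1.0])           # 0 bits: no uncertainty
```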

  • Urn problem
    Urn problem
    In probability and statistics, an urn problem is an idealized mental exercise in which some objects of real interest are represented as colored balls in an urn or other container....
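
A classic urn computation can be done exactly with rational arithmetic; the urn composition (3 red, 7 blue) and the question (both of 2 draws red, without replacement) are made up for the example:

```python
from fractions import Fraction

red, blue = 3, 7
total = red + blue

# Draw 2 balls without replacement; exact probability that both are red:
# (3/10 for the first draw) * (2/9 for the second, given the first was red).
p_both_red = Fraction(red, total) * Fraction(red - 1, total - 1)
```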

  • Extractor
  • Aleatoric, aleatoric music
    Aleatoric music
    Aleatoric music is music in which some element of the composition is left to chance, and/or some primary element of a composed work's realization is left to the determination of its performer...

  • Free probability
    Free probability
    Free probability is a mathematical theory that studies non-commutative random variables. The "freeness" or free independence property is the analogue of the classical notion of independence, and it is connected with free products....

  • Exotic probability
    Exotic probability
    Exotic probability is a branch of probability theory that deals with probabilities which are outside the normal range of [0, 1]. The most common author of papers on exotic probability theory is Saul Youssef...

  • Schrödinger method
    Schrödinger method
    In combinatorial mathematics and probability theory, the Schrödinger method, named after the Austrian physicist Erwin Schrödinger, is used to solve some problems of distribution and occupancy. Suppose X_1, \dots, X_n ...

  • Empirical measure
    Empirical measure
    In probability theory, an empirical measure is a random measure arising from a particular realization of a sequence of random variables. The precise definition is found below. Empirical measures are relevant to mathematical statistics....

  • Glivenko–Cantelli theorem
  • Zero-one law
    Zero-one law
    In probability theory, a zero-one law is a result that states that an event must have probability 0 or 1 and no intermediate value. Sometimes, the statement is that the limit of certain probabilities must be 0 or 1. It may refer to:...

    • Kolmogorov's zero-one law
      Kolmogorov's zero-one law
      In probability theory, Kolmogorov's zero-one law, named in honor of Andrey Nikolaevich Kolmogorov, specifies that a certain type of event, called a tail event, will either almost surely happen or almost surely not happen; that is, the probability of such an event occurring is zero or one....

    • Hewitt–Savage zero-one law
  • Law of Truly Large Numbers
    Law of Truly Large Numbers
    The law of truly large numbers, attributed to Persi Diaconis and Frederick Mosteller, states that with a sample size large enough, any outrageous thing is likely to happen. Because we never find it notable when likely events occur, we highlight unlikely events and notice them more...

    • Littlewood's law
      Littlewood's law
      Littlewood's Law states that individuals can expect a "miracle" to happen to them at the rate of about one per month. The law was framed by Cambridge University professor J. E...

    • Infinite monkey theorem
      Infinite monkey theorem
      The infinite monkey theorem states that a monkey hitting keys at random on a typewriter keyboard for an infinite amount of time will almost surely type a given text, such as the complete works of William Shakespeare....
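
The quantitative heart of the theorem is elementary: if each block of k random keystrokes independently matches a target word with probability p = 26^(-k), the chance of at least one match in n blocks is 1 - (1 - p)^n, which tends to 1 as n grows. The 3-letter target and block counts below are arbitrary:

```python
def chance_of_typing(word_length, blocks):
    """Probability that at least one of `blocks` independent blocks of
    `word_length` uniform lowercase letters matches a fixed target word."""
    p = 26.0 ** (-word_length)
    return 1.0 - (1.0 - p) ** blocks

short_run = chance_of_typing(3, 10_000)       # noticeably below 1
long_run = chance_of_typing(3, 10_000_000)    # essentially certain
```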

  • Littlewood–Offord problem
  • Inclusion-exclusion principle
    Inclusion-exclusion principle
    In combinatorics, the inclusion–exclusion principle is an equation relating the sizes of two sets and their union...
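
The two- and three-set cases of inclusion-exclusion can be checked directly on small explicit sets (the sets below are arbitrary):

```python
# |A ∪ B| = |A| + |B| - |A ∩ B|
A = {1, 2, 3, 4}
B = {3, 4, 5}
lhs = len(A | B)
rhs = len(A) + len(B) - len(A & B)

# Three sets: add the pairwise intersections back out, then the triple back in.
C = {4, 5, 6}
lhs3 = len(A | B | C)
rhs3 = (len(A) + len(B) + len(C)
        - len(A & B) - len(A & C) - len(B & C)
        + len(A & B & C))
```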

  • Impossible event
    Impossible event
    In the mathematics of probability, an impossible event is an event A with probability zero, or Pr(A) = 0. See in particular almost surely. An impossible event is not the same as the stronger concept of logical impossibility...

  • Information geometry
    Information geometry
    Information geometry is a branch of mathematics that applies the techniques of differential geometry to the field of probability theory. It derives its name from the fact that the Fisher information is used as the Riemannian metric when considering the geometry of probability distribution families...

  • Talagrand's concentration inequality
    Talagrand's concentration inequality
    In probability theory, Talagrand's concentration inequality, is an isoperimetric-type inequality for product probability spaces. It was first proved by the French mathematician Michel Talagrand...


Foundations of probability theory

  • Probability theory
    Probability theory
    Probability theory is the branch of mathematics concerned with analysis of random phenomena. The central objects of probability theory are random variables, stochastic processes, and events: mathematical abstractions of non-deterministic events or measured quantities that may either be single...

  • Probability space
    Probability space
    In probability theory, a probability space or a probability triple is a mathematical construct that models a real-world process consisting of states that occur randomly. A probability space is constructed with a specific kind of situation or experiment in mind...

    • Sample space
    • Standard probability space
      Standard probability space
      In probability theory, a standard probability space is a probability space satisfying certain assumptions introduced by Vladimir Rokhlin in 1940...

    • Random element
      Random element
      In probability theory, random element is a generalization of the concept of random variable to more complicated spaces than the simple real line...

      • Random compact set
        Random compact set
        In mathematics, a random compact set is essentially a compact set-valued random variable. Random compact sets are useful in the study of attractors for random dynamical systems....

    • Dynkin system
      Dynkin system
      A Dynkin system, named after Eugene Dynkin, is a collection of subsets of another universal set \Omega satisfying a set of axioms weaker than those of a σ-algebra. Dynkin systems are sometimes referred to as λ-systems or d-systems...

  • Probability axioms
    Probability axioms
    In probability theory, the probability P of some event E, denoted P(E), is usually defined in such a way that P satisfies the Kolmogorov axioms, named after Andrey Kolmogorov, which are described below....

  • Normalizing constant
    Normalizing constant
    The concept of a normalizing constant arises in probability theory and a variety of other areas of mathematics. In probability theory, a normalizing constant is a constant by which an everywhere non-negative function must be multiplied so the area under its graph is 1, e.g.,...

  • Event (probability theory)
    Event (probability theory)
    In probability theory, an event is a set of outcomes to which a probability is assigned. Typically, when the sample space is finite, any subset of the sample space is an event...

    • Complementary event
      Complementary event
      In probability theory, the complement of any event A is the event [not A], i.e. the event that A does not occur. The event A and its complement [not A] are mutually exclusive and exhaustive. Generally, there is only one event B such that A and B are both mutually exclusive and...

  • Elementary event
    Elementary event
    In probability theory, an elementary event or atomic event is a singleton of a sample space. An outcome is an element of a sample space. An elementary event is a set containing exactly one outcome, not the outcome itself...

  • Mutually exclusive
    Mutually exclusive
    In layman's terms, two events are mutually exclusive if they cannot occur at the same time. An example is tossing a coin once, which can result in either heads or tails, but not both....

  • Boole's inequality
    Boole's inequality
    In probability theory, Boole's inequality, also known as the union bound, says that for any finite or countable set of events, the probability that at least one of the events happens is no greater than the sum of the probabilities of the individual events...
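
Boole's inequality can be verified exactly on a small explicit probability space; the two events below (two fair dice, "first die shows 6" and "sum equals 7") are an arbitrary illustration:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))
prob = Fraction(1, len(outcomes))  # uniform probability of each outcome

A = {o for o in outcomes if o[0] == 6}
B = {o for o in outcomes if sum(o) == 7}

p_union = len(A | B) * prob                 # P(A ∪ B)
bound = len(A) * prob + len(B) * prob       # P(A) + P(B), the union bound
```

The bound is not tight whenever the events overlap, as they do here at the outcome (6, 1).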

  • Probability density function
    Probability density function
    In probability theory, a probability density function , or density of a continuous random variable is a function that describes the relative likelihood for this random variable to occur at a given point. The probability for the random variable to fall within a particular region is given by the...

  • Cumulative distribution function
    Cumulative distribution function
    In probability theory and statistics, the cumulative distribution function , or just distribution function, describes the probability that a real-valued random variable X with a given probability distribution will be found at a value less than or equal to x. Intuitively, it is the "area so far"...

  • Law of total cumulance
    Law of total cumulance
    In probability theory and mathematical statistics, the law of total cumulance is a generalization to cumulants of the law of total probability, the law of total expectation, and the law of total variance. It has applications in the analysis of time series...

  • Law of total expectation
    Law of total expectation
    The proposition in probability theory known as the law of total expectation, the law of iterated expectations, the tower rule, or the smoothing theorem, among other names, states that if X is an integrable random variable and Y is any random variable on the same probability space, then E(X) = E(E(X | Y))...

  • Law of total probability
    Law of total probability
    In probability theory, the law of total probability is a fundamental rule relating marginal probabilities to conditional probabilities. It is the proposition that if \{B_n : n = 1, 2, 3, \dots\} is a finite or countably infinite partition of a sample space, then for any event A, P(A) = \sum_n P(A \mid B_n)\,P(B_n)...

  • Law of total variance
    Law of total variance
    In probability theory, the law of total variance or variance decomposition formula states that if X and Y are random variables on the same probability space, and the variance of Y is finite, then...
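
Both the law of total expectation, E(X) = E(E(X|Y)), and the law of total variance, Var(X) = E(Var(X|Y)) + Var(E(X|Y)), can be checked exactly on a toy discrete pair; the joint distribution below is made up for the example (Y is a fair coin; given Y=0, X is uniform on {0,1}; given Y=1, X = 2):

```python
from fractions import Fraction

joint = {
    (0, 0): Fraction(1, 4), (1, 0): Fraction(1, 4),  # Y=0 branch
    (2, 1): Fraction(1, 2),                          # Y=1 branch
}

def E(f):
    return sum(p * f(x, y) for (x, y), p in joint.items())

EX = E(lambda x, y: x)
var_X = E(lambda x, y: x * x) - EX ** 2

# Marginal of Y, then conditional first and second moments of X given Y.
p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0) + p
ex_y = {y0: sum(p * x for (x, y), p in joint.items() if y == y0) / p_y[y0]
        for y0 in p_y}
ex2_y = {y0: sum(p * x * x for (x, y), p in joint.items() if y == y0) / p_y[y0]
         for y0 in p_y}

tower = sum(p_y[y0] * ex_y[y0] for y0 in p_y)                        # E[E[X|Y]]
var_within = sum(p_y[y0] * (ex2_y[y0] - ex_y[y0] ** 2) for y0 in p_y)  # E[Var(X|Y)]
var_between = sum(p_y[y0] * ex_y[y0] ** 2 for y0 in p_y) - tower ** 2  # Var(E[X|Y])
```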

  • Almost surely
    Almost surely
    In probability theory, one says that an event happens almost surely if it happens with probability one. The concept is analogous to the concept of "almost everywhere" in measure theory...

  • Cox's theorem
    Cox's theorem
    Cox's theorem, named after the physicist Richard Threlkeld Cox, is a derivation of the laws of probability theory from a certain set of postulates. This derivation justifies the so-called "logical" interpretation of probability. As the laws of probability derived by Cox's theorem are applicable to...

  • Bayesianism
  • Prior probability
    Prior probability
    In Bayesian statistical inference, a prior probability distribution, often called simply the prior, of an uncertain quantity p is the probability distribution that would express one's uncertainty about p before the "data"...

  • Posterior probability
    Posterior probability
    In Bayesian statistics, the posterior probability of a random event or an uncertain proposition is the conditional probability that is assigned after the relevant evidence is taken into account...

  • Borel's paradox
    Borel's paradox
    In probability theory, the Borel–Kolmogorov paradox is a paradox relating to conditional probability with respect to an event of probability zero...

  • Bertrand's paradox
    Bertrand's paradox (probability)
    The Bertrand paradox is a problem within the classical interpretation of probability theory. Joseph Bertrand introduced it in his work Calcul des probabilités as an example to show that probabilities may not be well defined if the mechanism or method that produces the random variable is not...

  • Coherence (philosophical gambling strategy)
    Coherence (philosophical gambling strategy)
    In a thought experiment proposed by the Italian probabilist Bruno de Finetti in order to justify Bayesian probability, an array of wagers is coherent precisely if it does not expose the wagerer to certain loss regardless of the outcomes of events on which he is wagering, even if his opponent makes...

  • Dutch book
    Dutch book
    In gambling a Dutch book or lock is a set of odds and bets which guarantees a profit, regardless of the outcome of the gamble. It is associated with probabilities implied by the odds not being coherent....

  • Algebra of random variables
    Algebra of random variables
    In the algebraic axiomatization of probability theory, the primary concept is not that of probability of an event, but rather that of a random variable. Probability distributions are determined by assigning an expectation to each random variable...

  • Belief propagation
    Belief propagation
    Belief propagation is a message passing algorithm for performing inference on graphical models, such as Bayesian networks and Markov random fields. It calculates the marginal distribution for each unobserved node, conditional on any observed nodes...

  • Transferable belief model
    Transferable belief model
    The transferable belief model is an elaboration on the Dempster-Shafer theory of evidence. Consider the following classical problem of information fusion: a patient has an illness that can be caused by three different factors A, B and C...

  • Dempster-Shafer theory
    Dempster-Shafer theory
    The Dempster–Shafer theory is a mathematical theory of evidence. It allows one to combine evidence from different sources and arrive at a degree of belief that takes into account all the available evidence. The theory was first developed by Arthur P...

  • Possibility theory
    Possibility theory
    Possibility theory is a mathematical theory for dealing with certain types of uncertainty and is an alternative to probability theory. Professor Lotfi Zadeh first introduced possibility theory in 1978 as an extension of his theory of fuzzy sets and fuzzy logic. D. Dubois and H. Prade further...


Random variables
Random variable
In probability and statistics, a random variable or stochastic variable is, roughly speaking, a variable whose value results from a measurement on some type of random process. Formally, it is a measurable function from a probability space, typically to the real numbers...

  • Discrete random variable
    • Probability mass function
      Probability mass function
      In probability theory and statistics, a probability mass function is a function that gives the probability that a discrete random variable is exactly equal to some value...

  • Constant random variable
  • Expected value
    Expected value
    In probability theory, the expected value of a random variable is the weighted average of all possible values that this random variable can take on...

    • Jensen's inequality
      Jensen's inequality
      In mathematics, Jensen's inequality, named after the Danish mathematician Johan Jensen, relates the value of a convex function of an integral to the integral of the convex function. It was proved by Jensen in 1906. Given its generality, the inequality appears in many forms depending on the context,...

  • Variance
    Variance
    In probability theory and statistics, the variance is a measure of how far a set of numbers is spread out. It is one of several descriptors of a probability distribution, describing how far the numbers lie from the mean. In particular, the variance is one of the moments of a distribution...

    • Standard deviation
      Standard deviation
      Standard deviation is a widely used measure of variability or diversity used in statistics and probability theory. It shows how much variation or "dispersion" there is from the average...

    • Geometric standard deviation
      Geometric standard deviation
      In probability theory and statistics, the geometric standard deviation describes how spread out are a set of numbers whose preferred average is the geometric mean...

  • Multivariate random variable
    Multivariate random variable
    In mathematics, probability, and statistics, a multivariate random variable or random vector is a list of mathematical variables each of whose values is unknown, either because the value has not yet occurred or because there is imperfect knowledge of its value.More formally, a multivariate random...

    • Joint probability distribution
    • Marginal distribution
      Marginal distribution
      In probability theory and statistics, the marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset. The term marginal variable is used to refer to those variables in the subset of variables being retained...

    • Kirkwood approximation
      Kirkwood approximation
      The Kirkwood superposition approximation was introduced by Matsuda as a means of representing a discrete probability distribution. The name apparently refers to a 1942 paper by John G. Kirkwood...

  • Independent and identically distributed random variables
  • Statistical independence
    Statistical independence
    In probability theory, to say that two events are independent intuitively means that the occurrence of one event makes it neither more nor less probable that the other occurs...

    • Conditional independence
      Conditional independence
      In probability theory, two events R and B are conditionally independent given a third event Y precisely if the occurrence or non-occurrence of R and the occurrence or non-occurrence of B are independent events in their conditional probability distribution given Y...

    • Pairwise independence
      Pairwise independence
      In probability theory, a pairwise independent collection of random variables is a set of random variables any two of which are independent. Any collection of mutually independent random variables is pairwise independent, but some pairwise independent collections are not mutually independent...
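
The classic witness that pairwise independence is strictly weaker than mutual independence: take X and Y independent fair bits and Z = X XOR Y. Any two of (X, Y, Z) are independent, but Z is determined by the other two. This can be verified exactly:

```python
from itertools import product
from fractions import Fraction

# Sample space: the four equally likely (x, y) pairs, with z = x XOR y.
space = [(x, y, x ^ y) for x, y in product((0, 1), repeat=2)]
p = Fraction(1, 4)

def prob(pred):
    return sum(p for o in space if pred(o))

# Pairwise: P(X=1, Z=1) equals P(X=1) * P(Z=1)  (likewise for the other pairs).
pairwise = (prob(lambda o: o[0] == 1 and o[2] == 1)
            == prob(lambda o: o[0] == 1) * prob(lambda o: o[2] == 1))

# Mutual independence fails: P(X=1, Y=1, Z=1) is 0, not 1/8.
mutual = (prob(lambda o: o[0] == 1 and o[1] == 1 and o[2] == 1)
          == Fraction(1, 8))
```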

    • Covariance
      Covariance
      In probability theory and statistics, covariance is a measure of how much two variables change together. Variance is a special case of the covariance when the two variables are identical....

    • Covariance matrix
      Covariance matrix
      In probability theory and statistics, a covariance matrix is a matrix whose element in the i, j position is the covariance between the i th and j th elements of a random vector...

    • De Finetti's theorem
      De Finetti's theorem
      In probability theory, de Finetti's theorem explains why exchangeable observations are conditionally independent given some latent variable to which an epistemic probability distribution would then be assigned...

  • Correlation
    Correlation
    In statistics, dependence refers to any statistical relationship between two random variables or two sets of data. Correlation refers to any of a broad class of statistical relationships involving dependence....

    • Uncorrelated
      Uncorrelated
      In probability theory and statistics, two real-valued random variables are said to be uncorrelated if their covariance is zero. Uncorrelatedness is by definition pairwise; i.e...

    • Correlation function
      Correlation function
      A correlation function is the correlation between random variables at two different points in space or time, usually as a function of the spatial or temporal distance between the points...

  • Canonical correlation
    Canonical correlation
    In statistics, canonical correlation analysis, introduced by Harold Hotelling, is a way of making sense of cross-covariance matrices. If we have two sets of variables, x_1, \dots, x_n and y_1, \dots, y_m, and there are correlations among the variables, then canonical correlation analysis will...

  • Convergence of random variables
    Convergence of random variables
    In probability theory, there exist several different notions of convergence of random variables. The convergence of sequences of random variables to some limit random variable is an important concept in probability theory, and its applications to statistics and stochastic processes...

    • Weak convergence of measures
      • Helly–Bray theorem
        Helly–Bray theorem
        In probability theory, the Helly–Bray theorem relates the weak convergence of cumulative distribution functions to the convergence of expectations of certain measurable functions. It is named after Eduard Helly and Hubert Evelyn Bray....

      • Slutsky's theorem
        Slutsky's theorem
        In probability theory, Slutsky’s theorem extends some properties of algebraic operations on convergent sequences of real numbers to sequences of random variables.The theorem was named after Eugen Slutsky. Slutsky’s theorem is also attributed to Harald Cramér....

    • Skorokhod's representation theorem
      Skorokhod's representation theorem
      In mathematics and statistics, Skorokhod's representation theorem is a result that shows that a weakly convergent sequence of probability measures whose limit measure is sufficiently well-behaved can be represented as the distribution/law of a pointwise convergent sequence of random variables...

    • Lévy's continuity theorem
      Lévy's continuity theorem
      In probability theory, the Lévy’s continuity theorem, named after the French mathematician Paul Lévy, connects convergence in distribution of the sequence of random variables with pointwise convergence of their characteristic functions...

    • Uniform integrability
  • Markov's inequality
    Markov's inequality
    In probability theory, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant...

  • Chebyshev's inequality
    Chebyshev's inequality
    In probability theory, Chebyshev's inequality guarantees that in any data sample or probability distribution, "nearly all" values are close to the mean — the precise statement being that no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean...
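
An empirical check of Chebyshev's inequality: for any distribution with mean mu and standard deviation sigma, P(|X - mu| >= k*sigma) <= 1/k². The exponential distribution (which has mu = sigma = 1), the sample size, the seed, and k = 2 are all arbitrary choices for the sketch:

```python
import random

rng = random.Random(7)
samples = [rng.expovariate(1.0) for _ in range(100_000)]  # Exp(1): mu = sigma = 1

mu, sigma, k = 1.0, 1.0, 2.0
tail_fraction = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
chebyshev_bound = 1 / k ** 2  # = 0.25
```

For this distribution the true tail probability is about 0.05, so the bound holds with plenty of slack; Chebyshev trades tightness for complete generality.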

  • Chernoff bound
    Chernoff bound
    In probability theory, the Chernoff bound, named after Herman Chernoff, gives exponentially decreasing bounds on tail distributions of sums of independent random variables...

  • Chernoff's inequality
  • Bernstein inequalities (probability theory)
    • Hoeffding's inequality
      Hoeffding's inequality
      In probability theory, Hoeffding's inequality provides an upper bound on the probability for the sum of random variables to deviate from its expected value. Hoeffding's inequality was proved by Wassily Hoeffding. Let X_1, \dots, X_n ...

  • Kolmogorov's inequality
    Kolmogorov's inequality
    In probability theory, Kolmogorov's inequality is a so-called "maximal inequality" that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound...

  • Etemadi's inequality
    Etemadi's inequality
    In probability theory, Etemadi's inequality is a so-called "maximal inequality", an inequality that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound...

  • Khintchine inequality
    Khintchine inequality
    In mathematics, the Khintchine inequality, named after Aleksandr Khinchin and spelled in multiple ways in the Roman alphabet, is a theorem from probability, and is also frequently used in analysis...

  • Paley–Zygmund inequality
    Paley–Zygmund inequality
    In mathematics, the Paley–Zygmund inequality bounds the probability that a positive random variable is small, in terms of its mean and variance...

  • Laws of large numbers
    • Asymptotic equipartition property
      Asymptotic equipartition property
      In information theory the asymptotic equipartition property is a general property of the output samples of a stochastic source. It is fundamental to the concept of typical set used in theories of compression....

    • Typical set
      Typical set
      In information theory, the typical set is a set of sequences whose probability is close to two raised to the negative power of the entropy of their source distribution. That this set has total probability close to one is a consequence of the asymptotic equipartition property which is a kind of law...

    • Law of large numbers
      Law of large numbers
      In probability theory, the law of large numbers is a theorem that describes the result of performing the same experiment a large number of times...
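
A minimal simulation sketch of the statement (coin-flip model, sample sizes, and seed are illustrative choices, not from the source): as the number of trials grows, the sample mean settles near the true success probability.

```python
import random

def running_mean(flips):
    """Sample mean of a 0/1 sequence -- the quantity the law of
    large numbers says converges to the true success probability."""
    return sum(flips) / len(flips)

rng = random.Random(42)
small = [rng.random() < 0.5 for _ in range(100)]       # 100 fair coin flips
large = [rng.random() < 0.5 for _ in range(100000)]    # 100,000 fair coin flips

# With high probability the large sample's mean is within 0.01 of 0.5;
# the small sample's mean can easily stray much further.
assert abs(running_mean(large) - 0.5) < 0.01
```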

  • Random field
    Random field
    A random field is a generalization of a stochastic process such that the underlying parameter need no longer be a simple real or integer valued "time", but can instead take values that are multidimensional vectors, or points on some manifold....

    • Conditional random field
      Conditional random field
      A conditional random field is a statistical modelling method often applied in pattern recognition.More specifically it is a type of discriminative undirected probabilistic graphical model. It is used to encode known relationships between observations and construct consistent interpretations...

  • Borel–Cantelli lemma
    Borel–Cantelli lemma
    In probability theory, the Borel–Cantelli lemma is a theorem about sequences of events. In general, it is a result in measure theory. It is named after Émile Borel and Francesco Paolo Cantelli...

  • Wick product
    Wick product
    In probability theory, the Wick product ⟨X_1, ..., X_k⟩, named after physicist Gian-Carlo Wick, is a sort of product of the random variables X_1, ..., X_k, defined recursively starting from ⟨⟩ = 1...


Conditional probability
Conditional probability
In probability theory, the "conditional probability of A given B" is the probability of A if B is known to occur. It is commonly notated P(A|B), and sometimes P_B(A). P(A|B) can be visualised as the probability of event A when the sample space is restricted to event B...

  • Conditioning (probability)
    Conditioning (probability)
    Beliefs depend on the available information. This idea is formalized in probability theory by conditioning. Conditional probabilities, conditional expectations and conditional distributions are treated on three levels: discrete probabilities, probability density functions, and measure theory...

  • Conditional expectation
    Conditional expectation
    In probability theory, a conditional expectation is the expected value of a real random variable with respect to a conditional probability distribution....

  • Conditional probability distribution
  • Regular conditional probability
    Regular conditional probability
    Regular conditional probability is a concept that has developed to overcome certain difficulties in formally defining conditional probabilities for continuous probability distributions...

  • Disintegration theorem
    Disintegration theorem
    In mathematics, the disintegration theorem is a result in measure theory and probability theory. It rigorously defines the idea of a non-trivial "restriction" of a measure to a measure zero subset of the measure space in question. It is related to the existence of conditional probability measures...

  • Bayes' theorem
    Bayes' theorem
    In probability theory and applications, Bayes' theorem relates the conditional probabilities P(A|B) and P(B|A). It is commonly used in science and engineering. The theorem is named for Thomas Bayes...
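
The theorem states P(A|B) = P(B|A)·P(A)/P(B). A short numeric sketch (the diagnostic-test figures are hypothetical, chosen only to illustrate the base-rate effect):

```python
def bayes(prior, likelihood, evidence):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood * prior / evidence

# Hypothetical diagnostic test (illustrative numbers only):
# prevalence P(D) = 0.01, sensitivity P(+|D) = 0.99,
# false-positive rate P(+|not D) = 0.05.
p_d = 0.01
p_pos_given_d = 0.99
p_pos_given_not_d = 0.05

# Total probability of a positive result.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Probability of disease given a positive test: about 1/6, despite
# the 99% sensitivity, because the disease is rare.
posterior = bayes(p_d, p_pos_given_d, p_pos)
```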

  • de Finetti's theorem
    De Finetti's theorem
    In probability theory, de Finetti's theorem explains why exchangeable observations are conditionally independent given some latent variable to which an epistemic probability distribution would then be assigned...

    • Exchangeable random variables
  • Rule of succession
    Rule of succession
    In probability theory, the rule of succession is a formula introduced in the 18th century by Pierre-Simon Laplace in the course of treating the sunrise problem....
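
Laplace's formula gives (s + 1)/(n + 2) as the probability of success on the next trial after s successes in n trials; a one-function sketch (exact rationals used only for clarity):

```python
from fractions import Fraction

def rule_of_succession(successes, trials):
    """Laplace's rule: after s successes in n trials, estimate the
    probability of success on the next trial as (s + 1) / (n + 2)."""
    return Fraction(successes + 1, trials + 2)

# Sunrise problem: the sun has risen on all 10 observed days, so the
# rule assigns probability 11/12 to it rising tomorrow.
assert rule_of_succession(10, 10) == Fraction(11, 12)
```

With no data at all the rule returns 1/2, the uniform prior's prediction.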

  • Conditional independence
    Conditional independence
    In probability theory, two events R and B are conditionally independent given a third event Y precisely if the occurrence or non-occurrence of R and the occurrence or non-occurrence of B are independent events in their conditional probability distribution given Y...

  • Conditional event algebra
    Conditional event algebra
    A conditional event algebra is an algebraic structure whose domain consists of logical objects described by statements of forms such as "If A, then B," "B, given A," and "B, in case A." Unlike the standard Boolean algebra of events, a CEA allows the defining of a probability function, P, which...

    • Goodman–Nguyen–van Fraassen algebra
      Goodman–Nguyen–van Fraassen algebra
      A Goodman–Nguyen–van Fraassen algebra is a type of conditional event algebra that embeds the standard Boolean algebra of unconditional events in a larger algebra which is itself Boolean...


Probability distributions
Probability distribution
In probability theory, a probability mass, probability density, or probability distribution is a function that describes the probability of a random variable taking certain values....

  • Probability distribution function
    Probability distribution function
    Depending upon which text is consulted, a probability distribution function is any of: a probability distribution function, a cumulative distribution function, a probability mass function, or a probability density function....

  • Quantile
    Quantile
    Quantiles are points taken at regular intervals from the cumulative distribution function of a random variable. Dividing ordered data into q essentially equal-sized data subsets is the motivation for q-quantiles; the quantiles are the data values marking the boundaries between consecutive subsets...

  • Moment (mathematics)
    Moment (mathematics)
    In mathematics, a moment is, loosely speaking, a quantitative measure of the shape of a set of points. The "second moment", for example, is widely used and measures the "width" of a set of points in one dimension or in higher dimensions measures the shape of a cloud of points as it could be fit by...

    • Moment about the mean
    • Standardized moment
      • Skewness
        Skewness
        In probability theory and statistics, skewness is a measure of the asymmetry of the probability distribution of a real-valued random variable. The skewness value can be positive or negative, or even undefined...

      • Kurtosis
        Kurtosis
        In probability theory and statistics, kurtosis is any measure of the "peakedness" of the probability distribution of a real-valued random variable...

      • Locality
    • Cumulant
      Cumulant
      In probability theory and statistics, the cumulants κn of a probability distribution are a set of quantities that provide an alternative to the moments of the distribution. The moments determine the cumulants in the sense that any two probability distributions whose moments are identical will have...

    • Factorial moment
    • Expected value
      Expected value
      In probability theory, the expected value of a random variable is the weighted average of all possible values that this random variable can take on...

      • Law of the unconscious statistician
        Law of the unconscious statistician
        In probability theory and statistics, the law of the unconscious statistician is a theorem used to calculate the expected value of a function g(X) of a random variable X when one knows the probability distribution of X but one does not explicitly know the distribution of g(X). The form of the law can...
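
The law says E[g(X)] = Σ g(x)·P(X = x) for discrete X, with no need to derive the distribution of g(X) itself. A minimal sketch (the fair-die pmf is an illustrative choice):

```python
def expectation(pmf, g=lambda x: x):
    """E[g(X)] computed directly from the pmf of X -- the law of the
    unconscious statistician: sum g(x) * P(X = x) over the support."""
    return sum(g(x) * p for x, p in pmf.items())

# Fair six-sided die.
die = {x: 1 / 6 for x in range(1, 7)}

mean = expectation(die)                             # E[X] = 3.5
second_moment = expectation(die, lambda x: x * x)   # E[X^2] = 91/6
```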

    • Second moment method
      Second moment method
      In mathematics, the second moment method is a technique used in probability theory and analysis to show that a random variable has positive probability of being positive...

    • Variance
      Variance
      In probability theory and statistics, the variance is a measure of how far a set of numbers is spread out. It is one of several descriptors of a probability distribution, describing how far the numbers lie from the mean . In particular, the variance is one of the moments of a distribution...

      • Coefficient of variation
        Coefficient of variation
        In probability theory and statistics, the coefficient of variation is a normalized measure of dispersion of a probability distribution. It is also known as unitized risk or the variation coefficient. The absolute value of the CV is sometimes known as relative standard deviation , which is...

      • Variance-to-mean ratio
    • Covariance function
      Covariance function
      In probability theory and statistics, covariance is a measure of how much two variables change together and the covariance function describes the variance of a random variable process or field...

    • An inequality on location and scale parameters
    • Taylor expansions for the moments of functions of random variables
      Taylor expansions for the moments of functions of random variables
      In probability theory, it is possible to approximate the moments of a function f of a random variable X using Taylor expansions, provided that f is sufficiently differentiable and that the moments of X are finite...
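
The second-order expansion gives E[f(X)] ≈ f(μ) + f″(μ)·Var(X)/2. A sketch comparing this against simulation (the choice f = log and X ~ N(10, 0.25) is illustrative, not from the source):

```python
import math
import random

def taylor_mean(f, fpp, mu, var):
    """Second-order Taylor approximation to a moment:
    E[f(X)] ~ f(mu) + f''(mu) * Var(X) / 2."""
    return f(mu) + 0.5 * fpp(mu) * var

# Illustrative choice: f = log, f'' = -1/x^2, X ~ Normal(10, 0.25).
mu, var = 10.0, 0.25
approx = taylor_mean(math.log, lambda x: -1.0 / (x * x), mu, var)

# Monte Carlo check of E[log X].
rng = random.Random(1)
samples = [math.log(rng.gauss(mu, var ** 0.5)) for _ in range(50000)]
empirical = sum(samples) / len(samples)

assert abs(approx - empirical) < 0.01
```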

    • Moment problem
  • Prior probability distribution
  • Total variation distance
    Total variation
    In mathematics, the total variation identifies several slightly different concepts, related to the structure of the codomain of a function or a measure...

  • Hellinger distance
    Hellinger distance
    In probability and statistics, the Hellinger distance is used to quantify the similarity between two probability distributions. It is a type of f-divergence...

  • Wasserstein metric
    Wasserstein metric
    In mathematics, the Wasserstein metric is a distance function defined between probability distributions on a given metric space M....

  • Lévy–Prokhorov metric
    • Lévy metric
      Lévy metric
      In mathematics, the Lévy metric is a metric on the space of cumulative distribution functions of one-dimensional random variables. It is a special case of the Lévy–Prokhorov metric, and is named after the French mathematician Paul Pierre Lévy...

  • Continuity correction
    Continuity correction
    In probability theory, if a random variable X has a binomial distribution with parameters n and p, i.e., X is distributed as the number of "successes" in n independent Bernoulli trials with probability p of success on each trial, then...
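
The correction replaces the cut point k by k + ½ when approximating P(X ≤ k) with a normal CDF. A sketch (n = 100, p = 0.5 chosen only as an illustration) showing the corrected approximation is closer to the exact binomial probability:

```python
import math
from statistics import NormalDist

def binomial_cdf(k, n, p):
    """Exact P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def normal_approx(k, n, p, correction=True):
    """Normal approximation to P(X <= k); with the continuity
    correction the cut point k becomes k + 0.5."""
    cut = k + 0.5 if correction else k
    return NormalDist(n * p, math.sqrt(n * p * (1 - p))).cdf(cut)

exact = binomial_cdf(45, 100, 0.5)
with_corr = normal_approx(45, 100, 0.5)
without_corr = normal_approx(45, 100, 0.5, correction=False)

# The corrected approximation is markedly closer to the exact value.
assert abs(with_corr - exact) < abs(without_corr - exact)
```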

  • Heavy-tailed distribution
    Heavy-tailed distribution
    In probability theory, heavy-tailed distributions are probability distributions whose tails are not exponentially bounded: that is, they have heavier tails than the exponential distribution...

  • Truncated distribution
    Truncated distribution
    In statistics, a truncated distribution is a conditional distribution that results from restricting the domain of some other probability distribution. Truncated distributions arise in practical statistics in cases where the ability to record, or even to know about, occurrences is limited to values...

  • Infinite divisibility
    Infinite divisibility
    The concept of infinite divisibility arises in different ways in philosophy, physics, economics, order theory , and probability theory...

  • Stability (probability)
    Stability (probability)
    In probability theory, the stability of a random variable is the property that a linear combination of two independent copies of the variable has the same distribution, up to location and scale parameters. The distributions of random variables having this property are said to be "stable...

  • Indecomposable distribution
    Indecomposable distribution
    In probability theory, an indecomposable distribution is a probability distribution that cannot be represented as the distribution of the sum of two or more non-constant independent random variables: Z ≠ X + Y. If it can be so expressed, it is decomposable:...

  • Power law
    Power law
    A power law is a special kind of mathematical relationship between two quantities. When the frequency of an event varies as a power of some attribute of that event , the frequency is said to follow a power law. For instance, the number of cities having a certain population size is found to vary...

  • Anderson's theorem
    Anderson's theorem
    In mathematics, Anderson's theorem is a result in real analysis and geometry which says that the integral of an integrable, symmetric, unimodal, non-negative function f over an n-dimensional convex body K does not decrease if K is translated inwards towards the origin...


Discrete probability distributions

  • Bose–Einstein statistics
  • Fermi–Dirac statistics
    Fermi–Dirac statistics
    Fermi–Dirac statistics is a part of the science of physics that describes the energies of single particles in a system comprising many identical particles that obey the Pauli Exclusion Principle...

  • Bernoulli distribution
    • Bernoulli trial
      Bernoulli trial
      In the theory of probability and statistics, a Bernoulli trial is an experiment whose outcome is random and can be either of two possible outcomes, "success" and "failure"....

  • Binomial distribution
    • Binomial probability
      Binomial probability
      Binomial probability typically deals with the probability of several successive decisions, each of which has two possible outcomes...

  • Coupon collector's problem
    Coupon collector's problem
    In probability theory, the coupon collector's problem describes the "collect all coupons and win" contests. It asks the following question: Suppose that there are n coupons, from which coupons are being collected with replacement...
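
The expected number of draws to see all n coupons is n·H_n, where H_n is the n-th harmonic number. A simulation sketch (coupon count, trial count, and seed are illustrative choices):

```python
import random

def expected_draws(n):
    """E[T] = n * H_n: expected draws to collect all n coupon types."""
    return n * sum(1.0 / k for k in range(1, n + 1))

def simulate(n, rng):
    """Draw coupons uniformly with replacement until all n are seen."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

n = 10
rng = random.Random(0)
avg = sum(simulate(n, rng) for _ in range(5000)) / 5000

# n * H_10 is about 29.29; the simulated average should land nearby.
assert abs(avg - expected_draws(n)) < 1.5
```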

  • Degenerate distribution
  • Dirichlet distribution
  • Geometric distribution
  • Graphical model
    Graphical model
    A graphical model is a probabilistic model for which a graph denotes the conditional independence structure between random variables. They are commonly used in probability theory, statistics—particularly Bayesian statistics—and machine learning....

  • Hypergeometric distribution
  • Maxwell–Boltzmann statistics
  • Multinomial distribution
  • Negative binomial distribution
    Negative binomial distribution
    In probability theory and statistics, the negative binomial distribution is a discrete probability distribution of the number of successes in a sequence of Bernoulli trials before a specified number of failures occur...

  • Negative hypergeometric distribution
  • Poisson distribution
    Poisson distribution
    In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time and/or space if these events occur with a known average rate and independently of the time since...

    • Compound Poisson distribution
      Compound Poisson distribution
      In probability theory, a compound Poisson distribution is the probability distribution of the sum of a "Poisson-distributed number" of independent identically-distributed random variables...
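
A simulation sketch of S = X_1 + ... + X_N with N ~ Poisson(λ) and i.i.d. X_i, using Wald's identity E[S] = λ·E[X] as the check (the small-λ Poisson sampler and the exponential severities are illustrative choices, not from the source):

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's inverse-product method for a Poisson(lam) variate
    (adequate for small lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def compound_poisson(lam, severity, rng):
    """One draw of S = X_1 + ... + X_N, N ~ Poisson(lam), X_i i.i.d.
    from `severity`; by Wald's identity E[S] = lam * E[X]."""
    return sum(severity(rng) for _ in range(poisson_draw(lam, rng)))

rng = random.Random(5)
lam = 3.0
# Exponential severities with mean 2, so E[S] = 3 * 2 = 6.
draws = [compound_poisson(lam, lambda r: r.expovariate(0.5), rng)
         for _ in range(20000)]
assert abs(sum(draws) / len(draws) - 6.0) < 0.2
```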

  • Poisson binomial distribution
  • (a,b,0) class of distributions
    (a,b,0) class of distributions
    In probability theory, the distribution of a discrete random variable N is said to be a member of the (a, b, 0) class of distributions if its probability mass function obeys p_k / p_{k−1} = a + b/k, where p_k = P(N = k)....


Continuous probability distributions


Properties of probability distributions

  • Central limit theorem
    Central limit theorem
    In probability theory, the central limit theorem states conditions under which the mean of a sufficiently large number of independent random variables, each with finite mean and variance, will be approximately normally distributed. The central limit theorem has a number of variants. In its common...

    • Illustration of the central limit theorem
      Illustration of the central limit theorem
      This article gives two concrete illustrations of the central limit theorem. Both involve the sum of independent and identically-distributed random variables and show how the probability distribution of the sum approaches the normal distribution as the number of terms in the sum increases.The first...
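
A compact concrete illustration in the same spirit (Uniform(0, 1) summands, n = 30 terms, and the seed are illustrative choices): the standardized sum behaves like a standard normal variable.

```python
import random
from statistics import NormalDist, mean, stdev

def standardized_sum(n, rng):
    """Sum of n Uniform(0,1) draws, centred at n/2 and scaled by the
    standard deviation sqrt(n/12) of the sum."""
    s = sum(rng.random() for _ in range(n))
    return (s - n * 0.5) / (n / 12.0) ** 0.5

rng = random.Random(7)
samples = [standardized_sum(30, rng) for _ in range(20000)]

# Approximately standard normal: mean near 0, standard deviation near 1,
# and roughly 68% of mass within one standard deviation.
within_one = sum(1 for x in samples if abs(x) <= 1) / len(samples)
assert abs(mean(samples)) < 0.05
assert abs(stdev(samples) - 1.0) < 0.05
assert abs(within_one - 0.6827) < 0.02
```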

    • Concrete illustration of the central limit theorem
    • Berry–Esséen theorem
      Berry–Esséen theorem
      The central limit theorem in probability theory and statistics states that under certain circumstances the sample mean, considered as a random quantity, becomes more normally distributed as the sample size is increased...

    • De Moivre–Laplace theorem
      De Moivre–Laplace theorem
      In probability theory, the de Moivre–Laplace theorem is a normal approximation to the binomial distribution. It is a special case of the central limit theorem...

    • Lyapunov's central limit theorem
    • Martingale central limit theorem
      Martingale central limit theorem
      In probability theory, the central limit theorem says that, under certain conditions, the sum of many independent identically-distributed random variables, when scaled appropriately, converges in distribution to a standard normal distribution...

    • Infinite divisibility (probability)
      Infinite divisibility (probability)
      The concepts of infinite divisibility and the decomposition of distributions arise in probability and statistics in relation to seeking families of probability distributions that might be a natural choice in certain applications, in the same way that the normal distribution is...

    • Method of moments (probability theory)
    • Stability (probability)
      Stability (probability)
      In probability theory, the stability of a random variable is the property that a linear combination of two independent copies of the variable has the same distribution, up to location and scale parameters. The distributions of random variables having this property are said to be "stable...

    • Stein's lemma
      Stein's lemma
      Stein's lemma, named in honor of Charles Stein, is a theorem of probability theory that is of interest primarily because of its applications to statistical inference — in particular, to James–Stein estimation and empirical Bayes methods — and its applications to portfolio choice...

  • Characteristic function (probability theory)
    Characteristic function (probability theory)
    In probability theory and statistics, the characteristic function of any random variable completely defines its probability distribution. Thus it provides the basis of an alternative route to analytical results compared with working directly with probability density functions or cumulative...

    • Lévy continuity theorem
      Lévy continuity theorem
      In probability theory, Lévy's continuity theorem, named after the French mathematician Paul Lévy, connects convergence in distribution of a sequence of random variables with pointwise convergence of their characteristic functions...

  • Edgeworth series
    Edgeworth series
    The Gram–Charlier A series , and the Edgeworth series are series that approximate a probability distribution in terms of its cumulants...

  • Helly–Bray theorem
  • Location parameter
    Location parameter
    In statistics, a location family is a class of probability distributions that is parametrized by a scalar- or vector-valued parameter μ, which determines the "location" or shift of the distribution...

  • Maxwell's theorem
    Maxwell's theorem
    In probability theory, Maxwell's theorem, named in honor of James Clerk Maxwell, states that if the probability distribution of a vector-valued random variable X = (X_1, ..., X_n)^T is the same as the distribution of GX for every n×n orthogonal matrix G, and the components are independent, then the components...

  • Moment-generating function
    Moment-generating function
    In probability theory and statistics, the moment-generating function of any random variable is an alternative definition of its probability distribution. Thus, it provides the basis of an alternative route to analytical results compared with working directly with probability density functions or...

  • Negative probability
    Negative probability
    In 1942, Paul Dirac wrote a paper "The Physical Interpretation of Quantum Mechanics" where he introduced the concept of negative energies and negative probabilities:...

  • Probability-generating function
    Probability-generating function
    In probability theory, the probability-generating function of a discrete random variable is a power series representation of the probability mass function of the random variable...
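
For a non-negative integer random variable, G(s) = E[s^X] = Σ P(X = k)·s^k, and E[X] = G′(1). A sketch recovering the mean via a numerical derivative (the Binomial(3, ½) pmf and the step size are illustrative choices):

```python
def pgf(pmf, s):
    """G(s) = E[s^X] = sum over k of P(X = k) * s^k."""
    return sum(p * s**k for k, p in pmf.items())

def pgf_mean(pmf, h=1e-6):
    """E[X] = G'(1), here via a central finite difference (a numerical
    sketch, not a symbolic derivative)."""
    return (pgf(pmf, 1 + h) - pgf(pmf, 1 - h)) / (2 * h)

# X ~ Binomial(3, 1/2): G(s) = ((1 + s)/2)^3.
binom = {0: 1 / 8, 1: 3 / 8, 2: 3 / 8, 3: 1 / 8}

assert abs(pgf(binom, 1.0) - 1.0) < 1e-12   # G(1) = 1 always
assert abs(pgf_mean(binom) - 1.5) < 1e-6    # E[X] = np = 1.5
```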

  • Vysochanskij–Petunin inequality
    Vysochanskij–Petunin inequality
    In probability theory, the Vysochanskij–Petunin inequality gives a lower bound for the probability that a random variable with finite variance lies within a certain number of standard deviations of the variable's mean, or equivalently an upper bound for the probability that it lies further away....

  • Mutual information
    Mutual information
    In probability theory and information theory, the mutual information of two random variables is a quantity that measures the mutual dependence of the two random variables...

  • Kullback–Leibler divergence
  • Normally distributed and uncorrelated does not imply independent
    Normally distributed and uncorrelated does not imply independent
    In probability theory, two random variables being uncorrelated does not imply their independence. In some contexts, uncorrelatedness implies at least pairwise independence ....

  • Le Cam's theorem
    Le Cam's theorem
    In probability theory, Le Cam's theorem, named after Lucien Le Cam, is as follows. Suppose: X_1, ..., X_n are independent random variables, each with a Bernoulli distribution, not necessarily identically distributed; Pr(X_i = 1) = p_i for i = 1, 2, 3, ...; λ_n = p_1 + ⋯ + p_n; S_n = X_1...

  • Large deviations theory
    Large deviations theory
    In probability theory, the theory of large deviations concerns the asymptotic behaviour of remote tails of sequences of probability distributions. Some basic ideas of the theory can be tracked back to Laplace and Cramér, although a clear unified formal definition was introduced in 1966 by Varadhan...

    • Contraction principle (large deviations theory)
      Contraction principle (large deviations theory)
      In mathematics — specifically, in large deviations theory — the contraction principle is a theorem that states how a large deviation principle on one space "pushes forward" to a large deviation principle on another space via a continuous function. Let X and Y be...

    • Varadhan's lemma
      Varadhan's lemma
      In mathematics, Varadhan's lemma is a result in large deviations theory named after S. R. Srinivasa Varadhan. The result gives information on the asymptotic distribution of a statistic φ of a family of random variables Z_ε as ε becomes small, in terms of a rate function for the variables...

    • Tilted large deviation principle
      Tilted large deviation principle
      In mathematics — specifically, in large deviations theory — the tilted large deviation principle is a result that allows one to generate a new large deviation principle from an old one by "tilting", i.e. integration against an exponential functional...

    • Rate function
      Rate function
      In mathematics — specifically, in large deviations theory — a rate function is a function used to quantify the probabilities of rare events. It is required to have several "nice" properties which assist in the formulation of the large deviation principle...

    • Laplace principle (large deviations theory)
      Laplace principle (large deviations theory)
      In mathematics, Laplace's principle is a basic theorem in large deviations theory, similar to Varadhan's lemma. It gives an asymptotic expression for the Lebesgue integral of exp over a fixed set A as θ becomes large...

    • Exponentially equivalent measures
      Exponentially equivalent measures
      In mathematics, the notion of exponential equivalence of measures is a concept that describes how two sequences or families of probability measures are "the same" from the point of view of large deviations theory...

    • Cramér's theorem
      Cramér's theorem
      In mathematical statistics, Cramér's theorem is one of several theorems of Harald Cramér, a Swedish statistician and probabilist...

    • Cramér's theorem (second part)

Applied probability
Applied probability
Much research involving probability is done under the auspices of applied probability, the application of probability theory to other scientific and engineering domains...

  • Empirical findings
    • Benford's law
      Benford's law
      Benford's law, also called the first-digit law, states that in lists of numbers from many real-life sources of data, the leading digit is distributed in a specific, non-uniform way...
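
The law gives P(leading digit = d) = log₁₀(1 + 1/d). A sketch checking it against the leading digits of powers of 2, a classic Benford-distributed sequence (the sequence length is an illustrative choice):

```python
import math

def benford_pmf():
    """Benford's law: P(leading digit = d) = log10(1 + 1/d), d = 1..9."""
    return {d: math.log10(1 + 1.0 / d) for d in range(1, 10)}

def leading_digit(x):
    """First decimal digit of a positive integer."""
    return int(str(x)[0])

# Leading digits of 2^1 .. 2^1000 follow Benford's law closely.
digits = [leading_digit(2**k) for k in range(1, 1001)]
freq = {d: digits.count(d) / len(digits) for d in range(1, 10)}
pmf = benford_pmf()

assert all(abs(freq[d] - pmf[d]) < 0.02 for d in range(1, 10))
```

Digit 1 leads about 30.1% of the time, far more than the uniform 1/9 ≈ 11.1%.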

    • Pareto principle
      Pareto principle
      The Pareto principle states that, for many events, roughly 80% of the effects come from 20% of the causes.Business-management consultant Joseph M...

    • Zipf's law

Stochastic processes
Stochastic process
In probability theory, a stochastic process, or sometimes random process, is the counterpart to a deterministic process...

  • Adapted process
    Adapted process
    In the study of stochastic processes, an adapted process is one that cannot "see into the future". An informal interpretation is that X is adapted if and only if, for every realisation and every n, Xn is known at time n...

  • Bernoulli process
    Bernoulli process
    In probability and statistics, a Bernoulli process is a finite or infinite sequence of binary random variables, so it is a discrete-time stochastic process that takes only two values, canonically 0 and 1. The component Bernoulli variables Xi are identical and independent...

    • Bernoulli scheme
      Bernoulli scheme
      In mathematics, the Bernoulli scheme or Bernoulli shift is a generalization of the Bernoulli process to more than two possible outcomes. Bernoulli schemes are important in the study of dynamical systems, as most such systems exhibit a repellor that is the product of the Cantor set and a smooth...

  • Branching process
    Branching process
    In probability theory, a branching process is a Markov process that models a population in which each individual in generation n produces some random number of individuals in generation n + 1, according to a fixed probability distribution that does not vary from individual to...

  • Point process
    Point process
    In statistics and probability theory, a point process is a type of random process for which any one realisation consists of a set of isolated points either in time or geographical space, or in even more general spaces...

  • Wiener process
    Wiener process
    In mathematics, the Wiener process is a continuous-time stochastic process named in honor of Norbert Wiener. It is often called standard Brownian motion, after Robert Brown...

    • Brownian motion
      Brownian motion
      Brownian motion or pedesis is the presumably random drifting of particles suspended in a fluid or the mathematical model used to describe such random movements, which is often called a particle theory.The mathematical model of Brownian motion has several real-world applications...

    • Geometric Brownian motion
      Geometric Brownian motion
      A geometric Brownian motion is a continuous-time stochastic process in which the logarithm of the randomly varying quantity follows a Brownian motion, also called a Wiener process...

    • Donsker's theorem
      Donsker's theorem
      In probability theory, Donsker's theorem, named after M. D. Donsker, identifies a certain stochastic process as a limit of empirical processes. It is sometimes called the functional central limit theorem....

    • Empirical process
      Empirical process
      The study of empirical processes is a branch of mathematical statistics and a sub-area of probability theory. It is a generalization of the central limit theorem for empirical measures...

    • Wiener equation
    • Wiener sausage
      Wiener sausage
      In the mathematical field of probability, the Wiener sausage is a neighborhood of the trace of a Brownian motion up to a time t, given by taking all points within a fixed distance of Brownian motion. It can be visualized as a sausage of fixed radius whose centerline is Brownian motion...

  • Chapman–Kolmogorov equation
  • Chinese restaurant process
  • Coupling (probability)
    Coupling (probability)
    In probability theory, coupling is a proof technique that allows one to compare two unrelated variables by "forcing" them to be related in some way...

  • Ergodic theory
    Ergodic theory
    Ergodic theory is a branch of mathematics that studies dynamical systems with an invariant measure and related problems. Its initial development was motivated by problems of statistical physics....

    • Maximal ergodic theorem
    • Ergodic (adjective)
      Ergodic (adjective)
      In mathematics, the term ergodic is used to describe a dynamical system which, broadly speaking, has the same behavior averaged over time as averaged over space. In physics the term is used to imply that a system satisfies the ergodic hypothesis of thermodynamics. The word ergodic is...

  • Galton–Watson process
  • Gauss–Markov process
  • Gaussian process
    Gaussian process
    In probability theory and statistics, a Gaussian process is a stochastic process whose realisations consist of random values associated with every point in a range of times such that each such random variable has a normal distribution...

    • Gaussian random field
      Gaussian random field
      A Gaussian random field is a random field involving Gaussian probability density functions of the variables. A one-dimensional GRF is also called a Gaussian process....

    • Gaussian isoperimetric inequality
    • Large deviations of Gaussian random functions
      Large deviations of Gaussian random functions
      A random function – of either one variable , or two or more variables – is called Gaussian if every finite-dimensional distribution is a multivariate normal distribution. Gaussian random fields on the sphere are useful when analysing* the anomalies in the cosmic microwave background...

  • Girsanov's theorem
  • Itô's lemma
    Itô's lemma
    In mathematics, Itō's lemma is used in Itō stochastic calculus to find the differential of a function of a particular type of stochastic process. It is named after its discoverer, Kiyoshi Itō...

  • Law of the iterated logarithm
    Law of the iterated logarithm
    In probability theory, the law of the iterated logarithm describes the magnitude of the fluctuations of a random walk. The original statement of the law of the iterated logarithm is due to A. Y. Khinchin . Another statement was given by A.N...

  • Lévy flight
    Lévy flight
    A Lévy flight is a random walk in which the step-lengths have a probability distribution that is heavy-tailed. When defined as a walk in a space of dimension greater than one, the steps made are in isotropic random directions...

  • Lévy process
    Lévy process
    In probability theory, a Lévy process, named after the French mathematician Paul Lévy, is any continuous-time stochastic process that starts at 0, admits càdlàg modification and has "stationary independent increments" — this phrase will be explained below...

  • Loop-erased random walk
    Loop-erased random walk
    In mathematics, loop-erased random walk is a model for a random simple path with important applications in combinatorics and, in physics, quantum field theory. It is intimately connected to the uniform spanning tree, a model for a random tree...

  • Markov chain
    Markov chain
    A Markov chain, named after Andrey Markov, is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states. It is a random process characterized as memoryless: the next state depends only on the current state and not on the...

    • Continuous-time Markov process
    • Examples of Markov chains
      Examples of Markov chains
      Board games played with dice: a game of snakes and ladders or any other game whose moves are determined entirely by dice is a Markov chain, indeed, an absorbing Markov chain. This is in contrast to card games such as blackjack, where the cards represent a 'memory' of the past moves. To see the...

    • Detailed balance
      Detailed balance
      The principle of detailed balance is formulated for kinetic systems which are decomposed into elementary processes: at equilibrium, each elementary process should be equilibrated by its reverse process....

    • Markov property
      Markov property
      In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process. It was named after the Russian mathematician Andrey Markov....

    • Hidden Markov model
      Hidden Markov model
      A hidden Markov model is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved states. An HMM can be considered as the simplest dynamic Bayesian network. The mathematics behind the HMM was developed by L. E...

    • Markov chain mixing time
      Markov chain mixing time
      In probability theory, the mixing time of a Markov chain is the time until the Markov chain is "close" to its steady state distribution. More precisely, a fundamental result about Markov chains is that a finite state irreducible aperiodic chain has a unique stationary distribution π and,...
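
The entries above can be illustrated with a minimal two-state chain (the transition probabilities below are made up for illustration). Because the chain is irreducible and aperiodic, long-run visit frequencies converge to its unique stationary distribution, here (5/6, 1/6) from solving π = πP:

```python
import random

# Two-state chain: state 0 and state 1, with illustrative transition
# probabilities (not taken from any article above).
P = [[0.9, 0.1],   # from state 0: stay with prob 0.9, move with 0.1
     [0.5, 0.5]]   # from state 1: move with prob 0.5, stay with 0.5

def simulate(steps, seed=0):
    rng = random.Random(seed)
    state, visits = 0, [0, 0]
    for _ in range(steps):
        # Memoryless transition: next state depends only on the current one.
        state = 0 if rng.random() < P[state][0] else 1
        visits[state] += 1
    return [v / steps for v in visits]

freqs = simulate(200_000)   # expect roughly [5/6, 1/6]
```

Solving the balance equation 0.1·π₀ = 0.5·π₁ with π₀ + π₁ = 1 gives π₀ = 5/6 ≈ 0.833, which the simulated frequencies approach.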

  • Martingale
    Martingale (probability theory)
    In probability theory, a martingale is a model of a fair game where no knowledge of past events can help to predict future winnings. In particular, a martingale is a sequence of random variables for which, at a particular time in the realized sequence, the expectation of the next value in the...

    • Doob martingale
      Doob martingale
      A Doob martingale is a mathematical construction of a stochastic process which approximates a given random variable and has the martingale property with respect to the given filtration...

    • Optional stopping theorem
      Optional stopping theorem
      In probability theory, the optional stopping theorem says that, under certain conditions, the expected value of a martingale at a stopping time is equal to its initial value...

    • Martingale representation theorem
      Martingale representation theorem
      In probability theory, the martingale representation theorem states that a random variable which is measurable with respect to the filtration generated by a Brownian motion can be written in terms of an Itô integral with respect to this Brownian motion....

    • Azuma's inequality
    • Wald's equation
      Wald's equation
      In probability theory, Wald's equation, Wald's identity or Wald's lemma is an important identity that simplifies the calculation of the expected value of the sum of a random number of random quantities...
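
Wald's equation, E[S_N] = E[N]·E[X], can be checked numerically for a random number N of i.i.d. summands with N independent of the summands (a sufficient condition for the identity; all distributions below are chosen only for illustration):

```python
import random

rng = random.Random(1)

def trial():
    # N uniform on {1,...,10}, so E[N] = 5.5; each summand is
    # Uniform(0, 2), so E[X] = 1.  Wald: E[S_N] = 5.5 * 1 = 5.5.
    n = rng.randint(1, 10)
    return sum(rng.uniform(0, 2) for _ in range(n))

m = 100_000
avg = sum(trial() for _ in range(m)) / m   # expect about 5.5
```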

  • Poisson process
    Poisson process
    A Poisson process, named after the French mathematician Siméon-Denis Poisson, is a stochastic process in which events occur continuously and independently of one another...

  • Population process
    Population process
    In applied probability, a population process is a Markov chain in which the state of the chain is analogous to the number of individuals in a population, and changes to the state are analogous to the addition or removal of individuals from the population. Although named by analogy to biological...

  • Process with independent increments
  • Progressively measurable process
    Progressively measurable process
    In mathematics, progressive measurability is a property of stochastic processes. A progressively measurable process is one for which events defined in terms of values of the process across a range of times can be assigned probabilities. Being progressively measurable is a strictly stronger...

  • Queueing theory
    Queueing theory
    Queueing theory is the mathematical study of waiting lines, or queues. The theory enables mathematical analysis of several related processes, including arriving at the queue, waiting in the queue, and being served at the front of the queue...

    • Erlang unit
      Erlang unit
      The erlang is a dimensionless unit that is used in telephony as a statistical measure of offered load or carried load on service-providing elements such as telephone circuits or telephone switching equipment. It is named after the Danish telephone engineer A. K...

  • Random walk
    Random walk
    A random walk, sometimes denoted RW, is a mathematical formalisation of a trajectory that consists of taking successive random steps. For example, the path traced by a molecule as it travels in a liquid or a gas, the search path of a foraging animal, the price of a fluctuating stock and the...
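
The simplest case, a symmetric walk on the integers with ±1 steps, already shows the key moments E[S_n] = 0 and Var(S_n) = n (step count and trial count below are arbitrary):

```python
import random

def walk(n, seed):
    """Final position of a symmetric +/-1 random walk after n steps."""
    rng = random.Random(seed)
    pos = 0
    for _ in range(n):
        pos += 1 if rng.random() < 0.5 else -1
    return pos

finals = [walk(100, s) for s in range(5000)]
mean = sum(finals) / len(finals)                 # expect about 0
var = sum(x * x for x in finals) / len(finals)   # expect about 100
```

After an even number of steps the position is always even, a small parity fact that is easy to forget when reasoning about walks.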

  • Random walk Monte Carlo
  • Skorokhod's embedding theorem
    Skorokhod's embedding theorem
    In mathematics and probability theory, Skorokhod's embedding theorem is either or both of two theorems that allow one to regard any suitable collection of random variables as a Wiener process evaluated at a collection of stopping times. Both results are named for the Ukrainian mathematician A.V...

  • Stationary process
    Stationary process
    In the mathematical sciences, a stationary process is a stochastic process whose joint probability distribution does not change when shifted in time or space...

  • Stochastic calculus
    Stochastic calculus
    Stochastic calculus is a branch of mathematics that operates on stochastic processes. It allows a consistent theory of integration to be defined for integrals of stochastic processes with respect to stochastic processes...

    • Itô calculus
      Ito calculus
      Itō calculus, named after Kiyoshi Itō, extends the methods of calculus to stochastic processes such as Brownian motion . It has important applications in mathematical finance and stochastic differential equations....

    • Malliavin calculus
      Malliavin calculus
      The Malliavin calculus, named after Paul Malliavin, is a theory of variational stochastic calculus. In other words, it provides the mechanics to compute derivatives of random variables....

    • Stratonovich integral
      Stratonovich integral
      In stochastic processes, the Stratonovich integral is a stochastic integral, the most common alternative to the Itō integral...

  • Time series analysis
    • Autoregressive model
      Autoregressive model
      In statistics and signal processing, an autoregressive model is a type of random process which is often used to model and predict various types of natural phenomena...
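
A first-order autoregressive process x_t = φ·x_{t−1} + ε_t with |φ| < 1 is stationary with variance σ²/(1 − φ²), which a simulation can confirm (parameter values below are illustrative):

```python
import random

def ar1(phi, sigma, n, seed=3):
    """Simulate n steps of an AR(1) process driven by Gaussian noise."""
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma)
        xs.append(x)
    return xs

xs = ar1(phi=0.8, sigma=1.0, n=200_000)
# Stationary variance is sigma^2 / (1 - phi^2) = 1 / 0.36, about 2.78.
var = sum(v * v for v in xs) / len(xs)
```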

    • Moving average model
      Moving average model
      In time series analysis, the moving-average model is a common approach for modeling univariate time series. The notation MA(q) refers to the moving average model of order q:...

    • Autoregressive moving average model
      Autoregressive moving average model
      In statistics and signal processing, autoregressive–moving-average models, sometimes called Box–Jenkins models after the iterative Box–Jenkins methodology usually used to estimate them, are typically applied to autocorrelated time series data. Given a time series of data Xt, the ARMA model is a...

    • Autoregressive integrated moving average model
    • Anomaly time series
      Anomaly time series
      In atmospheric sciences and some other applications of statistics, an anomaly time series is the time series of deviations of a quantity from some mean. Similarly a standardized anomaly series contains values of deviations divided by a standard deviation...

  • Renewal theory
    Renewal theory
    Renewal theory is the branch of probability theory that generalizes Poisson processes for arbitrary holding times. Applications include calculating the expected time for a monkey who is randomly tapping at a keyboard to type the word Macbeth and comparing the long-term benefits of different...


Geometric probability

  • Buffon's needle
    Buffon's needle
    In mathematics, Buffon's needle problem is a question first posed in the 18th century by Georges-Louis Leclerc, Comte de Buffon:Buffon's needle was the earliest problem in geometric probability to be solved; it can be solved using integral geometry...
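
Since the needle crosses a line with probability 2l/(πd) for needle length l ≤ line spacing d, counting crossings gives a Monte Carlo estimate of π (a classic exercise; the geometry below uses the standard reduction to the needle's centre distance and acute angle):

```python
import math
import random

def buffon(drops, l=1.0, d=2.0, seed=4):
    """Estimate pi from Buffon's needle with length l and spacing d (l <= d)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(drops):
        y = rng.uniform(0, d / 2)             # centre's distance to nearest line
        theta = rng.uniform(0, math.pi / 2)   # acute angle between needle and lines
        if y <= (l / 2) * math.sin(theta):    # needle reaches the line
            hits += 1
    # Crossing probability is 2l/(pi*d), so invert it to estimate pi.
    return 2 * l * drops / (d * hits)

pi_est = buffon(500_000)   # expect a value near 3.14
```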

  • Integral geometry
    Integral geometry
    In mathematics, integral geometry is the theory of measures on a geometrical space invariant under the symmetry group of that space. In more recent times, the meaning has been broadened to include a view of invariant transformations from the space of functions on one geometrical space to the...

  • Hadwiger's theorem
    Hadwiger's theorem
    In integral geometry, Hadwiger's theorem characterises the valuations on convex bodies in R^n. It was proved by Hugo Hadwiger...

  • Wendel's theorem
    Wendel's theorem
    In geometric probability theory, Wendel's theorem, named after James G. Wendel, gives the probability that N points distributed uniformly at random on an n-dimensional hypersphere all lie on the same "half" of the hypersphere...


Gambling
Gambling
Gambling is the wagering of money or something of material value on an event with an uncertain outcome with the primary intent of winning additional money and/or material goods...

  • Luck
    Luck
    Luck or fortuity is good fortune which occurs beyond one's control, without regard to one's will, intention, or desired result. There are at least two senses people usually mean when they use the term, the prescriptive sense and the descriptive sense...

  • Game of chance
    Game of chance
    A game of chance is a game whose outcome is strongly influenced by some randomizing device, and upon which contestants may or may not wager money or anything of monetary value...

  • Odds
    Odds
    The odds in favor of an event or a proposition are expressed as the ratio of a pair of integers, which is the ratio of the probability that an event will happen to the probability that it will not happen...

  • Gambler's fallacy
    Gambler's fallacy
    The Gambler's fallacy, also known as the Monte Carlo fallacy , and also referred to as the fallacy of the maturity of chances, is the belief that if deviations from expected behaviour are observed in repeated independent trials of some random process, future deviations in the opposite direction are...

  • Inverse gambler's fallacy
    Inverse gambler's fallacy
    The inverse gambler's fallacy, named by philosopher Ian Hacking, is a formal fallacy of Bayesian inference which is similar to the better known gambler's fallacy. It is the fallacy of concluding, on the basis of an unlikely outcome of a random process, that the process is likely to have occurred...

  • Parrondo's paradox
    Parrondo's paradox
    Parrondo's paradox, a paradox in game theory, has been described as: A losing strategy that wins. It is named after its creator, Spanish physicist Juan Parrondo, who discovered the paradox in 1996...

  • Pascal's wager
    Pascal's Wager
    Pascal's Wager, also known as Pascal's Gambit, is a suggestion posed by the French philosopher, mathematician, and physicist Blaise Pascal that even if the existence of God could not be determined through reason, a rational person should wager as though God exists, because one living life...

  • Gambler's ruin
    Gambler's ruin
    The term gambler's ruin is used for a number of related statistical ideas. The original meaning is that a gambler who raises his bet to a fixed fraction of bankroll when he wins, but does not reduce it when he loses, will eventually go broke, even if he has a positive expected value on each bet...
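
In the textbook version with a fair coin, a gambler who starts with capital i, bets one unit per round, and stops at 0 (ruin) or at a goal n, reaches the goal with probability exactly i/n; a short simulation agrees (the capitals below are arbitrary):

```python
import random

def reach_goal(i, n, rng):
    """True if a fair +/-1 walk from i hits n before hitting 0."""
    x = i
    while 0 < x < n:
        x += 1 if rng.random() < 0.5 else -1
    return x == n

rng = random.Random(5)
trials = 20_000
wins = sum(reach_goal(3, 10, rng) for _ in range(trials))
rate = wins / trials   # exact answer is 3/10
```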

  • Poker probability
    Poker probability
    In poker, the probability of each type of 5-card hand can be computed by calculating the proportion of hands of that type among all possible hands...

    • Poker probability (Omaha)
      Poker probability (Omaha)
      In poker, the probability of many events can be determined by direct calculation. This article discusses how to compute the probabilities for many commonly occurring events in the game of Omaha hold 'em and provides some probabilities and odds for specific situations...

    • Poker probability (Texas hold 'em)
      Poker probability (Texas hold 'em)
      In poker, the probability of many events can be determined by direct calculation. This article discusses computing probabilities for many commonly occurring events in the game of Texas hold 'em and provides some probabilities and odds for specific situations...

    • Pot odds
      Pot odds
      In poker, pot odds are the ratio of the current size of the pot to the cost of a contemplated call. Pot odds are often compared to the probability of winning a hand with a future card in order to estimate the call's expected value....

  • Roulette
    Roulette
    Roulette is a casino game named after a French diminutive for little wheel. In the game, players may choose to place bets on either a single number or a range of numbers, the colors red or black, or whether the number is odd or even....

    • Martingale (betting system)
      Martingale (betting system)
      Originally, martingale referred to a class of betting strategies popular in 18th century France. The simplest of these strategies was designed for a game in which the gambler wins his stake if a coin comes up heads and loses it if the coin comes up tails...

    • The man who broke the bank at Monte Carlo
      The Man Who Broke the Bank at Monte Carlo
      The Man Who Broke the Bank at Monte Carlo is a 1935 American romantic comedy film made by 20th Century Fox. It was directed by Stephen Roberts, and starred Ronald Colman, Joan Bennett, and Colin Clive. The screenplay was written by Nunnally Johnson and Howard Smith, based on a play by Ilya Surguchev...

  • Lottery
    Lottery
    A lottery is a form of gambling which involves the drawing of lots for a prize. Lotteries are outlawed by some governments, while others endorse them to the extent of organizing a national or state lottery. It is common to find some degree of regulation of lottery by governments...

    • Lottery machine
      Lottery machine
      A lottery machine is the machine used to draw the winning numbers for a lottery. Early lotteries were done by drawing numbers, or winning tickets, from a container...

    • Pachinko
      Pachinko
      Pachinko is a type of game originating in Japan, and used as both a form of recreational arcade game and much more frequently as a gambling device, filling a niche in gambling in Japan comparable to that of the slot machine in Western gambling. A pachinko machine resembles a vertical pinball machine, but...

  • Coherence (philosophical gambling strategy)
    Coherence (philosophical gambling strategy)
    In a thought experiment proposed by the Italian probabilist Bruno de Finetti in order to justify Bayesian probability, an array of wagers is coherent precisely if it does not expose the wagerer to certain loss regardless of the outcomes of events on which he is wagering, even if his opponent makes...

  • Coupon collector's problem
    Coupon collector's problem
    In probability theory, the coupon collector's problem describes the "collect all coupons and win" contests. It asks the following question: Suppose that there are n coupons, from which coupons are being collected with replacement...
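
With n equally likely coupon types, the expected number of draws needed to collect them all is n·H_n, where H_n is the n-th harmonic number; a simulation matches the formula (n and the trial count below are arbitrary):

```python
import random

def collect(n, rng):
    """Number of uniform draws until all n coupon types have been seen."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

n = 20
harmonic = sum(1 / k for k in range(1, n + 1))
expected = n * harmonic   # n * H_n, about 71.95 for n = 20
rng = random.Random(6)
avg = sum(collect(n, rng) for _ in range(5000)) / 5000
```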


Coincidence

  • Birthday paradox
    Birthday paradox
    In probability theory, the birthday problem or birthday paradox pertains to the probability that, in a set of n randomly chosen people, some pair of them will have the same birthday. By the pigeonhole principle, the probability reaches 100% when the number of people reaches 366. However, 99%...

    • Birthday problem
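
The collision probability has a simple exact form: multiply the chances that each successive person avoids all earlier birthdays, then subtract from 1 (this sketch assumes 365 equally likely birthdays and ignores leap years):

```python
def birthday_collision(n, days=365):
    """Probability that at least two of n people share a birthday."""
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (days - k) / days   # person k+1 avoids the first k birthdays
    return 1.0 - p_distinct

p23 = birthday_collision(23)   # about 0.507: past one half at just 23 people
```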
  • Index of coincidence
    Index of coincidence
    In cryptography, coincidence counting is the technique of putting two texts side-by-side and counting the number of times that identical letters appear in the same position in both texts...

  • Bible code
    Bible code
    The Bible code, also known as the Torah code, is a purported set of secret messages encoded within the text of the Hebrew Bible and describing prophecies and other guidance regarding the future. This hidden code has been described as a method by which specific letters from the text can be selected to...

  • Spurious relationship
    Spurious relationship
    In statistics, a spurious relationship is a mathematical relationship in which two events or variables have no direct causal connection, yet it may be wrongly inferred that they do, due to either coincidence or the presence of a certain third, unseen factor...


Algorithmics

  • Probable prime
    Probable prime
    In number theory, a probable prime is an integer that satisfies a specific condition also satisfied by all prime numbers. Different types of probable primes have different specific conditions...

  • Probabilistic algorithm = Randomised algorithm
  • Monte Carlo method
    Monte Carlo method
    Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling to compute their results. Monte Carlo methods are often used in computer simulations of physical and mathematical systems...
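
The textbook example estimates π as four times the fraction of uniform random points in the unit square that land inside the quarter circle:

```python
import random

def mc_pi(samples, seed=7):
    """Monte Carlo estimate of pi from uniform points in the unit square."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / samples

est = mc_pi(1_000_000)   # expect a value near 3.14
```

The error shrinks like 1/√N, which is why Monte Carlo shines in high dimensions where grid-based quadrature becomes infeasible.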

  • Las Vegas algorithm
    Las Vegas algorithm
    In computing, a Las Vegas algorithm is a randomized algorithm that always gives correct results; that is, it always produces the correct result or it reports failure. In other words, a Las Vegas algorithm does not gamble with the correctness of the result; it gambles only with the resources...

  • Probabilistic Turing machine
    Probabilistic Turing machine
    In computability theory, a probabilistic Turing machine is a non-deterministic Turing machine which randomly chooses between the available transitions at each point according to some probability distribution....

  • Stochastic programming
    Stochastic programming
    Stochastic programming is a framework for modeling optimization problems that involve uncertainty. Whereas deterministic optimization problems are formulated with known parameters, real world problems almost invariably include some unknown parameters. When the parameters are known only within...

  • Probabilistically checkable proof
  • Box–Muller transform
  • Metropolis algorithm
  • Gibbs sampling
    Gibbs sampling
    In statistics and in statistical physics, Gibbs sampling or a Gibbs sampler is an algorithm to generate a sequence of samples from the joint probability distribution of two or more random variables...
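
For a bivariate normal with unit variances and correlation ρ, each full conditional is itself normal, x | y ~ N(ρy, 1 − ρ²) and symmetrically, so alternately sampling the two conditionals yields draws from the joint distribution (ρ and the sample sizes below are illustrative):

```python
import math
import random

def gibbs(rho, n, burn=1000, seed=8):
    """Gibbs sampler for a standard bivariate normal with correlation rho."""
    rng = random.Random(seed)
    s = math.sqrt(1 - rho * rho)   # conditional standard deviation
    x = y = 0.0
    out = []
    for i in range(n + burn):
        x = rng.gauss(rho * y, s)  # sample x | y
        y = rng.gauss(rho * x, s)  # sample y | x
        if i >= burn:
            out.append((x, y))
    return out

draws = gibbs(0.6, 50_000)
corr = sum(x * y for x, y in draws) / len(draws)   # expect about 0.6
```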

  • Inverse transform sampling method
    Inverse transform sampling method
    Inverse transform sampling, also known as the inverse probability integral transform or inverse transformation method or Smirnov transform or even golden rule, is a basic method for pseudo-random number sampling, i.e. for generating sample numbers at random from any probability distribution given...
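
If U is Uniform(0,1) and F is a target distribution function, then F⁻¹(U) has distribution F. For Exponential(λ), F(x) = 1 − e^(−λx) inverts in closed form to F⁻¹(u) = −ln(1 − u)/λ:

```python
import math
import random

def sample_exponential(lam, n, seed=9):
    """Draw n Exponential(lam) samples by inverting the CDF."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

xs = sample_exponential(2.0, 200_000)
mean = sum(xs) / len(xs)   # expect about 1 / lam = 0.5
```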


Financial mathematics

  • Risk
    Risk
    Risk is the potential that a chosen action or activity will lead to a loss. The notion implies that a choice having an influence on the outcome exists. Potential losses themselves may also be called "risks"...

  • Value at risk
    Value at risk
    In financial mathematics and financial risk management, Value at Risk is a widely used risk measure of the risk of loss on a specific portfolio of financial assets...

  • Market risk
    Market risk
    Market risk is the risk that the value of a portfolio, either an investment portfolio or a trading portfolio, will decrease due to the change in value of the market risk factors. The four standard market risk factors are stock prices, interest rates, foreign exchange rates, and commodity prices...

  • Risk-neutral measure
    Risk-neutral measure
    In mathematical finance, a risk-neutral measure is a prototypical case of an equivalent martingale measure. It is heavily used in the pricing of financial derivatives due to the fundamental theorem of asset pricing, which implies that in a complete market a derivative's price is the discounted...

  • Volatility
    Volatility (finance)
    In finance, volatility is a measure for variation of price of a financial instrument over time. Historic volatility is derived from time series of past market prices...

  • Technical analysis
    Technical analysis
    In finance, technical analysis is a security analysis discipline for forecasting the direction of prices through the study of past market data, primarily price and volume. Behavioral economics and quantitative analysis incorporate technical analysis, which being an aspect of active management stands...

  • Kelly criterion
    Kelly criterion
    In probability theory, the Kelly criterion, or Kelly strategy or Kelly formula, or Kelly bet, is a formula used to determine the optimal size of a series of bets. In most gambling scenarios, and some investing scenarios under some simplifying assumptions, the Kelly strategy will do better than any...
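
For a bet paying b-to-1 with win probability p, the log-optimal Kelly stake is the fraction f* = p − (1 − p)/b of bankroll, clipped at zero when the edge is negative:

```python
def kelly_fraction(p, b):
    """Kelly stake fraction for win probability p and payout b-to-1."""
    return max(0.0, p - (1.0 - p) / b)

# Even-money bet (b = 1) won 60% of the time: stake 0.6 - 0.4 = 20%
# of bankroll each round.
f = kelly_fraction(p=0.6, b=1.0)
```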


Physics

  • Probability amplitude
    Probability amplitude
    In quantum mechanics, a probability amplitude is a complex number whose modulus squared represents a probability or probability density. For example, if the probability amplitude of a quantum state is α, the probability of measuring that state is |α|²...

  • Statistical physics
    Statistical physics
    Statistical physics is the branch of physics that uses methods of probability theory and statistics, and particularly the mathematical tools for dealing with large populations and approximations, in solving physical problems. It can describe a wide variety of fields with an inherently stochastic...

  • Boltzmann factor
    Boltzmann factor
    In physics, the Boltzmann factor is a weighting factor that determines the relative probability of a particle to be in a state i in a multi-state system in thermodynamic equilibrium at temperature T...

  • Feynman–Kac formula
    Feynman–Kac formula
    The Feynman–Kac formula, named after Richard Feynman and Mark Kac, establishes a link between parabolic partial differential equations and stochastic processes. It offers a method of solving certain PDEs by simulating random paths of a stochastic process. Conversely, an important class of...

  • Fluctuation theorem
    Fluctuation theorem
    The fluctuation theorem , which originated from statistical mechanics, deals with the relative probability that the entropy of a system which is currently away from thermodynamic equilibrium will increase or decrease over a given amount of time...

  • Information entropy
    Information entropy
    In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits...
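
For a discrete distribution with probabilities p_i, the Shannon entropy in bits is H = −Σ p_i log₂ p_i; a fair coin carries exactly 1 bit, a fair die log₂ 6 ≈ 2.585 bits:

```python
import math

def entropy(ps):
    """Shannon entropy in bits of a discrete distribution (zero terms skipped)."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

h_coin = entropy([0.5, 0.5])   # 1 bit
h_die = entropy([1 / 6] * 6)   # log2(6), about 2.585 bits
```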

  • Vacuum expectation value
    Vacuum expectation value
    In quantum field theory the vacuum expectation value of an operator is its average, expected value in the vacuum. The vacuum expectation value of an operator O is usually denoted by ⟨O⟩...

  • Cosmic variance
    Cosmic variance
    Cosmic variance is the statistical uncertainty inherent in observations of the universe at extreme distances. It is based on the idea that it is only possible to observe part of the universe at one particular time, so it is difficult to make statistical statements about cosmology on the scale of...

  • Negative probability
    Negative probability
    In 1942, Paul Dirac wrote a paper "The Physical Interpretation of Quantum Mechanics" where he introduced the concept of negative energies and negative probabilities:...

  • Gibbs state
    Gibbs state
    In probability theory and statistical mechanics, a Gibbs state is an equilibrium probability distribution which remains invariant under future evolution of the system...

  • Master equation
    Master equation
    In physics and chemistry and related fields, master equations are used to describe the time-evolution of a system that can be modelled as being in exactly one of a countable number of states at any given time, and where switching between states is treated probabilistically...

  • Partition function (mathematics)
    Partition function (mathematics)
    The partition function or configuration integral, as used in probability theory, information science and dynamical systems, is an abstraction of the definition of a partition function in statistical mechanics. It is a special case of a normalizing constant in probability theory, for the Boltzmann...

  • Quantum probability
    Quantum probability
    Quantum probability was developed in the 1980s as a noncommutative analog of the Kolmogorovian theory of stochastic processes. One of its aims is to clarify the mathematical foundations of quantum theory and its statistical interpretation....


Genetics
Genetics
Genetics, a discipline of biology, is the science of genes, heredity, and variation in living organisms....

  • Punnett square
    Punnett square
    The Punnett square is a diagram that is used to predict an outcome of a particular cross or breeding experiment. It is named after Reginald C. Punnett, who devised the approach, and is used by biologists to determine the probability of an offspring's having a particular genotype...
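
A Punnett square for a monohybrid cross is just an enumeration of gamete combinations; for Aa × Aa it gives the familiar 1:2:1 genotype ratio (a minimal sketch, representing each parent by its two alleles as a string):

```python
from collections import Counter
from itertools import product

def punnett(parent1, parent2):
    """Genotype counts for a cross, e.g. punnett('Aa', 'Aa')."""
    # Each cell of the square pairs one allele from each parent; sorting
    # makes 'Aa' and 'aA' count as the same genotype.
    combos = (''.join(sorted(a + b)) for a, b in product(parent1, parent2))
    return Counter(combos)

counts = punnett("Aa", "Aa")   # 1 AA : 2 Aa : 1 aa
```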

  • Hardy–Weinberg principle
  • Ewens's sampling formula
    Ewens's sampling formula
    In population genetics, Ewens's sampling formula describes the probabilities associated with counts of how many different alleles are observed a given number of times in the sample...

  • Population genetics
    Population genetics
    Population genetics is the study of allele frequency distribution and change under the influence of the four main evolutionary processes: natural selection, genetic drift, mutation and gene flow. It also takes into account the factors of recombination, population subdivision and population...

The source of this article is wikipedia, the free encyclopedia.  The text of this article is licensed under the GFDL.
 