Central moment

In probability theory and statistics, central moments form one set of values by which the properties of a probability distribution can be usefully characterised. Central moments are used in preference to ordinary moments because the higher-order central moments relate only to the spread and shape of the distribution, rather than to its location.

Sets of central moments can be defined for both univariate and multivariate distributions.

Univariate moments

The kth moment about the mean (or kth central moment) of a real-valued random variable X is the quantity μ_k := E[(X − E[X])^k], where E is the expectation operator. For a continuous univariate probability distribution with probability density function f(x), the kth moment about the mean μ is

  μ_k = E[(X − μ)^k] = ∫_{−∞}^{∞} (x − μ)^k f(x) dx.

For random variables that have no mean, such as the Cauchy distribution, central moments are not defined.
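
For a concrete illustration of this definition, the integral can be approximated numerically. The following Python sketch assumes a standard normal density purely as an example; its central moments are known to be μ_0 = 1, μ_1 = 0, μ_2 = 1, μ_3 = 0 and μ_4 = 3.

  import numpy as np

  def central_moment(k, pdf, mean, lo=-12.0, hi=12.0, n=400001):
      # Approximate the k-th central moment, the integral of (x - mean)**k * pdf(x),
      # by a simple Riemann sum over a grid wide enough that the tails are negligible.
      x = np.linspace(lo, hi, n)
      dx = x[1] - x[0]
      return float(np.sum((x - mean) ** k * pdf(x)) * dx)

  def normal_pdf(x):
      # Standard normal density (an assumed example distribution, not from the article).
      return np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

  for k in range(5):
      print(k, round(central_moment(k, normal_pdf, 0.0), 6))
  # Prints approximately: 1.0, 0.0, 1.0, 0.0, 3.0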

The first few central moments have intuitive interpretations:
  • The "zeroth" central moment μ_0 is one.
  • The first central moment μ_1 is zero (not to be confused with the first moment itself, the expected value or mean).
  • The second central moment μ_2 is called the variance, and is usually denoted σ^2, where σ represents the standard deviation.
  • The third and fourth central moments are used to define the standardized moments, which are used to define skewness and kurtosis, respectively.
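
These interpretations can be checked empirically by replacing the expectation with a sample average. The short Python sketch below draws from an assumed normal distribution with standard deviation 2 (an illustrative choice), so the second central moment should come out close to σ^2 = 4.

  import numpy as np

  rng = np.random.default_rng(0)
  x = rng.normal(loc=5.0, scale=2.0, size=1_000_000)  # assumed example distribution
  m = x.mean()

  for k in (0, 1, 2):
      print(k, round(float(np.mean((x - m) ** k)), 4))
  # mu_0 = 1 exactly, mu_1 is 0 up to rounding, mu_2 is close to sigma^2 = 4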

Properties

The nth central moment is translation-invariant, i.e. for any random variable X and any constant c, we have

  μ_n(X + c) = μ_n(X).

For all n, the nth central moment is homogeneous of degree n:

  μ_n(cX) = c^n μ_n(X).

Only for n ≤ 3 do we have an additivity property for random variables X and Y that are independent:

  μ_n(X + Y) = μ_n(X) + μ_n(Y)   for n ∈ {1, 2, 3}.
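
A quick Monte Carlo check of these three properties, with an exponential and an independent normal variable chosen arbitrarily for illustration:

  import numpy as np

  rng = np.random.default_rng(1)
  N = 2_000_000

  def mu(sample, n):
      # n-th sample central moment
      return np.mean((sample - sample.mean()) ** n)

  X = rng.exponential(scale=1.5, size=N)       # assumed example variable
  Y = rng.normal(loc=2.0, scale=0.7, size=N)   # independent of X

  n, c = 3, 4.2
  print(mu(X + c, n), mu(X, n))                 # translation invariance: approximately equal
  print(mu(c * X, n), c ** n * mu(X, n))        # homogeneity of degree n: approximately equal
  print(mu(X + Y, n), mu(X, n) + mu(Y, n))      # additivity, which holds here because n <= 3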


A related functional that shares the translation-invariance and homogeneity properties with the nth central moment, but continues to have this additivity property even when n ≥ 4, is the nth cumulant κ_n(X). For n = 1, the nth cumulant is just the expected value; for n = 2 or 3, the nth cumulant is just the nth central moment; for n ≥ 4, the nth cumulant is an nth-degree monic polynomial in the first n moments (about zero), and is also a (simpler) nth-degree polynomial in the first n central moments.
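
For example, the fourth central moment itself is not additive, but the fourth cumulant written in terms of central moments, κ_4 = μ_4 − 3μ_2^2, is. The sketch below checks this with two independent exponential variables chosen purely for illustration:

  import numpy as np

  rng = np.random.default_rng(2)
  N = 4_000_000

  def mu(sample, n):
      # n-th sample central moment
      return np.mean((sample - sample.mean()) ** n)

  def k4(sample):
      # fourth cumulant expressed through central moments: k4 = mu_4 - 3 * mu_2**2
      return mu(sample, 4) - 3 * mu(sample, 2) ** 2

  X = rng.exponential(1.0, N)
  Y = rng.exponential(2.0, N)   # independent of X

  print(mu(X + Y, 4), mu(X, 4) + mu(Y, 4))   # clearly different: mu_4 is not additive
  print(k4(X + Y), k4(X) + k4(Y))            # approximately equal: the cumulant is additive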

Relation to moments about the origin

Sometimes it is convenient to convert moments about the origin to moments about the mean. The general equation for converting the nth-order moment about the origin to the moment about the mean is

  μ_n = E[(X − E[X])^n] = Σ_{j=0}^{n} (n choose j) (−1)^{n−j} μ'_j μ^{n−j},

where μ is the mean of the distribution, and the moment about the origin is given by

  μ'_j = E[X^j] = ∫_{−∞}^{∞} x^j f(x) dx.
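
A direct transcription of this conversion into Python (a sketch; the list raw is assumed to hold the moments about the origin μ'_0 = 1, μ'_1, ..., μ'_n of some distribution):

  from math import comb

  def central_from_raw(raw):
      # Convert raw moments [mu'_0, mu'_1, ..., mu'_n] (with mu'_0 = 1) to central moments
      # via mu_n = sum_j C(n, j) * (-1)**(n - j) * mu'_j * mu**(n - j), where mu = mu'_1.
      mu = raw[1]
      return [sum(comb(n, j) * (-1) ** (n - j) * raw[j] * mu ** (n - j)
                  for j in range(n + 1))
              for n in range(len(raw))]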


For the cases n = 2, 3, 4, which are of most interest because of the relations to variance, skewness, and kurtosis respectively, this formula becomes:

  μ_2 = μ'_2 − μ^2
  μ_3 = μ'_3 − 3μ μ'_2 + 2μ^3
  μ_4 = μ'_4 − 4μ μ'_3 + 6μ^2 μ'_2 − 3μ^4
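
As a quick check of these three cases, take a distribution whose raw moments are known; an exponential distribution with unit scale, for which μ'_n = n!, is assumed here purely for illustration:

  from math import factorial

  raw = [factorial(n) for n in range(5)]   # mu'_n = n! for the unit exponential
  mu = raw[1]

  mu2 = raw[2] - mu ** 2
  mu3 = raw[3] - 3 * mu * raw[2] + 2 * mu ** 3
  mu4 = raw[4] - 4 * mu * raw[3] + 6 * mu ** 2 * raw[2] - 3 * mu ** 4
  print(mu2, mu3, mu4)   # 1 2 9, the known central moments of the unit exponential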



Multivariate moments

For a continuous bivariate probability distribution with probability density function f(x, y), the (j, k) moment about the mean μ = (μ_X, μ_Y) is

  μ_{j,k} = E[(X − E[X])^j (Y − E[Y])^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − μ_X)^j (y − μ_Y)^k f(x, y) dx dy.
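
For instance, the (1, 1) central moment is the covariance of X and Y. A small sample-based sketch, with a bivariate normal distribution whose parameters are chosen purely for illustration:

  import numpy as np

  rng = np.random.default_rng(3)
  cov = [[2.0, 0.6], [0.6, 1.0]]   # assumed covariance matrix; its off-diagonal entry is Cov(X, Y)
  xy = rng.multivariate_normal([0.0, 3.0], cov, size=1_000_000)
  x, y = xy[:, 0], xy[:, 1]

  mu_11 = np.mean((x - x.mean()) * (y - y.mean()))   # the (j, k) = (1, 1) central moment
  print(mu_11)                                       # approximately 0.6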

See also

  • Image moment