In probability theory, the law of total variance or variance decomposition formula states that if X and Y are random variables on the same probability space, and the variance of Y is finite, then

    Var(Y) = E(Var(Y | X)) + Var(E(Y | X)).
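As a quick numerical sanity check on this identity, here is a short Monte Carlo simulation. The particular model (X uniform on (0, 1), and Y conditionally normal given X with mean 2X and standard deviation 1 + X) is an illustrative assumption, not part of the statement:

```python
import numpy as np

# Illustrative model: X ~ Uniform(0, 1); given X = x, Y ~ Normal(2x, (1 + x)^2).
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, n)
y = rng.normal(loc=2 * x, scale=1 + x)

# Left-hand side: total variance of Y.
total_var = y.var()

# Right-hand side, using the known conditional moments of the model:
# E(Y | X) = 2X  and  Var(Y | X) = (1 + X)^2.
expected_cond_var = ((1 + x) ** 2).mean()   # E(Var(Y | X))
var_cond_mean = (2 * x).var()               # Var(E(Y | X))

print(total_var, expected_cond_var + var_cond_mean)
```

For this model both sides equal E((1 + X)^2) + Var(2X) = 7/3 + 1/3 = 8/3 exactly, and the two printed estimates agree to Monte Carlo accuracy.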
In language perhaps better known to statisticians than to probabilists, the two terms are the "unexplained" and the "explained" components of the variance (cf. fraction of variance unexplained, explained variation). The nomenclature in this article's title parallels the phrase law of total probability. Some writers on probability call this the "conditional variance formula" or use other names.
Note that the conditional expected value E(Y | X) is a random variable in its own right, whose value depends on the value of X. Notice that the conditional expected value of Y given the event X = x is a function of x (this is where adherence to the conventional, rigidly case-sensitive notation of probability theory becomes important!). If we write E(Y | X = x) = g(x), then the random variable E(Y | X) is just g(X). Similar comments apply to the conditional variance Var(Y | X).
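To make the point concrete, here is a small discrete example (the joint distribution below is an illustrative assumption, not from the text): g(x) = E(Y | X = x) is an ordinary function of the number x, while E(Y | X) = g(X) is a random variable taking the value g(x) with probability P(X = x).

```python
# Hypothetical joint pmf p(x, y) on a small finite support.
pmf = {
    (0, 0): 0.1, (0, 1): 0.3,
    (1, 0): 0.2, (1, 1): 0.4,
}

def marginal_x(x):
    """P(X = x)."""
    return sum(p for (xi, _), p in pmf.items() if xi == x)

def g(x):
    """g(x) = E(Y | X = x): an ordinary function of the number x."""
    return sum(yi * p for (xi, yi), p in pmf.items() if xi == x) / marginal_x(x)

# The random variable E(Y | X) = g(X) takes the value g(x) with
# probability P(X = x); here is its distribution.
cond_exp_dist = {x: (g(x), marginal_x(x)) for x in (0, 1)}
print(cond_exp_dist)

# Law of total expectation check: E(g(X)) = E(Y).
e_y = sum(yi * p for (_, yi), p in pmf.items())
e_gx = sum(val * px for val, px in cond_exp_dist.values())
print(e_gx, e_y)
```

Averaging g(X) over the distribution of X recovers E(Y), which is the law of total expectation used in the proof below.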
The law of total variance can be proved using the law of total expectation. First,

    Var(Y) = E(Y²) − (E(Y))²

from the definition of variance. Then we apply the law of total expectation to each term by conditioning on the random variable X:

    Var(Y) = E(E(Y² | X)) − (E(E(Y | X)))².

Now we rewrite the conditional second moment of Y in terms of its variance and first moment:

    Var(Y) = E(Var(Y | X) + (E(Y | X))²) − (E(E(Y | X)))².

Since the expectation of a sum is the sum of the expectations, we can now regroup the terms:

    Var(Y) = E(Var(Y | X)) + (E((E(Y | X))²) − (E(E(Y | X)))²).

Finally, we recognize the terms in parentheses as the variance of the conditional expectation E(Y | X):

    Var(Y) = E(Var(Y | X)) + Var(E(Y | X)).
The square of the correlation
In cases where (Y, X) are such that the conditional expected value is linear, i.e., in cases where

    E(Y | X) = a + bX,

it follows from the bilinearity of Cov(·, ·) that

    b = Cov(Y, X) / Var(X),

and the explained component of the variance divided by the total variance is just the square of the correlation between Y and X; i.e., in such cases,

    Var(E(Y | X)) / Var(Y) = Corr(Y, X)².

One example of this situation is when (Y, X) have a bivariate normal (Gaussian) distribution.
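This can be checked by simulation in the bivariate normal case, where (with zero means) E(Y | X) = ρ·(σ_Y/σ_X)·X; the particular standard deviations and correlation below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
rho, sd_x, sd_y = 0.6, 2.0, 3.0   # illustrative parameters

# Sample (X, Y) from a bivariate normal with correlation rho.
cov = [[sd_x**2, rho * sd_x * sd_y],
       [rho * sd_x * sd_y, sd_y**2]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T

# For the bivariate normal (zero means), E(Y | X) = rho * (sd_y / sd_x) * X,
# so the explained variance is Var(E(Y | X)) = rho^2 * sd_y^2.
explained = (rho * (sd_y / sd_x) * x).var()
ratio = explained / y.var()

print(ratio, rho**2)   # the two agree to Monte Carlo accuracy
```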
A similar law for the third central moment μ₃ says

    μ₃(Y) = E(μ₃(Y | X)) + μ₃(E(Y | X)) + 3 Cov(E(Y | X), Var(Y | X)).
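A Monte Carlo sketch of the third-moment law, under an assumed model (X uniform on (0.5, 1.5), and Y conditionally exponential with mean X), using the closed-form conditional moments of the exponential distribution on the right-hand side:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000
x = rng.uniform(0.5, 1.5, n)
y = rng.exponential(scale=x)   # given X = x, Y ~ Exponential(mean x)

def mu3(a):
    """Third central moment of a sample."""
    return ((a - a.mean()) ** 3).mean()

# Conditional moments of Exponential(mean x):
# E(Y | X) = X,  Var(Y | X) = X^2,  mu3(Y | X) = 2 X^3.
lhs = mu3(y)
rhs = (2 * x**3).mean() + mu3(x) + 3 * np.cov(x, x**2)[0, 1]

print(lhs, rhs)   # both sides agree to Monte Carlo accuracy
```

For this model the right-hand side is 2·E(X³) + 0 + 3·Cov(X, X²) = 2.5 + 0.5 = 3 exactly.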
For higher cumulants, a simple and elegant generalization exists; see the law of total cumulance.