Kalman filter
The Kalman filter is basically a curve-fitting technique; as a result, it is neither an MMSE estimator nor an MVUE.
Hikenstuff
I would like input on the following:

Kalman’s estimator, the genesis of what is now commonly referred to as the Kalman filter, is mischaracterized by Kalman himself (and likewise the Kalman filter by others) as a minimum mean-square error (MMSE) estimator, although Kalman describes it in terms of minimizing a quadratic loss function [1]. As is well known in statistics, the true mean-square error (MSE) comprises the variance plus the bias squared. However, the Kalman filter is specifically designed not to allow such a bias; consequently, it is also often mislabeled as a minimum-variance unbiased estimator (MVUE). Several examples show that the Kalman filter is, in fact, neither the MMSE estimator nor the MVUE that it is universally asserted to be [2].
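To make the variance-plus-bias-squared decomposition concrete, here is a quick numerical check in Python (my own illustrative sketch, not taken from the references): a deliberately biased estimator of a Gaussian mean is simulated, and its Monte Carlo MSE is compared against variance plus bias squared.

    import numpy as np

    # Illustrative sketch only: MSE = variance + bias^2.
    # A shrunk (hence biased) estimator of a Gaussian mean is used so that
    # both terms of the decomposition are nonzero.
    rng = np.random.default_rng(0)
    true_mean, sigma, n, trials = 5.0, 2.0, 10, 200_000

    samples = rng.normal(true_mean, sigma, size=(trials, n))
    estimates = 0.8 * samples.mean(axis=1)   # shrinkage factor 0.8 introduces bias

    bias = estimates.mean() - true_mean
    variance = estimates.var()
    mse = np.mean((estimates - true_mean) ** 2)

    print(f"bias^2 + variance = {bias**2 + variance:.4f}")
    print(f"MSE               = {mse:.4f}")  # agrees up to Monte Carlo error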

As acknowledged by Kalman in his Example 2, the Kalman filter is fundamentally a curve-fitting technique. By minimizing the variance, the Kalman filter in essence minimizes the statistical deviations of the measurement noise around the state equation, so that the state equation effectively becomes an "average curve". This is analogous to (and, in the absence of process noise, equivalent to [3]) Gauss' deterministic method of fitting a polynomial to data as an "average curve".
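Here is a minimal Python sketch of that equivalence (my own illustration, not code from [2] or [3]): with a constant-velocity state model, zero process noise, and a very diffuse prior, the Kalman filter's final state estimate coincides with the ordinary least-squares straight-line fit to the same measurements.

    import numpy as np

    rng = np.random.default_rng(1)
    dt, n, meas_var = 1.0, 20, 4.0
    t = np.arange(n) * dt
    truth = 2.0 + 3.0 * t                      # true position: a straight line
    z = truth + rng.normal(0.0, np.sqrt(meas_var), n)

    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity transition
    H = np.array([[1.0, 0.0]])                 # position-only measurements
    R = np.array([[meas_var]])

    x = np.zeros((2, 1))                       # state estimate [position, velocity]
    P = np.eye(2) * 1e9                        # diffuse prior

    for zk in z:
        # predict (Q = 0: no process noise, i.e. pure curve fitting)
        x = F @ x
        P = F @ P @ F.T
        # update with the new measurement
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[zk]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P

    # Batch least-squares line fit, evaluated at the last sample time
    b, a = np.polyfit(t, z, 1)                 # slope, intercept
    print("Kalman position/velocity at t[-1]:", x.ravel())
    print("Least-squares fit at t[-1]:       ", a + b * t[-1], b)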

On the other hand, it is the individual state variable estimates that are usually of primary importance, with estimated position at a given point in time generally being the most important in tracking. Yet generating a Kalman filter "average curve" does not optimize any of the individual state variables.

In the Kalman filter, all the state variables are recursively estimated jointly, as a dependent set, in computing the state-equation "average curve". However, for any state variable estimate to be truly optimized, it must be estimated independently of all the others by minimizing that variable's individual MSE, comprising both variance and bias squared. This is demonstrated by both theory and example in [2], where the true MMSE is derived for each of the tracking parameters position, velocity, and acceleration, corresponding to the 1st-, 2nd-, and 3rd-order Kalman filter state variables. (Higher-order derivatives such as jerk, yank, snatch, etc. can also be estimated.) Each parameter is individually expanded in terms of a set of orthogonal polynomials, which allows a bias-squared term to be included in each parameter's MSE, rather than in terms of a Kalman state equation, which provides neither an individual MSE for each parameter nor a bias-squared term within it.
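For concreteness, here is a rough Python sketch of the general orthogonal-polynomial idea only; it is not the optimal estimator derived in [2]. Noisy position measurements are fit with a Legendre series, and position, velocity, and acceleration are read off from the fitted series and its derivatives; the chosen expansion order is the knob that trades bias against variance for each parameter.

    import numpy as np
    from numpy.polynomial import legendre as leg

    rng = np.random.default_rng(2)
    n, dt = 25, 0.5
    t = np.arange(n) * dt
    truth_pos = 1.0 + 2.0 * t + 0.5 * 0.3 * t**2      # constant-acceleration track
    z = truth_pos + rng.normal(0.0, 1.0, n)

    # Map time onto [-1, 1], the natural domain of the Legendre polynomials.
    u = 2.0 * (t - t[0]) / (t[-1] - t[0]) - 1.0
    du_dt = 2.0 / (t[-1] - t[0])

    coef = leg.legfit(u, z, deg=2)                     # orthogonal-polynomial fit

    pos = leg.legval(u[-1], coef)
    vel = leg.legval(u[-1], leg.legder(coef, 1)) * du_dt      # chain rule for rescaling
    acc = leg.legval(u[-1], leg.legder(coef, 2)) * du_dt**2

    print(f"position ~ {pos:.2f}, velocity ~ {vel:.2f}, acceleration ~ {acc:.2f}")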

Please see reference [2] below for further info: http://site.infowest.com/personal/j/jeffbell/ANewLook.pdf


[1] Kalman, R.E. (1960). "A new approach to linear filtering and prediction problems". Journal of Basic Engineering 82 (1): 35–45. http://www.elo.utfsm.cl/~ipd481/Papers%20varios/kalman1960.pdf. Retrieved 2008-05-03.

[2] Bell, J.W., A New Look At Kalman Filter Tracking From The Perspective Of A Novel And Truly Optimal Non-Recursive Tracking Estimator. http://site.infowest.com/personal/j/jeffbell/ANewLook.pdf

[3] Brookner, E., Tracking and Kalman Filtering Made Easy, Wiley, New York, 1998.