Consistency of Estimation
Besides accuracy, consistency and information content are important properties of an estimator. It is crucial that an estimator provide realistic information about the possible estimation error, especially in fusion algorithms, where several estimates from different sources are merged into one. In this thesis, consistency is defined in several ways and methods for its evaluation are introduced. Mean squared deviation consistency is based on Chebyshev's inequality, which gives a lower bound on the probability mass concentrated around the mean of a random variable. P consistency requires that the concentration ellipse around the mean of the estimate containing probability mass p also contain the actual value of the estimated parameter with probability p. Normalized deviation squared consistency requires that, for every probability p, the concentration ellipse with probability mass p around the mean of the estimate contain the actual value of the estimated parameter with probability p. Information content is defined in terms of the most informative estimate, i.e., the estimate with the highest precision that is still consistent. In this work, the statistical hypothesis testing framework is used to evaluate consistency and information content. Hypothesis tests for consistency evaluation are derived both for static parameter estimation and for state estimation (filtering). The proposed consistency tests are applied to an indoor WiFi localization system in order to investigate sources of inconsistency and to adjust the parameters of the system in off-line mode. It is shown that underestimated measurement noise is the main cause of inconsistent estimates; however, considerably underestimated process noise or motion mis-modeling may also result in inconsistent and abnormal filter behavior.