Lower bounds for the divergence of orientational estimators
Published in: IEEE Transactions on Information Theory, 2001-09, Vol. 47 (6), p. 2490-2504
Format: Article
Language: English
Summary: This paper is concerned with the properties of estimators in O(n,p), the n × p orthogonal matrices. It is shown that it is natural to introduce the notion of a parallel estimator, where the expected value of the estimator must lie in the normal space (the orthogonal complement of the tangent space) of O(n,p) at the true value. An appropriate measure of variance, referred to as divergence, is introduced for a parallel estimator, and a Cramér-Rao (CR) type bound is then established for the divergence. The well-known Fisher-von Mises matrix distribution is often used to model random behavior on O(n,p) and depends on parameters Θ ∈ O(n,p) and H, a p × p symmetric matrix. The bound for this distribution is calculated for the case n = p = 3, and the divergence of the maximum-likelihood estimator (MLE) of Θ is estimated by simulation. The bound is shown to be tight over a wide range of H.
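The tangent/normal decomposition underlying the parallel-estimator notion can be illustrated numerically. The sketch below (not taken from the paper; function and variable names are ours) assumes the square case n = p = 3, where the tangent space of O(3) at Θ is {ΘA : A skew-symmetric} and the normal space is {ΘS : S symmetric}, so any matrix M splits uniquely into tangent and normal components at Θ:

```python
import numpy as np

def tangent_normal_split(theta, m):
    """Split m into its tangent and normal components at theta in O(3).

    Writing m = theta @ (skew + sym), the skew part gives the tangent
    component and the symmetric part gives the normal component.
    """
    g = theta.T @ m          # pull m back to the identity
    skew = (g - g.T) / 2     # skew-symmetric part -> tangent direction
    sym = (g + g.T) / 2      # symmetric part -> normal direction
    return theta @ skew, theta @ sym

rng = np.random.default_rng(0)
# Build a random orthogonal theta via QR (signs fixed for a proper factor).
q, r = np.linalg.qr(rng.standard_normal((3, 3)))
theta = q @ np.diag(np.sign(np.diag(r)))

m = rng.standard_normal((3, 3))
tan, nor = tangent_normal_split(theta, m)
assert np.allclose(tan + nor, m)                       # exact decomposition
assert np.allclose(theta.T @ tan, -(theta.T @ tan).T)  # tangent part is skew
assert np.allclose(theta.T @ nor, (theta.T @ nor).T)   # normal part is symmetric
```

A parallel estimator, in these terms, is one whose bias at the true Θ has no tangent component, i.e. its expected value lands in the normal space at Θ; the rectangular n × p (Stiefel-manifold) case treated in the paper generalizes this decomposition.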
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/18.945260