Abstract:
The problem of information fusion appears in many forms in vision. Tasks such as motion estimation, multimodal registration, tracking, and robot localization often require the synergy of estimates coming from multiple sources. Most fusion algorithms, however, assume a single source model and are not robust to outliers. If the data to be fused follow different underlying models, the traditional algorithms produce poor estimates. We present in this paper a nonparametric approach to information fusion called variable-bandwidth density-based fusion (VBDF). The fusion estimator is computed as the location of the most significant mode of a density function, which takes into account the uncertainty of the estimates to be fused. A mode detection scheme is presented, which relies on variable-bandwidth mean shift computed at multiple scales. We show that the proposed estimator is consistent and conservative, while naturally handling outliers in the data and multiple source models. The new theory is tested for the task of multiple motion estimation. Numerous experiments validate the theory and provide very competitive results.
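To make the fusion procedure concrete, the following is a minimal sketch of the idea described in the abstract: each estimate contributes a Gaussian component whose bandwidth is its own covariance, and the fused value is a density mode found by variable-bandwidth mean shift, tracked down a scale pyramid from heavily smoothed to unsmoothed bandwidths. The function names (vb_mean_shift_mode, vbdf_fuse), the scale schedule, and the coarse-scale initialization are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def vb_mean_shift_mode(points, covs, x0, max_iter=100, tol=1e-8):
    """One variable-bandwidth mean-shift run: climb to the nearest density mode.

    points : (n, d) estimates to fuse
    covs   : (n, d, d) their uncertainty covariances (per-point bandwidths)
    x0     : (d,) starting location
    """
    inv_covs = np.linalg.inv(covs)                 # C_i^{-1}
    dets = np.linalg.det(covs)                     # |C_i|
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        diffs = points - x
        # Mahalanobis distances d^2(x, x_i, C_i)
        d2 = np.einsum('ij,ijk,ik->i', diffs, inv_covs, diffs)
        # Kernel weights w_i(x) proportional to |C_i|^{-1/2} exp(-d^2/2), normalized
        w = np.exp(-0.5 * d2) / np.sqrt(dets)
        w /= w.sum()
        # Data-weighted harmonic bandwidth H(x) = (sum_i w_i C_i^{-1})^{-1}
        H = np.linalg.inv(np.einsum('i,ijk->jk', w, inv_covs))
        # Mean-shift fixed-point update: x <- H(x) sum_i w_i C_i^{-1} x_i
        x_new = H @ np.einsum('i,ijk,ik->j', w, inv_covs, points)
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

def vbdf_fuse(points, covs, scales=(16.0, 8.0, 4.0, 2.0, 1.0, 0.0)):
    """Multiscale fusion sketch: inflate all bandwidths by alpha^2 I, find the
    mode, then track it down the scale pyramid until the extra smoothing is
    removed; the final location approximates the most significant mode."""
    d = points.shape[1]
    x = points.mean(axis=0)                        # coarse-scale start (assumption)
    for alpha in scales:
        inflated = covs + (alpha ** 2) * np.eye(d)
        x = vb_mean_shift_mode(points, inflated, x)
    return x
```

Starting at a large bandwidth makes the density effectively unimodal, so the mode found there can be followed as the smoothing shrinks; outlying estimates with inconsistent models receive negligible kernel weight near the dominant mode and therefore do not bias the fused result.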
Published in: 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2003. Proceedings.
Date of Conference: 18-20 June 2003
Date Added to IEEE Xplore: 15 July 2003
Print ISBN: 0-7695-1900-8
Print ISSN: 1063-6919