Sensor fusion for mobile robot navigation

Moshe Kam, Xiaoxun Zhu, Paul Kalata

Research output: Contribution to journal › Article › peer-review

167 Scopus citations

Abstract

We review techniques for sensor fusion in robot navigation, emphasizing algorithms for self-location. These find use when the sensor suite of a mobile robot comprises several different sensors, some complementary and some redundant. Integrating the sensor readings, the robot seeks to accomplish tasks such as constructing a map of its environment, locating itself in that map, and recognizing objects that should be avoided or sought. Our review describes integration techniques in two categories: low-level fusion is used for direct integration of sensory data, resulting in parameter and state estimates; high-level fusion is used for indirect integration of sensory data in hierarchical architectures, through command arbitration and integration of control signals suggested by different modules. The review provides an arsenal of tools for addressing this (rather ill-posed) problem in machine intelligence, including Kalman filtering, rule-based techniques, behavior-based algorithms, and approaches that borrow from information theory, Dempster-Shafer reasoning, fuzzy logic, and neural networks. It points to several further-research needs, including: robustness of decision rules; simultaneous consideration of self-location, motion planning, motion control, and vehicle dynamics; the effect of sensor placement and attention focusing on sensor fusion; and adaptation of techniques from biological sensor fusion.
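To make the low-level-fusion category concrete, here is a minimal illustrative sketch (not taken from the paper): two redundant range sensors are combined with a one-dimensional Kalman-style update, yielding a fused estimate whose variance is lower than either sensor's alone. The sensor readings and variances are made-up example values.

```python
# Low-level sensor fusion sketch: blend two redundant range readings
# with a scalar Kalman update. All numbers are hypothetical.

def fuse(estimate, variance, measurement, meas_variance):
    """One Kalman update step: blend a prior estimate with a new reading."""
    gain = variance / (variance + meas_variance)        # Kalman gain
    new_estimate = estimate + gain * (measurement - estimate)
    new_variance = (1.0 - gain) * variance              # variance shrinks
    return new_estimate, new_variance

# Start from sensor A's reading, then fold in sensor B's redundant one.
est, var = 10.2, 0.5                    # e.g. sonar: 10.2 m, variance 0.5
est, var = fuse(est, var, 9.8, 0.25)    # e.g. laser: 9.8 m, variance 0.25

print(round(est, 3), round(var, 3))     # → 9.933 0.167
```

The fused variance (0.167) is smaller than either input variance, which is the usual motivation for integrating redundant sensors at this level; the same update, applied in vector form, is the standard Kalman filter the abstract refers to.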

Original language: English (US)
Pages (from-to): 108-119
Number of pages: 12
Journal: Proceedings of the IEEE
Volume: 85
Issue number: 1
DOIs
State: Published - 1997
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
