We review techniques for sensor fusion in robot navigation, emphasizing algorithms for self-location. These find use when the sensor suite of a mobile robot comprises several different sensors, some complementary and some redundant. Integrating the sensor readings, the robot seeks to accomplish tasks such as constructing a map of its environment, locating itself in that map, and recognizing objects that should be avoided or sought. Our review describes integration techniques in two categories: low-level fusion is used for direct integration of sensory data, resulting in parameter and state estimates; high-level fusion is used for indirect integration of sensory data in hierarchical architectures, through command arbitration and integration of control signals suggested by different modules. The review provides an arsenal of tools for addressing this (rather ill-posed) problem in machine intelligence, including Kalman filtering, rule-based techniques, behavior-based algorithms, and approaches that borrow from information theory, Dempster-Shafer reasoning, fuzzy logic, and neural networks. It points to several further-research needs, including: robustness of decision rules; simultaneous consideration of self-location, motion planning, motion control, and vehicle dynamics; the effect of sensor placement and attention focusing on sensor fusion; and adaptation of techniques from biological sensor fusion.
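As a minimal sketch of the low-level fusion idea surveyed above, the static (scalar) Kalman update combines two redundant, noisy readings of the same quantity into an estimate with lower variance than either sensor alone. The sensor names and numbers below are purely illustrative, not taken from the review:

```python
def kalman_fuse(z1, var1, z2, var2):
    """Fuse two redundant measurements of the same quantity
    via a scalar Kalman update.

    z1, z2     : sensor readings
    var1, var2 : their noise variances
    Returns (fused estimate, fused variance).
    """
    # Treat z1 as the prior and z2 as the measurement update.
    K = var1 / (var1 + var2)   # Kalman gain: weight on the second reading
    z = z1 + K * (z2 - z1)     # fused estimate (inverse-variance weighting)
    var = (1 - K) * var1       # fused variance, smaller than var1 and var2
    return z, var

# Hypothetical example: a noisy sonar (variance 0.09) and a precise
# laser rangefinder (variance 0.01) measuring the same wall distance.
z, var = kalman_fuse(2.10, 0.09, 2.00, 0.01)
# z = 2.01, var = 0.009 -- the fused variance is below both inputs
```

The same weighting generalizes to the full (vector, dynamic) Kalman filter used for state estimation in self-location; this scalar form only illustrates why redundant sensors reduce uncertainty.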