In this article, visual data obtained by a binocular active vision system is integrated with ultrasonic range measurements to develop an obstacle detection and avoidance system based on a connectionist grid. This grid-based framework allows the integration of two different sensing modalities (vision and sonar) in such a way that incoming sensory data can be mutually enhanced and validated. Each grid node maps a configuration in a discrete subset of the robot's configuration space, and detected obstacles result in sets of restricted configurations. Using the grid data, a potential field is iteratively computed for each mapped configuration. This computation is done in real time, during the robot's motion, so the potential field changes as new data is integrated into the grid. The authors also propose the implementation of several innovative sensing and control strategies that exploit the integrated sensory information.
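The abstract does not give the authors' update rule, but the idea of iteratively computing a potential field over a grid of configurations, with detected obstacles marked as restricted, can be sketched with a standard harmonic-potential relaxation. Everything below (the function name `relax_potential`, the 4-neighbour averaging, the clamp values) is an illustrative assumption, not the paper's actual method:

```python
import numpy as np

def relax_potential(occupied, goal, iters=500):
    """Iteratively relax a harmonic potential field on a configuration grid.

    occupied : boolean array, True where a configuration is restricted
    goal     : (row, col) index of the goal configuration

    Obstacle cells are clamped to 1.0 (repulsive) and the goal to 0.0;
    each free cell converges toward the average of its 4-neighbours, so
    descending the resulting field steers around the obstacles. Re-running
    after updating `occupied` mimics refreshing the field as new sensor
    data is integrated into the grid.
    """
    u = np.ones_like(occupied, dtype=float)
    u[goal] = 0.0
    for _ in range(iters):
        # average of the 4-neighbours (edges padded with their own values)
        p = np.pad(u, 1, mode="edge")
        u = 0.25 * (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:])
        u[occupied] = 1.0   # re-clamp restricted configurations
        u[goal] = 0.0       # re-clamp the goal
    return u

if __name__ == "__main__":
    occ = np.zeros((10, 10), dtype=bool)
    occ[3:7, 5] = True                       # a wall of restricted cells
    field = relax_potential(occ, goal=(5, 9))
    print(field.round(2))                    # lower values lead to the goal
```

In a real-time setting, the loop would be interleaved with sensor updates rather than run to convergence, so the field tracks the changing grid contents.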