Abstract
Autonomous vehicles are increasingly used in applications such as elderly care, medical care, the military, schools, and space exploration. In this research, we introduce a framework for achieving autonomy in a vehicle and describe the use of sensor maps to manage the vehicle's sensor space. Complete coverage planning is a technique for achieving maximum coverage of a space during navigation, and path planning, a key component of autonomous navigation, is achieved through several standard algorithms. We formulate a sensor fusion framework that builds on complete coverage planning and extends it with human interaction. Uncertainty is a common problem in robotic vehicle navigation; it is handled here by including the human operator during the vehicle's traversal, forming a closed loop between the robot and the operator. The improved framework also directs the vehicle to revisit cells previously left uncovered because of the presence of an obstacle, providing the user with more detailed terrain data, such as the locations of trees and ravines and their spread across the cells of the field. Robot cognition is achieved through sensor map structures, which closely resemble the Penfield maps, or sensory brain maps, that are key concepts in human cognition; sensor management and sensor data processing are handled through these maps. The two main implementations in this research are robot navigation of a space (i.e., a field) achieving maximum coverage with human interaction, and a cognitive sensor model of the vehicle analogous to the brain map. The framework incorporates signal, feature, and decision fusion between the stages, allowing the vehicle to handle all cases of obstacle presence and to initiate user help only when the fusion stages identify an uncertain confidence level.
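The coverage-with-revisit and confidence-gated operator-help ideas summarized above can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation: the boustrophedon traversal order, the `plan_coverage` and `needs_operator` names, and the 0.6 confidence threshold are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of the abstract's two mechanisms:
# (1) cover a grid field back and forth, skipping obstacle cells but
#     recording them for a later revisit/inspection pass;
# (2) ask the human operator only when fused sensor confidence is low.
# Names and the 0.6 threshold are illustrative, not from the paper.

def plan_coverage(grid):
    """Return (visited, blocked): free cells in boustrophedon order,
    plus obstacle cells left for the revisit pass."""
    visited, blocked = [], []
    for r, row in enumerate(grid):
        cols = range(len(row)) if r % 2 == 0 else reversed(range(len(row)))
        for c in cols:
            (blocked if row[c] else visited).append((r, c))
    return visited, blocked

def needs_operator(confidence, threshold=0.6):
    """Fusion stages report an uncertain result -> defer to the human."""
    return confidence < threshold

field = [
    [0, 0, 0],
    [0, 1, 0],  # 1 marks an obstacle cell (e.g. a tree or ravine edge)
    [0, 0, 0],
]
visited, blocked = plan_coverage(field)
```

On the example field, the planner covers the eight free cells and sets aside the single obstacle cell `(1, 1)` for the revisit pass, while `needs_operator` closes the loop with the operator only for low-confidence fusion results.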