Abstract
Autonomous robots are being designed for and deployed in a wide variety of environments, often in unforeseen ways. Intelligent interpretation of sensor data is the key factor in a robot's decision-making ability while operating in such environments. Present-day robots are also expected to adapt, interact with humans, and accomplish more tasks than ever before. This work focuses on developing a sensor fusion framework and achieving human-robot interaction for a robot navigating unknown terrain. The selective sensor fusion framework proposed in this work combines heterogeneous data from sensors to analyze the parameters of the environment required to accomplish a task. It is scalable, modular, and accounts for human interaction. The framework applies fusion algorithms and develops confidence at various stages before arriving at a decision. The framework is simulated and tested on a robotic vehicle, the TurtleBot, equipped with additional LIDAR (Light Detection and Ranging) sensors on a simulated, unknown terrain. The vehicle navigates the terrain successfully, avoiding obstacles and terrain irregularities while querying the user for assistance whenever it enters an uncertain mode. The framework is designed to accommodate higher levels of decision making before committing to an action, and it can be customized to the application.
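
To make the decision loop described above concrete, the sketch below shows one way a confidence-gated fusion step with a human-in-the-loop fallback might look. This is a minimal illustration, not the framework's actual implementation: every identifier (Reading, fuse_readings, ask_operator, CONFIDENCE_THRESHOLD) is a hypothetical stand-in, and the confidence-weighted vote is an assumed fusion rule chosen for brevity.

    # Hedged sketch of a confidence-gated decision loop with a human fallback.
    # All names and the weighted-vote fusion rule are illustrative assumptions,
    # not identifiers or algorithms taken from the framework itself.
    from dataclasses import dataclass

    CONFIDENCE_THRESHOLD = 0.7  # assumed cutoff below which the robot asks for help

    @dataclass
    class Reading:
        value: str         # e.g. "obstacle_ahead" or "clear"
        confidence: float  # per-sensor confidence in [0, 1]

    def fuse_readings(readings: list[Reading]) -> Reading:
        """Combine heterogeneous sensor readings by a confidence-weighted vote."""
        scores: dict[str, float] = {}
        for r in readings:
            scores[r.value] = scores.get(r.value, 0.0) + r.confidence
        best = max(scores, key=scores.get)
        total = sum(scores.values())
        # Normalize the winning score so it behaves like an overall confidence.
        return Reading(best, scores[best] / total if total else 0.0)

    def ask_operator(fused: Reading) -> str:
        """Stand-in for the framework's uncertain-mode query to the user."""
        answer = input(f"Low confidence ({fused.confidence:.2f}) on "
                       f"'{fused.value}'. Proceed? [y/n] ")
        return fused.value if answer.strip().lower() == "y" else "stop"

    def decide(readings: list[Reading]) -> str:
        """Act autonomously when confident; otherwise defer to the human."""
        fused = fuse_readings(readings)
        if fused.confidence >= CONFIDENCE_THRESHOLD:
            return fused.value
        return ask_operator(fused)

    if __name__ == "__main__":
        demo = [Reading("obstacle_ahead", 0.6),
                Reading("clear", 0.5),
                Reading("obstacle_ahead", 0.4)]
        print(decide(demo))

In this toy run the two "obstacle_ahead" readings outweigh the single "clear" reading, but the normalized confidence (about 0.67) falls below the assumed threshold, so the robot queries the operator, mirroring the uncertain-mode behavior the abstract describes.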