Abstract
Intelligent robots are increasingly used in unstructured and unknown environments rather than being limited to well-controlled settings. The key to successful autonomous robot operation in such environments is to combine robot manipulation and perception synergistically, such that perception guides robot motion, and the robot motion in turn enables better perception.

This dissertation first addresses how to perceive an unknown object effectively and efficiently in an unknown environment through contact interaction between the robot and the object. The novel approaches introduced efficiently generate continuum wraps around unknown objects based on contact interaction and use the resulting robot shape to capture object shape information for effective object classification, recognition, and shape estimation. Experimental results also demonstrate that object classification can be achieved through simulation-to-real-world transferable learning.

This dissertation further considers appearance-based object modeling in cluttered environments. Leveraging flexible continuum manipulation, an approach is introduced to plan robot motion that positions a tip camera at suitable spots around the target object to take RGBD images and register them to build and extend the object's 3D model progressively, while avoiding obstacles in unknown and cluttered environments.

This dissertation also addresses how to achieve more flexible and autonomous robotic manipulation based on perception. A real-time adaptive motion planning approach is introduced to enable automatic conflict resolution between task constraints and obstacle avoidance based on real-time visual sensing. More natural robot motion that seamlessly switches between task-constrained and non-task-constrained modes is achieved for improved motion adaptiveness in dynamic, unknown environments.

Last but not least, pose uncertainty reduction under complex contacts for fine manipulation is also investigated.
A novel force forecast approach that relates real-world force sensing to a simulated world to enable pose uncertainty reduction is introduced. This approach requires neither knowing contact locations nor predefining any contact types, and it can be directly applied to reduce pose uncertainty in real-world contact-rich assembly tasks.