Multi-view object pose distribution tracking
The ability to track the 6D pose distribution of an object while a mobile manipulator robot is still approaching it enables the robot to pre-plan grasps that combine base and arm motion. However, tracking a 6D object pose distribution from a distance is challenging due to the limited view of the robot camera. In this work, we present a framework that fuses observations from external stationary cameras with those from the moving robot camera and sequentially tracks the object pose distribution over time, enabling 6D pose distribution tracking from a distance. We model the object pose posterior as a multi-modal distribution, which yields better robustness to the uncertainties introduced by large camera-object distances, occlusions, and object geometry.
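To illustrate the idea of fusing observations from several cameras while keeping a multi-modal pose posterior, here is a minimal, hypothetical sketch (not the paper's actual method): a particle set over simplified planar poses (x, y, yaw), where per-camera Gaussian position likelihoods with camera-specific noise are multiplied in log space, so a distant, noisier robot camera naturally contributes less than a nearby stationary one. All function names and noise values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_camera_likelihoods(particles, cameras):
    """Multiply per-camera Gaussian position likelihoods for each pose particle.

    particles: (N, 3) array of hypothetical planar poses (x, y, yaw).
    cameras: list of (observed_xy, noise_std) pairs, one per camera;
             a larger noise_std models a more distant / less certain view.
    """
    log_w = np.zeros(len(particles))
    for obs_xy, noise_std in cameras:
        sq_dist = np.sum((particles[:, :2] - obs_xy) ** 2, axis=1)
        log_w += -0.5 * sq_dist / noise_std ** 2  # log of a Gaussian likelihood
    w = np.exp(log_w - log_w.max())               # stabilize before normalizing
    return w / w.sum()

def resample(particles, weights):
    """Standard multinomial resampling step of a particle filter."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Bimodal yaw hypothesis, e.g. from a symmetric object seen from far away:
particles = np.vstack([
    rng.normal([0.0, 0.0, 0.0], 0.05, size=(500, 3)),
    rng.normal([0.0, 0.0, np.pi], 0.05, size=(500, 3)),
])

cameras = [
    (np.array([0.02, -0.01]), 0.05),  # external stationary camera (close, precise)
    (np.array([-0.01, 0.03]), 0.15),  # moving robot camera (distant, noisier)
]

weights = fuse_camera_likelihoods(particles, cameras)
particles = resample(particles, weights)
mean_xy = particles[:, :2].mean(axis=0)
```

Because the position likelihoods say nothing about yaw, both yaw modes survive the update, which is the point of keeping a multi-modal posterior rather than collapsing to a single pose estimate prematurely.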
See our ICRA 2022 paper for more details.
This research was supported by the Innovation Fund Denmark in the context of the FacilityCobot project.
More qualitative results on the Multi-View YCB Object Pose Tracking Dataset for Mobile Manipulation