Periodic Reporting for period 1 - MANiBOT (Advancing the physical intelligence and performance of roBOTs towards human-like bi-manual objects MANipulation)
Reporting period: 2023-11-01 to 2024-10-31
The MANiBOT project aims to research and develop a bimanual mobile robotic platform that addresses the challenges of human-like bimanual object manipulation through advanced perception, control and cognition methods and novel mechatronics. New multimodal environment-understanding and object/pose-recognition methods are developed based on real-time adaptive fusion of vision, proximity and tactile sensing, enabling fast, safe and efficient manipulation of diverse objects of various sizes, shapes, materials and rigidity, which are not known or modelled a priori, in diverse human-populated environments. Object handling is performed through a novel suite of manipulation primitives combined with bimanual manipulation, aiming at performance close to that of humans even under significant spatial constraints. Non-prehensile primitives (e.g. push, pitch, drag), which exploit supporting surfaces in the environment, enable the manipulation of heavy or cluttered items in an energy-efficient way. An advanced multi-level robot cognitive framework orchestrates the above sensing and actuation methods towards efficient and trustworthy behavior that allows learning, composing and swiftly adapting complex manipulations. The innovative methods of MANiBOT are coupled with novel cognitive mechatronics based on tactile and proximity sensors, integrated within a state-of-the-art bimanual mobile manipulation robot optimized for energy efficiency and increased autonomy, and including HRI capabilities for trustworthy and efficient operation.
The above capabilities of the MANiBOT system could have a tremendous impact in major sectors of industry and services, from logistics and transport to retail, agri-food and manufacturing, where MANiBOT technologies could add value and drastically boost robot utilization. MANiBOT’s use cases, shelf restocking in supermarkets and baggage handling in airports, are indicative of the wide range of potential application areas of the proposed technologies, since they entail very diverse handling tasks.
For more information on the MANiBOT project, please visit: https://manibot-project.eu/
Requirements and specifications: The state-of-the-art (SoA) analysis was performed, the use cases were analysed and established, and the MANiBOT architecture was defined.
Robot perception: Initial methods for object detection and pose estimation were developed. Deep learning networks for the extraction of manipulation affordances were explored and promising enhancements were made, while initial experiments on cluttered-environment understanding showed that the developed methods outperform current SoA approaches. Transfer learning was implemented towards accurate tactile inference, and a novel deep learning method was developed for human detection using proximity sensors. Work has also started on showing that federated-learning models can be made thoroughly explainable.
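The report does not detail the adaptive fusion scheme behind the multimodal perception. As a minimal, hypothetical sketch (not the project's actual method), per-modality estimates from vision, proximity and tactile sensing can be combined with inverse-variance weights, so that a modality judged noisier at a given instant contributes less to the fused result:

```python
import numpy as np

def fuse_estimates(estimates, variances):
    """Inverse-variance weighted fusion of per-modality estimates.

    estimates: list of arrays, one estimate per modality
               (e.g. vision, proximity, tactile).
    variances: list of scalar noise-variance estimates; a noisier
               modality receives a smaller fusion weight.
    Returns the fused estimate and the normalized weights.
    """
    weights = np.array([1.0 / v for v in variances])
    weights /= weights.sum()  # normalize so weights sum to 1
    fused = sum(w * np.asarray(e) for w, e in zip(weights, estimates))
    return fused, weights
```

Making the variances time-varying (e.g. re-estimated per frame) is what would make such a scheme "adaptive" in the sense used above.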
Control and navigation: Initial high-accuracy human-aware navigation methods have been established. The online generation of joint reference velocities for the execution of prioritized bimanual tasks by a mobile bimanual robot was explored. Work focused on control design for synchronized reaching and establishment of initial contacts, and on reinforcement learning of sequences of pushing motions for grasping from clutter. The coupling control module was developed and tested in simulation.
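The exact prioritized-task formulation is not given in the report. A standard two-priority redundancy-resolution scheme based on null-space projection (a generic sketch, not the project's controller) generates joint velocities that execute a secondary task only insofar as it does not disturb the primary one:

```python
import numpy as np

def prioritized_joint_velocities(J1, x1_dot, J2, x2_dot):
    """Two-priority resolution of task-space velocities to joint velocities.

    The primary task (J1, x1_dot) is tracked exactly (in the least-squares
    sense); the secondary task (J2, x2_dot) is executed in the null space
    of the primary task, so it cannot perturb it.
    """
    n = J1.shape[1]
    J1_pinv = np.linalg.pinv(J1)
    q_dot = J1_pinv @ x1_dot                # primary task contribution
    N1 = np.eye(n) - J1_pinv @ J1           # null-space projector of task 1
    J2N = J2 @ N1
    # secondary task, corrected for what the primary motion already achieves
    q_dot = q_dot + np.linalg.pinv(J2N) @ (x2_dot - J2 @ q_dot)
    return q_dot
```

For a redundant arm (more joints than task dimensions), both tasks are generically achieved; when the null space is too small, the secondary task is realized only in a least-squares sense.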
Robot cognition and HRI: Semantic scene-graph representations for task planning were investigated, language-driven affordance extraction was studied, and a task-graph learning approach from human demonstrations was implemented. Additionally, scientific contributions were made on active perceptive mobile manipulation and on learning affordances in cluttered 3D scenes. Mock-ups for the HRI were designed, and initial development of the user interface has started.
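The structure of the learned task graph is not described in the report. One minimal, hypothetical form of task-graph learning from demonstrations is to count action transitions across demonstrated sequences, yielding a weighted directed graph of which actions follow which:

```python
from collections import defaultdict

def learn_task_graph(demonstrations):
    """Build a simple task graph from demonstrated action sequences.

    demonstrations: list of action-name sequences, e.g.
        [["approach", "grasp", "lift"],
         ["approach", "push", "grasp", "lift"]]
    Returns an adjacency map: action -> {next_action: transition_count}.
    """
    graph = defaultdict(lambda: defaultdict(int))
    for demo in demonstrations:
        # count each consecutive action pair as one observed transition
        for a, b in zip(demo, demo[1:]):
            graph[a][b] += 1
    return {a: dict(nexts) for a, nexts in graph.items()}
```

Normalizing the counts per source action would turn the graph into transition probabilities usable by a task planner.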
Mechatronics: Preliminary designs of the bimanual mobile robotic platform have been produced, taking into account the MANiBOT use-case requirements and technical specifications. Initial development and testing of the proximity and tactile sensors have been performed. The integration plan has been established, and the infrastructure for iterative software integration, based on a Git environment, has been set up.
Testing and validation: An initial testing and demonstration plan has been established, while preparation of the lab environment set-up has begun, taking into account the pilot sites’ specifications.
• Literature survey of SoA relevant to MANiBOT. Identification of user requirements and extraction of the detailed technical specifications and overall architecture.
• Development of novel vision-based methods for object detection and environment understanding. The methods require further improvement and testing, but initial experiments showed their capability to handle highly cluttered environments and the absence of prior object models, indicating the great potential impact they could have in several robotic applications.
• Development of human detection method based on proximity sensors towards increased safety in HRI.
• Development of transfer learning methods towards improved tactile inference accuracy.
• Development of initial high-accuracy human-aware navigation methods, with high potential impact for the uptake of robots in human-populated environments.
• Development of novel methods on bimanual manipulation and the use of non-prehensile manipulation primitives, focusing on object rotation and the use of pushing motions for grasping from clutter.
• Development of robot cognition methods on active perceptive mobile manipulation and learning affordances in cluttered 3D scenes towards autonomous robot operation in dynamic environments.
• Design of novel mechatronics. Novel high-resolution tactile sensor using optical fibres is under development. The first scalable proximity module prototype has been developed.