Periodic Reporting for period 1 - VirtualGrasp (Speeding up the virtual reality revolution with realistic & real-time animation of hand-to-object interaction)
Reporting period: 2020-07-01 to 2021-12-31
Beyond the fact that this is an immense effort in terms of resources and cost for each new client and application, any update, such as a change of the environment, the task, or a simple piece of logic, requires another engagement of the integrator to modify the system. Making the programming of robots much easier is thus a necessary requirement for more rapid adoption of robotic automation, especially for smaller industries without the resources to employ integrators. One way to achieve this is through automated or semi-automated solutions that can be deployed without deep expertise in programming or robotics.
There are many components involved in making robots smarter and easier to maintain (such as machine vision tools, robot navigation software, or language processing systems, to give only a few examples). However, especially in the sectors of factory automation, Industry 4.0 and the processing industry, a fundamental skill is grasping. The common way to make a robot interact with an object is for a programmer with expert skills to pre-program a single repetitive task, such as fixing a door to a car. To make the robot do a different task, however, such as inspecting hundreds of different surgical instrument types, the same programmer would spend at least the same effort on each individual object, and would potentially have to exchange a number of components in the overall system.
There is thus a demand for a flexible solution like VirtualGrasp that allows grasping of any object with any gripper in any application, and that further democratizes the programming of robotic systems, enabling more people to use and deploy robots for their applications.
To supply our grasping system with knowledge about objects, we studied a number of commercial 3D camera solutions to establish common ground on the requirements of the proposed grasp planning solution, and contacted a number of market-leading 3D camera providers. In the project, a Unibap system was used to identify the type and location of an object in the vicinity of the robot.
The design of the “suction-pin” gripper was developed and continuously refined during the project, based on the object shapes and the tasks of hanging and coating. Any robot gripper that is described in the Unified Robot Description Format (URDF) is now supported by VirtualGrasp.
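Since any URDF-described gripper is supported, a gripper model can be supplied to the system as a standard URDF file. As a rough illustration only (the link names, geometry, and dimensions below are invented for this sketch and do not describe the project's actual suction-pin gripper), a minimal two-tool gripper description might look like:

```xml
<?xml version="1.0"?>
<!-- Hypothetical minimal URDF for a suction-pin style gripper.
     All names and offsets are illustrative, not the real design. -->
<robot name="suction_pin_gripper">
  <link name="base_link"/>
  <link name="pin_link"/>
  <link name="suction_cup_link"/>
  <!-- Rigidly attach the pin tool to the gripper base -->
  <joint name="base_to_pin" type="fixed">
    <parent link="base_link"/>
    <child link="pin_link"/>
    <origin xyz="0 0 0.05" rpy="0 0 0"/>
  </joint>
  <!-- Rigidly attach the suction cup next to the pin -->
  <joint name="base_to_suction" type="fixed">
    <parent link="base_link"/>
    <child link="suction_cup_link"/>
    <origin xyz="0.03 0 0.05" rpy="0 0 0"/>
  </joint>
</robot>
```

Describing the gripper in this standard format, rather than in proprietary code, is what allows the grasp planner to reason about any gripper's kinematics and tool placement without per-gripper programming.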
Through VirtualGrasp’s object shape analysis algorithms, we addressed hole detection and surface detection for specific objects, and their gripping options with customized grippers. Grasp points are created where the suction-pin gripper finds a hole for the pin while also being able to contact a surface with the suction device. All of these points are then evaluated through grasp quality measures.
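The report does not detail the quality measures used, but the idea of scoring and ranking candidate grasp points can be sketched as follows. This is a toy model under stated assumptions: the features (pin clearance, reachable suction area, offset from the centre of mass) and the weights are invented for illustration, not VirtualGrasp's actual metrics.

```python
from dataclasses import dataclass

@dataclass
class GraspCandidate:
    """One candidate grasp point (fields are hypothetical features)."""
    pin_clearance_mm: float   # free depth available for the pin in the hole
    suction_area_mm2: float   # flat surface area reachable by the suction cup
    com_offset_mm: float      # distance from the object's centre of mass

def grasp_quality(c: GraspCandidate) -> float:
    """Toy quality measure: reward pin clearance and suction contact area,
    penalise grasps far from the centre of mass. Weights are arbitrary."""
    return (0.4 * min(c.pin_clearance_mm / 20.0, 1.0)
            + 0.4 * min(c.suction_area_mm2 / 400.0, 1.0)
            - 0.2 * min(c.com_offset_mm / 50.0, 1.0))

# Rank the candidates and keep the best-scoring grasp point.
candidates = [
    GraspCandidate(pin_clearance_mm=25, suction_area_mm2=500, com_offset_mm=5),
    GraspCandidate(pin_clearance_mm=8,  suction_area_mm2=150, com_offset_mm=40),
]
best = max(candidates, key=grasp_quality)
```

In practice such scores let the system present only the most promising grasps to the operator instead of every geometrically feasible one.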
We identified that one of the most important user needs lies on the technical level: the solution must be easy to integrate, test, maintain and extend. We therefore implemented a robot communication protocol for VirtualGrasp to exchange data with the robot.
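The wire format of this protocol is not described in the report, so the sketch below only illustrates the general idea of framing planner-to-robot messages: a hypothetical length-prefixed JSON encoding, with message fields (`type`, `object_id`) invented for the example.

```python
import json
import struct

def encode_message(msg: dict) -> bytes:
    """Serialise a message as a 4-byte big-endian length prefix
    followed by a UTF-8 JSON payload (hypothetical framing)."""
    payload = json.dumps(msg).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_message(data: bytes) -> dict:
    """Inverse of encode_message: read the length, then parse the JSON."""
    (length,) = struct.unpack(">I", data[:4])
    return json.loads(data[4:4 + length].decode("utf-8"))

# Round-trip a request the planner might send to the robot controller.
request = {"type": "get_grasp", "object_id": "instrument_042"}
roundtrip = decode_message(encode_message(request))
```

Length-prefixed framing of this kind is a common choice over raw TCP because it lets either side recover complete messages from a byte stream without ambiguity.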
This protocol was integrated into a visualization tool in which grasps could be reviewed and selected by the programmer on a remote laptop, or directly in ABB’s RobotStudio as a plugin, allowing direct integration of VirtualGrasp into the developer’s programming environment.
Acting as the intelligent grasp planning module, VirtualGrasp automated the grasp planning process and was capable of providing automated results for any new object fed to the system as a 3D CAD model. Because VirtualGrasp is a collection of algorithms aimed at understanding all components of a grasping process on a semantic level (such as the shape of the object, the grasp capabilities of the gripper, or the application itself), it also enabled a shared understanding with the robot developer through a graphical user interface and a plugin integrated into ABB’s robot programming engine, RobotStudio.