
Speeding up the virtual reality revolution with realistic & real-time animation of hand-to-object interaction

Periodic Reporting for period 1 - VirtualGrasp (Speeding up the virtual reality revolution with realistic & real-time animation of hand-to-object interaction)

Reporting period: 2020-07-01 to 2021-12-31

In its 2020 overview of the top five robotics use cases, IDC identified a strong need for robotic automation in assembly, pick-and-place, painting, welding and related tasks. Bringing robotic automation into these industrial applications today requires experienced programmers and trained experts who use their domain knowledge to set up the whole system for their clients, including the choice of robot hardware, software and on-site installation.

Beyond the immense effort in resources and costs this entails for each new client and application, any update, such as a change of the environment, the task or a simple piece of logic, requires another engagement of the integrator to modify the system. Making robot programming substantially easier is thus a prerequisite for faster adoption of robotic automation, especially in smaller industries that lack the resources to employ integrators. One way to achieve this is through automated or semi-automated solutions that can be deployed without deep expertise in programming or robotics.

Many components are involved in making robots smarter and easier to maintain (machine vision tools, robot navigation software and language processing systems, to name only a few). In the sectors of factory automation, Industry 4.0 and the processing industry, however, one fundamental skill is grasping. The common approach to making a robot interact with an object is for a programmer with expert skills to pre-program a single repetitive task, such as fixing a door to a car. To make the robot perform a different task, such as inspecting hundreds of different surgical instrument types, the same programmer would spend at least as much effort programming each individual object, and would potentially have to exchange a number of components in the whole system.

There is thus demand for a flexible solution like VirtualGrasp that allows grasping of any object with any gripper in any application, and that further democratizes the programming of robotic systems so that more people can use and deploy robots for their applications.
We participated in a project from the Swedish coating industry in order to have a real application at hand for studying the value and demands of the VirtualGrasp solution in the robotics domain. In parallel, major suppliers and integrators of robotics solutions were contacted to assess the solution's value.

To supply our grasping system with knowledge about objects, we studied a number of commercial 3D camera solutions to find common ground on the needs of the proposed grasp planning solution, and contacted a number of market-leading 3D camera providers. In the project, a Unibap system was used to identify the type and location of an object in the vicinity of the robot.

The design of the “suction-pin” gripper was developed and continuously refined based on the object shapes and the hanging and coating tasks during the project. Any robot gripper described in the Unified Robot Description Format (URDF) is now supported by VirtualGrasp.
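To illustrate what a URDF description looks like, the sketch below parses a minimal, made-up two-link gripper with Python's standard-library XML parser. The link and joint names and the dimensions are assumptions for the example; they do not describe the project's actual suction-pin gripper or how VirtualGrasp ingests URDF.

```python
import xml.etree.ElementTree as ET

# Minimal, illustrative URDF for a two-link gripper. Names and values
# are made up for this example.
URDF = """\
<robot name="suction_pin_gripper">
  <link name="base"/>
  <link name="pin"/>
  <joint name="base_to_pin" type="fixed">
    <parent link="base"/>
    <child link="pin"/>
    <origin xyz="0 0 0.02" rpy="0 0 0"/>
  </joint>
</robot>"""

root = ET.fromstring(URDF)
links = [link.get("name") for link in root.findall("link")]
joints = [(j.get("name"), j.get("type")) for j in root.findall("joint")]
print(links)   # ['base', 'pin']
print(joints)  # [('base_to_pin', 'fixed')]
```

Because URDF is plain XML, a planner can enumerate a gripper's links and joints this way before reasoning about its grasp capabilities.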

Using VirtualGrasp’s object shape analysis algorithms, we addressed hole detection and surface detection for specific objects and their gripping options with customized grippers. Grasp points are created where the suction-pin gripper finds a hole for the pin while the suction device can simultaneously contact a surface. All of these points are then evaluated with grasp quality measures.
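The kind of feasibility-plus-quality evaluation described above can be sketched as follows. The fields, thresholds and weights are assumptions for the example, not VirtualGrasp's actual quality measures: a candidate is infeasible if the pin does not fit the hole or the suction cup lacks a flat patch to seal against, and otherwise scores higher for a tighter pin fit and a larger suction contact area.

```python
from dataclasses import dataclass

@dataclass
class GraspCandidate:
    name: str
    hole_diameter: float         # m, hole found for the pin
    pin_diameter: float          # m, diameter of the gripper pin
    suction_contact_area: float  # m^2 reachable by the suction cup

MIN_SUCTION_AREA = 1e-4  # assumed minimum patch for a reliable seal

def grasp_quality(c: GraspCandidate) -> float:
    """Score a candidate in [0, 1]; 0 means the grasp is infeasible."""
    if c.hole_diameter <= c.pin_diameter:          # pin does not fit
        return 0.0
    if c.suction_contact_area < MIN_SUCTION_AREA:  # cup cannot seal
        return 0.0
    fit = c.pin_diameter / c.hole_diameter         # tighter fit is better
    area = min(c.suction_contact_area / (5 * MIN_SUCTION_AREA), 1.0)
    return 0.5 * fit + 0.5 * area                  # assumed equal weights

candidates = [
    GraspCandidate("snug", 0.005, 0.004, 3e-4),
    GraspCandidate("loose", 0.008, 0.004, 4e-4),
    GraspCandidate("blocked", 0.004, 0.004, 9e-4),
]
best = max(candidates, key=grasp_quality)
print(best.name)  # snug
```

Separating hard feasibility checks from a continuous score lets the planner discard impossible grasps outright and rank the remainder.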

We identified that one of the most important user needs is, on a technical level, that the solution be easy to integrate, test, maintain and extend. We therefore implemented a robot communication protocol for VirtualGrasp to send data to and receive data from the robot.
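The report does not specify the wire format of this protocol, but a common pattern for planner-to-robot links is length-prefixed JSON messages over a socket. The sketch below assumes that format and an invented message schema (`type`, `object_id`, `pose`); it is purely illustrative and uses a local socket pair to stand in for the planner-robot connection.

```python
import json
import socket
import struct

# Assumed wire format: 4-byte big-endian length prefix, then a UTF-8
# JSON payload. Message fields are illustrative, not VirtualGrasp's
# actual protocol.

def send_msg(sock: socket.socket, msg: dict) -> None:
    payload = json.dumps(msg).encode("utf-8")
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_msg(sock: socket.socket) -> dict:
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return json.loads(_recv_exact(sock, length).decode("utf-8"))

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf

# Local socket pair standing in for the planner-robot link.
planner, robot = socket.socketpair()
send_msg(planner, {"type": "grasp_target", "object_id": "hook_42",
                   "pose": [0.30, 0.10, 0.25, 0.0, 0.0, 0.0]})
msg = recv_msg(robot)
print(msg["type"])  # grasp_target
planner.close()
robot.close()
```

The length prefix lets the receiver read exactly one message at a time, which keeps the integration easy to test and extend with new message types.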

This protocol was integrated into a visualization tool in which grasps could be reviewed and selected by the programmer on a remote laptop, and into ABB’s RobotStudio as a plugin that integrates VirtualGrasp directly into the development environment.
To demonstrate the feasibility of VirtualGrasp as a robot grasp planning module, we selected a proof-of-concept project from the Swedish coating industry, performed in a real-life setting in collaboration with several actors in the Swedish robotics industry. By adapting Gleechi’s VirtualGrasp software to the domain-specific demands and challenges, we were able to show that automatic robot grasping interactions can be enabled and modified quickly and efficiently. The main purpose of the setting was to analyze how robotic automation can release factory workers from heavy lifting and repetitive work, decreasing the risk of work injuries, optimizing the workflow and minimizing costs.

Acting as the intelligent grasp planning module, VirtualGrasp automated the grasp planning process and was able to provide automated results for any new object fed to the system as a 3D CAD model. Because VirtualGrasp is a collection of algorithms aimed at understanding all components of a grasping process on a semantic level (such as the shape of the object, the grasp capabilities of the gripper, or the application itself), it also enabled a shared understanding with the robot developer through a graphical user interface and a plugin integrated into ABB’s robot programming environment, RobotStudio.
Image captions:
A screenshot showing the plugin integration view of VirtualGrasp in ABB RobotStudio.
Another proper grasp produced with the customized gripper, also visualized in the VirtualGrasp visua
One of many proper grasps produced with the customized gripper, also visualized in the VirtualGrasp v
Robot programming environments combine physical robots with digital programming tools.
RoboGrasp Studio by Gleechi provides robot programmers with grasp-related features such as reviewing