Visually guided robots using uncalibrated cameras

Objective

The goal of the project is to remove the calibration bottleneck in visually guided robots, so that industrial tasks can be performed with minimal modelling. The self-calibration techniques to be used constitute the core of the project; they are based on the idea of extracting the required Euclidean information from a projective representation. Through VIGOR, the partners expect to complete the mathematical background of the approach, to design the associated algorithms and to validate them on relevant industrial test-beds. More precisely, the project will demonstrate the feasibility of vision-based control loops that include several uncalibrated cameras and robots with more than five joints, applied to complex tool-guidance tasks such as inspection, grasping and welding. Vision-based control is a key issue in many industrial robot applications, especially when accurate calibration is not possible because parameters change continually. Within this open framework, the project focuses on two practically relevant areas.
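
To make "extracting Euclidean information from a projective representation" concrete, the sketch below restates the standard self-calibration formulation from the multi-view geometry literature; it is an illustration of the general idea, not the project's own derivation, and the symbols are the usual textbook ones.

```latex
% Illustrative self-calibration relations (standard multi-view geometry):
% a projective reconstruction with cameras P_i is upgraded to a Euclidean
% one by a single rectifying homography H.
\[
  P_i^{E} \simeq P_i H, \qquad
  H = \begin{pmatrix} K & \mathbf{0} \\ \mathbf{v}^{\top} & 1 \end{pmatrix}
\]
% The unknown intrinsics K (the Euclidean information) are constrained
% through the dual image of the absolute conic, estimated directly from
% the projective cameras:
\[
  \omega_i^{*} \simeq P_i \, Q_{\infty}^{*} \, P_i^{\top}, \qquad
  \omega_i^{*} = K_i K_i^{\top},
\]
% where Q_infinity^* is the absolute dual quadric; no calibration object
% or separate calibration step is required.
```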



Improving robot efficiency in advanced industrial automation requires the inclusion of sensory information at many levels, one of the most important being on-line task control.

Range finders, proximity sensors or wrist force sensors can address the sensing problem reliably only when the robot is already at its target. Visual sensors, in contrast, can be used on-line through the so-called visual servoing approach. This technique, which consists of closing an image-based feedback control loop, is increasingly popular. However, it is generally based on a single camera and operates only on the premise that all components are fully calibrated: camera, robot and world. The related calibration procedures are tedious, inflexible and expensive.
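
As an illustration of what such an image-based feedback loop computes, here is a minimal sketch of one classical point-feature visual-servoing step; the interaction-matrix formulation is the standard textbook one, and the function names, gain and depth handling are illustrative assumptions rather than the project's own algorithms.

```python
import numpy as np

def interaction_matrix(points_img, depths):
    """Classical interaction (image Jacobian) matrix for 2D point features.

    points_img : (N, 2) array of normalised image coordinates (x, y)
    depths     : (N,) array of estimated point depths Z in the camera frame
    Returns a (2N, 6) matrix mapping the camera velocity screw
    (vx, vy, vz, wx, wy, wz) to image-feature velocities.
    """
    rows = []
    for (x, y), Z in zip(points_img, depths):
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x])
    return np.asarray(rows)

def visual_servo_step(s, s_star, depths, gain=0.5):
    """One image-based servoing step: v = -gain * pinv(L) @ (s - s*)."""
    L = interaction_matrix(s, depths)
    error = (np.asarray(s) - np.asarray(s_star)).reshape(-1)
    # The pseudo-inverse of the interaction matrix maps the image error
    # back to a 6-DoF camera velocity command.
    return -gain * np.linalg.pinv(L) @ error
```

The resulting camera velocity would then be mapped to joint velocities through the robot Jacobian; with uncalibrated cameras, as targeted by the project, the interaction matrix has to be estimated on-line rather than built from known intrinsics and depths as in this fully calibrated sketch.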

The first area is the robot manufacturing industry, where accurately calibrating the delivered systems is presently unavoidable. This step is expensive and reduces the productivity of a robotic system in a non-negligible way. The use of visual feedback as proposed in the project will relax this constraint, reducing the cost and enlarging the potential market of a robot system. The SYLVIA robot, manufactured by Sinters, will serve as a test plant for validating the industrial relevance of a calibration-free approach through a typical application of this robot: the inspection of an aircraft cockpit.

The second area is the shipbuilding industry. Today, robot-based shipbuilding technology is limited in that robots are exclusively programmed off-line. In the actual environment, the CAD-generated trajectories have to be adjusted according to the exact locations and orientations of all the relevant parts in the workspace, for example to perform a welding task on the hull. With the proposed approach, the relevant robot trajectories would be transformed automatically on-line using the visual input, as sketched below. Odense Steel Shipyard, a partner of the project, expects VIGOR to contribute significantly to overcoming some of the major difficulties in robot-aided shipbuilding.
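
The sketch below illustrates the kind of on-line trajectory adaptation described above: CAD waypoints are re-expressed in the observed workpiece frame using a least-squares rigid alignment of reference points. It assumes matched reference points are already available from the visual input and is a generic illustration, not the project's actual method.

```python
import numpy as np

def rigid_transform(ref_cad, ref_obs):
    """Least-squares rigid transform (R, t) mapping CAD-frame reference
    points onto their observed workspace positions (Kabsch algorithm)."""
    c_cad, c_obs = ref_cad.mean(axis=0), ref_obs.mean(axis=0)
    H = (ref_cad - c_cad).T @ (ref_obs - c_obs)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_obs - R @ c_cad
    return R, t

def adapt_trajectory(cad_waypoints, ref_cad, ref_obs):
    """Re-express an off-line CAD trajectory in the observed workpiece frame."""
    R, t = rigid_transform(ref_cad, ref_obs)
    return cad_waypoints @ R.T + t
```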

Funding Scheme

CSC - Cost-sharing contracts

Coordinator

INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET EN AUTOMATIQUE
Address
Domaine De Voluceau, Rocquencourt
BP 105 Le Chesnay
France

Participants (5)

FRAUNHOFER IAF
Germany
Address
Tullastr. 72
79108 Freiburg

Odense Steel Shipyard
Denmark
Address
P.O. Box 176
5100 Odense C

Societe D'integration des Etudes et des Recherches de Systemes Sa
France
Address
Rue Paul Mesple 5
31100 Toulouse

THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
United Kingdom
Address
The Old Schools, Trinity Lane
CB2 1TN Cambridge

The Hebrew University of Jerusalem
Israel
Address
Authority For Research & Development
91904 Jerusalem