
Virtual Environment Interface by Sensory Integration for Inspection & Manipulation Control in Multifunctional Underwater Vehicles

Objective



Objectives and content

This proposal concerns the study and development of methodologies for optimising the functioning of acoustical and optical sensors and for integrating the related data to form an accurate virtual environment, aimed at supporting navigation, inspection, and maintenance/repair tasks of a multifunctional remotely operated underwater vehicle (ROV).

Incorporation of new technologies into production systems, and the safety and reliability of production systems, are the two main programme areas addressed in this proposal. In particular, the objectives covered by this proposal are the integration of accurate and robust sensors for monitoring and diagnostics of industrial installations, including hostile environments, and computer-assisted inspection and repair systems based on 3D data management.
At present, the guidance and inspection tasks performed by ROVs are not easy, for several reasons. For instance, every kind of data acquired by a sensor device on board a vehicle is controlled by an operator, which implies the use of more than one operator for the execution of a specific task. ROV navigation is made difficult by the lack of depth perception of the operator, who typically uses only optical cameras. In other words, the various sensors work separately and for different goals. As a consequence, performing an effective operation with an ROV requires a specialised crew, expensive training courses, and many hours of practice.

The integrated use of acoustical and optical cameras to reconstruct the actual scenario in a virtual environment improves pilot comfort and thus speeds up operations, reduces the number of operators, and potentially makes ROV driving accessible to non-expert operators, who could then cope with situations in which data coming from a single sensor are not sufficient to make safe movements. Moreover, it sharply increases the capabilities of telerobotic tasks, in that an operator can interact with the synthetic scenario, understand the surrounding situation, navigate, and simulate a specific task before actually performing it. Finally, deskilling the operator through better computer-model-based controllers will reduce the cost of ROV operations.

In this context, this proposal aims at two specific targets:
- to prove the feasibility of a new high-performance sensing and processing system for underwater imaging, based on optical and acoustical sensors and on their intelligent, integrated use, providing precise two-dimensional (2D) and three-dimensional (3D) information;
- to develop a 3D virtual environment from sensorial data that reproduces the observed scene, supporting an operator in monitoring the actual situation and controlling the vehicle in all environmental conditions.

These objectives can be attained by:
(a) using multisensorial imaging devices, i.e., optical and acoustical cameras, that provide real-time data, together with appropriate methodologies (e.g., sensor integration, data fusion, computer-vision approaches) to process and interpret such data;
(b) using novel methods for building and updating the virtual environment at different resolutions;
(c) developing adequate modelling and simulation procedures to emulate typical and non-typical operations, starting from actual sensorial data, so that an operator can carry out an effective teleoperated task.
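The data-fusion step in point (a) can be illustrated with a minimal sketch. The proposal does not specify a fusion algorithm; the example below assumes a standard inverse-variance weighted average of range estimates, where each sensor reports a measurement together with an uncertainty, so that the more reliable sensor (optical in clear water, acoustical in turbid water) dominates the fused result. All function names and numeric values are illustrative assumptions, not part of the proposal.

```python
def fuse_ranges(measurements):
    """Fuse a list of (range, variance) pairs into one estimate.

    Uses inverse-variance weighting: each measurement is weighted by
    1/variance, so low-uncertainty sensors contribute more. Returns
    the fused range and its (reduced) fused variance.
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused_range = sum(r * w for (r, _), w in zip(measurements, weights)) / total
    fused_var = 1.0 / total  # always smaller than the best single variance
    return fused_range, fused_var

# Hypothetical readings of the same target distance (metres, variance in m^2):
acoustic = (5.2, 0.25)  # acoustic camera: robust in turbid water, but coarse
optical = (5.0, 0.04)   # optical camera: precise when visibility allows
r, v = fuse_ranges([acoustic, optical])
# The fused range lies between the two readings, closer to the optical one,
# and the fused variance is lower than either sensor alone.
```

In a full system this per-point fusion would be applied across registered 2D/3D range maps before updating the virtual environment, but the weighting principle is the same.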

Funding Scheme

CSC - Cost-sharing contracts

Coordinator

TECHNICAL SOFTWARE CONSULTANTS LTD
Address
6 Southwood Hall 7A, Muswell Hill Road, Highgate
N6 5UF London
United Kingdom

Participants (3)

OMNITECH AS
Norway
Address
Nedre Aastveit 12
5083 Ovre Ervik
UNIVERSITY OF GENOVA
Italy
Address
Via Dodecaneso 35
16146 Genova
Università degli Studi di Udine
Italy
Address
Viale Delle Scienze 206
33100 Udine