Integrated 3D Sensors suite

Periodic Reporting for period 1 - I3DS (Integrated 3D Sensors suite)

Reporting period: 2016-11-01 to 2019-01-31

Space robotic systems are increasing in complexity and versatility to tackle more advanced tasks in orbit. Using robots reduces costs by avoiding the design of life-support systems, but their level of autonomy must be continually improved to approach human skill and dexterity. Robots will gradually replace astronauts in tasks performed in both orbital and planetary mission scenarios. To that end, the sensing capabilities of robotic systems are critical for providing suitable and accurate measurements to the control algorithms.

Therefore, the Integrated 3D Sensors (I3DS) project aims to provide future space missions with a suite of sensors standardised in terms of electrical, thermal, and mechanical interfaces with the core vehicle.
The I3DS project intends to develop a modular inspector sensors suite (INSES): a smart collection of building blocks and a common set of sensors.

The sensor suite proposed for I3DS has been selected from a wide range of sensors suitable for diverse scenarios; subsets are to be defined according to the application, whether space rendezvous (cooperative or non-cooperative) or rover exploration.

The main objectives of I3DS are the following:
1. Design and develop a sensor suite usable for both planetary and orbital missions and demonstrations, answering the needs of near-future space exploration missions
2. Integrate the sensors into a harmonised and modular suite with a common interface
3. Test the sensor suite in a representative environment to validate its performance for both orbital and planetary use cases
The I3DS project aimed at designing and developing an inspector sensors suite with integrated pre-processing and data concentration functions.

The sensor items in I3DS are:
• The star tracker that finds the orientation and location of the vessel using the stars.
• The time of flight (TOF) camera that captures depth images to generate 3D point clouds.
• The stereo camera that delivers two synchronized image streams that can be processed into a disparity map and 3D point clouds.
• The high-resolution camera that delivers a monochrome image stream.
• The thermal infra-red (TIR) camera that delivers a thermal image stream.
• The force/torque sensors and contact/tactile sensors that deliver contact information about the target, used in the final docking phase of a rendezvous.
• The light detection and ranging (LIDAR) that delivers distance measurements to be used to generate 3D point clouds.
• The radar that is used for ranging measurements.
• The inertial measurement unit (IMU) that measures the system's angular rates and linear accelerations, feeding its data into a Kalman filter that tracks the system's state.
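The IMU bullet above mentions feeding measurements into a Kalman filter to track the system state. As a minimal illustration of that idea (not the actual I3DS flight code; the model, noise values, and measurement source are assumptions), the sketch below runs a linear Kalman filter with a position/velocity state, an IMU-style acceleration as control input, and an external range measurement as the correction:

```python
import numpy as np

# Minimal linear Kalman filter sketch (hypothetical, not the I3DS implementation):
# state x = [position, velocity]; an IMU-style acceleration enters as a control
# input, and an external range measurement z corrects the position estimate.
def kalman_step(x, P, accel, z, dt, q=1e-3, r=0.25):
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
    B = np.array([0.5 * dt**2, dt])         # acceleration input matrix
    H = np.array([[1.0, 0.0]])              # we measure position only
    Q = q * np.eye(2)                       # process noise (assumed value)
    R = np.array([[r]])                     # measurement noise (assumed value)

    # Predict using the IMU acceleration.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

    # Correct using the range measurement z.
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy run: stationary target at 5.0 m, zero acceleration, repeated measurements.
x, P = np.zeros(2), np.eye(2)
for _ in range(50):
    x, P = kalman_step(x, P, accel=0.0, z=np.array([5.0]), dt=0.1)
```

After repeated consistent measurements, the position estimate converges toward 5.0 m and the velocity estimate toward zero, which is the behaviour a navigation filter relies on when fusing IMU and ranging data.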

Among the illumination items, the projected-pattern illuminator can be used with the high-resolution camera and a pre-processing algorithm to create 3D point clouds. The wide-angle torch provides general illumination, when needed, for both the high-resolution and stereo cameras.
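Both the stereo camera and the projected-pattern setup above produce 3D point clouds via triangulation. The core step can be sketched with the standard depth-from-disparity relation z = f·B/d; the focal length and baseline values here are made up for illustration and are not I3DS parameters:

```python
import numpy as np

# Depth from stereo disparity: z = f * B / d, where f is the focal length in
# pixels, B the baseline in metres, and d the per-pixel disparity in pixels.
# The values of f and B below are illustrative, not I3DS calibration data.
f = 800.0   # focal length [pixels] (assumed)
B = 0.12    # baseline [metres] (assumed)

disparity = np.array([[40.0, 20.0],
                      [10.0, 80.0]])      # disparity map [pixels]
depth = f * B / disparity                 # per-pixel depth [metres]
```

Back-projecting each pixel through the camera intrinsics at its computed depth then yields the 3D point cloud mentioned in the text.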
The instrument control unit (ICU) contains the networking equipment for connecting the devices, and a high-performance MPSoC and FPGA for controlling the devices, processing the data streams, and interfacing with the on-board computer (OBC).
The software components of the system are the pre-processing of the imaging streams, the sensor interfaces for controlling and accessing the sensors, the system interface for receiving commands from and sending data to the OBC, and the real-time operating system.
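The "common interface" goal behind the sensor interfaces can be illustrated with a small abstraction: every sensor exposes the same lifecycle and a uniform frame fetch, so the ICU software can drive heterogeneous devices the same way. This is a hypothetical sketch; the class names, states, and method signatures are illustrative and not the actual I3DS software API:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Frame:
    """One pre-processed sensor sample (illustrative structure)."""
    timestamp_us: int
    payload: bytes

class Sensor(ABC):
    """Hypothetical common sensor interface: shared lifecycle, typed fetch."""
    def __init__(self, node_id: int):
        self.node_id = node_id
        self.state = "inactive"

    def activate(self):
        self.state = "standby"

    def start(self):
        assert self.state == "standby", "sensor must be activated first"
        self.state = "operational"

    def stop(self):
        self.state = "standby"

    @abstractmethod
    def fetch(self) -> Frame:
        """Return the latest pre-processed frame."""

class StereoCamera(Sensor):
    def fetch(self) -> Frame:
        # A real driver would return two synchronised image buffers.
        return Frame(timestamp_us=0, payload=b"\x00" * 8)

# The ICU-side code only ever sees the Sensor interface:
cam = StereoCamera(node_id=11)
cam.activate()
cam.start()
frame = cam.fetch()
```

The point of the design is that adding a new sensor type means implementing one subclass, while the system interface toward the OBC stays unchanged.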

One of I3DS's main functions is to deliver pre-processed sensor data to the system in order to improve and simplify the system's sensor data processing.
The goal of pre-processing is to transform the raw sensor data into a form suitable for use in typical navigation data fusion algorithms. The following pre-processing algorithms are proposed for the I3DS sensors:
1. Lens vignetting correction
2. Optical distortion correction
3. Bilateral Filtering
4. Histogram equalisation
5. Stereo rectification
6. Structured light pattern detection
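To make one of the steps above concrete, here is a sketch of histogram equalisation (algorithm 4) on an 8-bit monochrome frame. The CDF-based method is the textbook approach; in I3DS such processing would run on the ICU's FPGA/MPSoC rather than in Python, so this is purely illustrative:

```python
import numpy as np

def equalise(img):
    """Textbook histogram equalisation for an 8-bit monochrome image."""
    hist = np.bincount(img.ravel(), minlength=256)     # per-level pixel counts
    cdf = np.cumsum(hist).astype(np.float64)           # cumulative distribution
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalise to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)         # grey-level lookup table
    return lut[img]                                    # remap every pixel

# A low-contrast frame (values squeezed into 100..120) is stretched toward
# the full 0..255 range after equalisation.
frame = np.random.default_rng(0).integers(100, 121, size=(64, 64), dtype=np.uint8)
out = equalise(frame)
```

The output keeps the image geometry unchanged while spreading the grey levels, which is why equalisation helps downstream feature detection under the difficult lighting conditions of orbital imaging.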

The I3DS sensors have been validated through orbital and planetary test tracks; the sensor performances obtained are summarised in the attached table.

The dissemination of I3DS has been done through several means:

1. Project Website: http://i3ds-h2020.eu/
2. Project web articles and social media on partners company website
https://www.thalesgroup.com/en/worldwide/space/news/sabrina-andiappane-engineer-thales-alenia-space-named-most-promising-newcomer-0
https://www.sintef.no/en/projects/i3ds/
http://piap.pl/badanie/i3ds/
https://www.hertzsystems.com/firma-hertz-systems-wygrala-konkurs-na-zintegrowanie-sensorow-3d/#

3. Participation in several conferences and company R&D events to promote I3DS
4. Press tours at the TASF Cannes facilities, where the I3DS project was presented on the robotic test bench facility
5. Scientific paper publications at European conferences such as I-Sairas, IAC, DASIA, ADCSS and Ada Europe
6. An I3DS communication toolkit with a kakemono (roll-up banner), posters, flyers and a video

The sensor performances have been assessed through open-loop tests for both orbital and planetary applications.

I3DS brings key generic building blocks to be instantiated per mission (rendez-vous, servicing, robotic exploration & science).
I3DS initiated a strong rationalisation of robotics building blocks for future missions, based on a comprehensive system analysis, unit and software development, and then integration and testing up to TRL 4. This stream must be continued with additional system analysis to refine the system requirements, enable further TRL increases of key technologies, and bring those technologies to higher maturity, and, in the mid-term:
1/ Enable early in-orbit demonstration of new robotics building blocks
2/ Deploy I3DS technologies on high-end ESA missions

In conclusion, the I3DS product consists of:

1. Sensors
• that provide high variability per mission type
• with a need for characterisation of typical performance range per sensor type (done in I3DS & other studies)
• with a need for TRL increase based on actual mission targets (TDA, GSTP/ARTES, mission development)

2. Processing & GNC framework
• with high genericity of building blocks
• that see massive reuse in the newest OG calls
• with a need for global TRL increase & in-orbit demonstration including a set of navigation sensors
I3DS flyer (front and back)