
Integrated 3D Sensors suite


Smarter sensors for autonomous space mission tasks

As robotic systems tackle more complex tasks in outer space, the sensors that provide the information enabling them to act autonomously must be integrated for swifter feedback and more efficient control.


Repairs, resupply and refuelling of orbiting satellites, as well as planetary rover exploration missions, need near-human levels of capability and dexterity while operating autonomously. “Sometimes ground (control) helps, closer to Earth, where communication is easier. But in space there is a time delay, so you need to make sure the system can do the mission planning and take decisions on its own,” explains I3DS project manager Ms Sabrina Andiappane, R&D Study Manager and Future Projects Engineer at Thales Alenia Space in Cannes, France.

Individual sensors used for navigation and other purposes are often produced commercially, usually with their own proprietary interface and software. The EU-funded I3DS project has integrated the state-of-the-art sensors and detectors used in such missions into a single suite, and enhanced the software to improve real-time feedback and processing of the combined data for operating robotic systems. The suite includes data from visual sensors, such as high-resolution, 3D and thermal infrared cameras; star trackers, which orientate a vehicle using the stars; contact and tactile sensors, which detect obstacles; and remote sensors such as radar and lidar. It also includes illumination devices for use during eclipses or when there is no sunlight. A planetary rover, for example, needs all the sensors to make sure it is going in the right direction and to avoid obstacles.

Modular system

“These are smart sensors for smart missions, with pre-processing for the cameras, and the data is made easier to understand when it feeds into another system.” The sensors have to be synchronised and run at a certain frequency, and the algorithms must run fast enough to achieve real-time operability. “It was challenging to achieve this,” she says. The idea is to arrive at greater accuracy than any of the sensors could achieve acting in isolation.
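The synchronisation requirement described above can be illustrated with a minimal sketch: a loop that polls several sensors once per cycle at a fixed frequency and tags each reading with a shared timestamp, so that downstream fusion can align measurements. This is purely illustrative; all names here are hypothetical and the actual I3DS software is not shown.

```python
class Sensor:
    """Hypothetical sensor exposing a read() method that returns one measurement."""
    def __init__(self, name):
        self.name = name

    def read(self):
        return 0.0  # placeholder measurement


def sample_synchronised(sensors, period_s, cycles):
    """Poll every sensor once per cycle; tag all readings in a cycle with one timestamp."""
    frames = []
    for i in range(cycles):
        t = i * period_s  # common timestamp shared by every sensor this cycle
        frames.append({s.name: (t, s.read()) for s in sensors})
    return frames


# Two sensors sampled together at 10 Hz for three cycles.
frames = sample_synchronised([Sensor("camera"), Sensor("lidar")],
                             period_s=0.1, cycles=3)
```

Because every reading in a cycle carries the same timestamp, a fusion step can combine them without guessing which camera frame matches which lidar return.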
This involved specialists in data fusion and processing to specify, design, integrate and test the different sensors within a whole robotic system architecture. The integrated platform is developed as a modular plug-and-play system. “There is a panel of sensors available and you can choose the ones you want. You can exchange one sensor with another, and it will still work. What we were not able to fully integrate is radar, which is subject to special safety regulations for testing. This meant it could not be carried out during the project lifecycle,” Andiappane explains.

Control unit

The instrument control unit, which interfaces with all the different sensors and sends the right commands within the same software environment, is a 233 x 160 mm motherboard. “We have proven that the system works, but there obviously needs to be more development to go into space. For space missions, all the electronics have to be radiation resistant,” she adds.

The EU has funded several projects on space robotics in parallel with I3DS. Of the follow-on European Robotic Orbital Support Services (EROSS) project, Andiappane says: “We will take the different building blocks, for example the sensors, and the other frameworks such as navigation, and we will integrate them further to work in an even bigger (robotic) system to demonstrate in-orbit servicing.”
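The plug-and-play idea described above, exchanging one sensor for another while the rest of the system still works, can be sketched as a common interface that control logic depends on. This is an illustrative abstraction under assumed names, not the project's actual instrument control unit software.

```python
class RangeSensor:
    """Common interface: any range sensor exposes distance_m()."""
    def distance_m(self):
        raise NotImplementedError


class Lidar(RangeSensor):
    def distance_m(self):
        return 12.5  # placeholder reading in metres


class StereoCamera(RangeSensor):
    def distance_m(self):
        return 12.8  # placeholder reading in metres


def obstacle_ahead(sensor, threshold_m=15.0):
    """Control logic depends only on the interface, so sensors are swappable."""
    return sensor.distance_m() < threshold_m


# Either sensor plugs into the same control logic unchanged.
print(obstacle_ahead(Lidar()), obstacle_ahead(StereoCamera()))  # prints: True True
```

Swapping `Lidar` for `StereoCamera` requires no change to `obstacle_ahead`, which is the essence of the modular approach the project describes.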


I3DS, EROSS, Space, satellite, sensors, robots
