
Smart UNattended airborne sensor Network for detection of vessels used for cross border crime and irregular entrY

Final Report Summary - SUNNY (Smart UNattended airborne sensor Network for detection of vessels used for cross border crime and irregular entrY)

Executive Summary:
Control of the EU's external border is one of the major challenges the EU faces. The sheer length of the border, partly on land and partly at sea, poses a challenge to the organisations in charge of border control. National authorities cooperate with transnational organisations such as the FRONTEX agency, but the areas to be monitored are very large and often remote, and the resources available for such tasks are insufficient to achieve the desired levels of effectiveness.
The SUNNY project aimed to develop system solutions capable of improving the effectiveness of EU border monitoring compared to legacy systems, while keeping affordability and interoperability as key enabling factors. It is recognised that legacy sensors and communication systems developed for military applications are not optimised for border monitoring and that their interoperability with civil standards is limited. Moreover, in-service systems disseminate information tailored to highly skilled personnel, and the number of operators needed to conduct the activity is high. By integrating technologies developed across different initiatives, the SUNNY approach delivers pre-processed information through meaningful decision-support tools, reducing both the number and the required level of expertise of border surveillance personnel.
SUNNY represents a step beyond existing research projects due to the following main features:
A two-tier intelligent heterogeneous UAV sensor network was integrated in order to provide both large-field and focused surveillance capabilities. In this network, the first-tier sensors, carried by MALE (Medium Altitude Long Endurance) UAVs, are used to patrol large border areas to detect suspicious targets and provide global situation awareness. Fed with the information collected by the first-tier sensors, the second-tier sensors are deployed to provide more focused surveillance capability by tracking the targets and collecting further evidence for more accurate target recognition and threat evaluation. Novel algorithms were developed to analyse the data collected by the sensors for robust and accurate target identification and event detection.
A new generation of sensors and on-board processing was developed and integrated on the UAV systems. The focus was on developing and integrating sensors of low weight, low cost and high resolution which can operate under variable conditions such as darkness, snow and rain. In particular, SUNNY developed sensors that can generate both reflective-band (visible, Near Infrared (NIR)-SWIR) and LWIR images as well as hyperspectral data, and that can use RGB data and radar information to detect, discriminate and track objects of interest in complex environments over sea. SUNNY also coupled on-board sensor processing and preliminary detection results with local UAV control, leading to innovative active sensing techniques that replace low-level sensor data communication with communication of information at a higher level of abstraction.
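The tier-1-to-tier-2 hand-off described above can be illustrated with a simple cueing step: tier-1 detections are ranked by confidence, and each tier-2 UAV is tasked to the nearest unassigned target. This is only an illustrative sketch, not the project's actual tasking logic; the function names, data layout and greedy assignment strategy are assumptions.

```python
import math

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    R = 6371.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def cue_tier2(detections, tier2_uavs):
    """Greedy cueing: walk tier-1 detections from highest to lowest
    confidence and task the nearest free tier-2 UAV to each one."""
    tasks, free = {}, dict(tier2_uavs)          # uav_id -> (lat, lon)
    for det in sorted(detections, key=lambda d: -d["confidence"]):
        if not free:
            break
        uav = min(free, key=lambda u: haversine_km(free[u], det["pos"]))
        tasks[uav] = det["pos"]
        del free[uav]
    return tasks

detections = [{"pos": (35.5, 24.1), "confidence": 0.9},
              {"pos": (35.8, 23.7), "confidence": 0.6}]
tier2 = {"uav_a": (35.6, 24.0), "uav_b": (35.9, 23.8)}
print(cue_tier2(detections, tier2))   # uav_a -> first target, uav_b -> second
```

In a real system the assignment would also weigh fuel, threat level and sensor suitability; the greedy nearest-neighbour rule merely captures the cueing principle.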

Project Context and Objectives:
SUNNY is an innovative concept for the fully integrated development of Optimum Airborne Sensors with Adapted Data Link Solution(s) to Enhance Border Security, driven by real end-user requirements and appropriate interoperability and standardization. SUNNY addressed the following key topics:
- Innovative airborne active and passive sensors for civil security applications. Specific constraints for applicability in related scenarios were analysed and taken into account (weather, physical constraints, affordability, endurance, resolution, mission timing and characteristics, operating procedures).
- Adapted communications and data links: civil-frequency data links and protocols for an airborne sensor network combining large-field and focused/tracking surveillance capabilities, inter-connected and supervised by a ground control centre with large parallel processing capabilities.
- Sensor fusion and data exploitation. Novel algorithms for effective data processing, analysis, fusion and visualisation, with attention to focus-of-attention techniques and a human-computer interface for fusing and visualising multi-source sensor data captured in real time, to assist global situation awareness and prompt decision making. A man-in-the-loop mechanism for learning from human feedback was also addressed.
- Cooperative networking. Cooperative missions by an optimized sensor network, with teams of platforms complementing other assets (space and surface), and the transformation of data into complete information to support decision makers.
- Improving the sensing and perception autonomy of UAVs through innovative on-board processing capabilities using state-of-the-art parallel processing hardware.
- Appropriate interoperability and standardization were addressed.
- Concept validation through on-ground and in-flight test demonstrations (primary trial site in Greece, NAMFI test range, Crete, with an alternative site in Portugal).
- Assessment of cost-effectiveness of the proposed border monitoring system via cost-benefit analysis.

SUNNY aimed to address the above-mentioned challenges in accordance with the layered concept presented in Figure 1.

Figure 1: SUNNY operational view

In the following, the key scientific objectives and measurable outcomes are presented for each layer:
1. In the Sensing layer the applicable objectives were:
a) Develop optimum airborne sensors for maritime/terrestrial surveillance and situation awareness enhancing EU border security (optics/optronics, radar and 3D) that are suitable for civil security applications. The focus was on developing and integrating sensors of low weight, low cost and high resolution which can operate under variable conditions such as darkness, snow and rain. In particular, SUNNY developed sensors that can generate Near Infrared (NIR) images and hyperspectral data and that can use radar information to detect, discriminate and track objects of interest in complex environments, over land and sea. The sensors were integrated on UAV systems and tested in flight in relevant conditions.
b) Develop and implement on-board real-time data processing. This reduced communication requirements and took advantage of the multi-UAV configuration and mobility to perform active sensing. The processing was performed with current state-of-the-art low-power parallel hardware and provided abstract information to be incorporated in the overall system, leading to increased situational awareness. The on-board processing system was integrated on a UAV and tested in flight.

2. In the Communication layer the applicable objectives were:
a) Tailor existing datalink technology and develop adapted data link solutions. This enabled the use of civilian frequencies and protocols to support the operationally critical communications between the UAVs and the sensor network ground control centre. A reliable communication system was integrated to enable communication between UAVs and between the UAVs and the ground operator. The communication system was validated in the final demonstration.

3. In the System layer the applicable objectives were:
a) Develop and integrate an intelligent airborne heterogeneous sensor network. The focus was on integrating various airborne sensors to form a two-tier intelligent heterogeneous network in order to provide both large field and focused surveillance capabilities. Effective path planning, collision avoidance, and coordination between UAVs were investigated in order to achieve more effective large field surveillance with low cost and energy consumption. The solution was tested in flight.
b) Develop and integrate capabilities to enable fast (near real-time) detection and tracking of people, vehicles and ships. The capability, based on fast image processing algorithms, enabled the detection and tracking of vessels at operationally relevant distances. The focus was on novel and advanced algorithms for image segmentation, object recognition and detection. In particular, super-resolution and active learning techniques were investigated to enhance image quality and reduce the number of false alarms. The capability was tested in flight.
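As an illustration of the kind of fast detection step described above, the sketch below thresholds a grayscale sea-surface frame and extracts connected bright regions as candidate vessels. It is a minimal stand-in for the project's segmentation and recognition algorithms, which are not published here; the function names and parameters are assumptions.

```python
import numpy as np
from collections import deque

def detect_blobs(frame, threshold, min_area=3):
    """Threshold a grayscale frame and return bounding boxes
    (row_min, col_min, row_max, col_max) of connected bright regions."""
    mask = frame > threshold
    seen = np.zeros_like(mask, dtype=bool)
    rows, cols = mask.shape
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not seen[r, c]:
                # BFS flood fill over 4-connected neighbours
                q, pix = deque([(r, c)]), []
                seen[r, c] = True
                while q:
                    y, x = q.popleft()
                    pix.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(pix) >= min_area:   # discard speckle below min_area
                    ys, xs = zip(*pix)
                    boxes.append((min(ys), min(xs), max(ys), max(xs)))
    return boxes

frame = np.zeros((8, 8))
frame[2:4, 3:6] = 1.0               # a 2x3 bright blob: a candidate vessel
print(detect_blobs(frame, 0.5))     # [(2, 3, 3, 5)]
```

A production pipeline would add sea-clutter suppression and feed each box to a classifier; the sketch only shows the segmentation stage.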

4. In the Information layer the applicable objectives were:
a) Develop and implement effective data fusion and focus-of-attention techniques. This concerns the fusion of multi-source sensor data. The focus was on developing advanced data fusion algorithms for effectively combining information collected from multiple sensors, which may be noisy and mutually contradictory. In particular, different sensory data have different resolutions; novel algorithms were developed to align the data of the different spectral bands and reconstruct high-resolution images for all bands. These algorithms were validated with datasets in the final demonstration.
b) Develop and implement an effective human-machine interface. The focus was on assisting global situation awareness and prompt decision making. The developed human-computer interface also enabled efficient human feedback, so that the intelligent system can learn from it to improve target/intrusion detection. This system was validated in simulation and in the final demonstration.
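The multi-resolution fusion in objective (a) above can be illustrated generically: a low-resolution band is upsampled to the high-resolution grid and the co-registered bands are combined with inverse-variance weights, so noisier sensors count for less. This is a textbook scheme, not the project's actual fusion algorithm; the names and noise model are assumptions.

```python
import numpy as np

def upsample(img, factor):
    """Nearest-neighbour upsampling onto a common high-resolution grid."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def fuse(bands):
    """Inverse-variance weighted fusion of co-registered bands.
    bands: list of (image, noise_sigma) pairs on the same grid."""
    w = np.array([1.0 / s ** 2 for _, s in bands])
    stack = np.stack([img for img, _ in bands])
    return np.tensordot(w / w.sum(), stack, axes=1)

hi = np.ones((4, 4))                     # high-res band, sigma = 1
lo = upsample(np.full((2, 2), 3.0), 2)   # low-res band aligned to the grid, sigma = 2
fused = fuse([(hi, 1.0), (lo, 2.0)])
print(fused[0, 0])   # (1.0*1 + 0.25*3) / 1.25 = 1.4
```

Real fusion must also handle mis-registration and contradictory detections; the weighting step above only shows how unequal sensor noise is reconciled.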

SUNNY demonstrated the objectives mentioned above at two levels:
1) By conducting a System Demonstration which allowed the testing and validation of the proposed system. Real-life use case scenarios were foreseen for each of the project objectives. These use case scenarios were defined by the end-user partners of the consortium. The scenario chosen was for detecting drug smuggling operations at sea. This was achieved in collaboration with sea border control end user partners, who were able to deliver the required testing facilities.
2) By System simulation, aimed to support the development and check the robustness of the innovations under controlled conditions to reduce the risk associated with flight testing and to enable the extrapolation of the real life testing to a wider set of conditions than those covered by the in-flight system demonstrations.

Beyond the technical objectives defined above, the SUNNY project
1) Ensured that the proposed solution was compliant with all privacy, security and ethical aspects as well as European societal values and citizens' rights.
As the proposed SUNNY solution was heavily geared towards surveillance activities, it was of vital importance that the technology did not infringe upon citizens' rights. For this purpose, the project put great emphasis on evaluating and monitoring legal and ethical aspects of privacy and societal values, supporting the acceptability of the system.
2) Promoted the solution for the adoption by civil agencies in order to ensure the pan-European uptake of the SUNNY system.
For the development of the optimum sensors and data links (including data fusion for enhanced situational awareness), the project also put special emphasis on critical factors which determine the uptake and adoption of the solution by civil agencies. These factors include miniaturisation, affordability, endurance, and adherence to mission timing, mission characteristics and operating procedures. Furthermore, the project aimed to promote the proposed technology to play a pivotal role in law enforcement and immigration control, which will also reinforce the uptake of the proposed solution.

Project Results:
Visualisation and User Interface
The Visualisation and user interface system has been successfully completed as part of the SUNNY project. A number of components have been completed:
• A 2D map visualisation showing all the information from the SUNNY system in one place
• A 3D representation of each UAV's pose
• A full VR mission control room.

2D Map Visualisation
The 2D map visualisation shows all the assets and SUNNY information in one place, for use on standard monitors in a mission control environment. The interface has been successfully developed and shows the following:
• A map of the operational area
• UAV positions
• Boat detections – with identification of threat
• Identified threats – with reasons
• Sensor information from the UAVs

Air traffic controllers were consulted and were allowed to use the system as part of the testing and validation. As a result, several new features requested by the air traffic controllers were added, including:
• Range circles centred on mission control at 1 nautical mile intervals
• The ability to switch between metric and imperial units
• The ability to see the track of the last 5 minutes of each UAV's path
• Prediction of position in 1 minute in the future
• Floating information alongside each UAV for quick information understanding
• Normalisation of all UAV data to the same altitude frame.
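The one-minute position prediction in the list above can be approximated with simple dead reckoning from ground speed and heading. The flat-earth sketch below is an assumption for illustration; the real system may use a different motion model.

```python
import math

def predict_position(lat, lon, speed_mps, heading_deg, dt_s=60.0):
    """Flat-earth dead reckoning: predict a UAV's position dt_s seconds
    ahead from its ground speed and heading (degrees clockwise from north)."""
    d = speed_mps * dt_s                           # metres travelled
    dn = d * math.cos(math.radians(heading_deg))   # north component
    de = d * math.sin(math.radians(heading_deg))   # east component
    dlat = dn / 111_320.0                          # metres per degree of latitude
    dlon = de / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# 30 m/s due east for one minute: ~1800 m east, latitude essentially unchanged
print(predict_position(35.5, 24.0, 30.0, 90.0))
```

Over a one-minute horizon the flat-earth error is negligible at surveillance speeds, which is why this simple model is adequate for a map overlay.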

The 3D visualisation of the UAV pose has been completed. It shows an accurate real-time representation of the full UAV pose; most notably, it gives a quick-to-read representation of roll, pitch and yaw.

3D Render of the Ouranos 3D pose representation

Finally, a full VR control room has been completed within a brand-new large-scale VR environment. This environment includes a virtual table with 3D models of the UAVs showing full information about the current state of the environment. It also provides virtualised screens giving access to large amounts of information, all within the VR environment and without the need for a large physical control room.

VR Control Room VR Headset for control room

UAV-UAV communication
The hardware solution, composed of an IEEE 802.11 transceiver at 700 MHz (DL749-78) to support the air-air data link, a communications CPU (ALIX-3D2) to provide the power supply and control interface, and a low-profile antenna at 700 MHz, was integrated into a light ABS housing.

Three different tests with DL749-78 transceivers were conducted to validate the communications module:
• laboratory tests
• ground-to-ground tests
• ground-to-UAV tests

UAV-Base station communication
A multi-hop scenario was investigated and simulated in the ns-3 software. This scenario assumes that several UAVs are communicating with each other and with the SUNNY Base Station (SBS). Within that scenario, two routing protocols were tested, namely OLSR (Optimized Link State Routing) and AODV (Ad hoc On-demand Distance Vector). The protocols were evaluated through simulations and conclusions were drawn about their performance. Moreover, topology control was investigated, that is, how the number of relay nodes affects the throughput of the data sent by a given node (UAV) to the SBS.
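The topology-control observation above can be illustrated with a back-of-envelope model: in a chain of half-duplex relays sharing one channel, each extra hop must retransmit the same data, so end-to-end throughput scales roughly as 1/hops. This first-order approximation is an assumption for illustration, not a substitute for the ns-3 results.

```python
def end_to_end_throughput(link_rate_mbps, n_hops):
    """Rough throughput of a relay chain on a shared half-duplex channel:
    every hop repeats the same payload, so usable rate ~ link_rate / n_hops."""
    return link_rate_mbps / n_hops

# A nominal 6 Mbit/s link degrades quickly as relays are added
for hops in (1, 2, 3, 4):
    print(hops, "hop(s):", end_to_end_throughput(6.0, hops), "Mbit/s")
```

This is why the number of relay nodes between a UAV and the SBS matters: routing protocols such as OLSR and AODV pick the path, but the path length itself bounds the achievable data rate.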
Moreover, hardware components were selected to build the communications module. The SUNNY custom add-on board was designed to extend the functionality of the base board by providing additional interface and power supply options and generally to support efficient integration.
Finally, the communication modules were integrated, tested and validated in an office and laboratory environment as well as in the open field. These results fed into our integration and evaluation work, where the communication subsystem was integrated with the UAVs and real-time flight testing with the on-board UAVs and the SUNNY base station was performed.

SUNNY Base station
The base station has been fully integrated and tested as part of the SUNNY system. This allows for all of the information in the SUNNY system to come together in order to be processed and shown to users of the system.

This work involved intensive collaboration between partners to bring all the different components together into a single system. Firstly, internal integration of the SBS components was achieved to bring the entire ground system together in one place; the following diagram shows all the components that have been integrated:

Integration and testing was performed between the SUNNY Base Station (SBS) and the UAV ground control stations (GCSs) to bring telemetry and UAV tracking data into the system; testing both on the ground and in the air was then carried out with each UAV to ensure correct transmission of all required data. After this, integration was performed with the SUNNY sensor network to bring sensor data (such as images and radar) into the SBS. Again, this was thoroughly tested both on the ground and in the air to ensure successful transmission of SUNNY sensor data. Finally, all the data was brought together in the SUNNY visualisation system, demonstrating successful and full integration across the entire SUNNY system.

Finally, a portable version of the SBS was integrated into a single case with connections made available for the external components of the system as shown in the photos below.

Base Station

Short Wave Infra-Red (SWIR)

Functional diagram of XSW core

The figure above shows the functional diagram of the XSW: a 640x512 InGaAs-based detector is used for SWIR imaging. This detector is cooled using a TE1 thermoelectric cooler. The ROIC of the detector is based on a CTIA which can be used in high-gain and low-gain mode. Furthermore, the ROIC can be used in ITR (integrate then read) or IWR (integrate while read) mode.
The output of the detector (after ADC conversion) is further processed in the FPGA of the XSW core. The first step is pixel correction, which corrects fixed-pattern non-uniformity and photo-response non-uniformity and replaces bad pixels. In the second step, the resulting image is enhanced using the following algorithms:
• Auto exposure: this algorithm is used to find the optimal exposure time
• Auto gain and offset: this algorithm stretches and/or shifts the image histogram
• Histogram Equalization: this algorithm is used to enhance the contrast in the image
These algorithms are configurable by the user via the serial control interface: different parameters can be modified in order to obtain optimal performance. In default mode, all algorithms are enabled.
In a final step the image is transmitted to the output PAL interface.
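The two processing stages described above (pixel correction, then enhancement) can be sketched generically as follows. This is an illustrative reimplementation, not the actual FPGA pipeline of the XSW; the helper names, the row-neighbour bad-pixel replacement and the simple CDF-based equalization are assumptions.

```python
import numpy as np

def pixel_correct(raw, gain, offset, bad_mask):
    """Fixed-pattern correction: per-pixel gain/offset, then replace bad
    pixels with the mean of their left/right row neighbours."""
    img = raw * gain + offset
    for r, c in zip(*np.nonzero(bad_mask)):
        left = img[r, max(c - 1, 0)]
        right = img[r, min(c + 1, img.shape[1] - 1)]
        img[r, c] = (left + right) / 2.0
    return img

def hist_equalize(img, levels=256):
    """Contrast enhancement: map grey levels through the normalized CDF."""
    q = np.clip(img, 0, levels - 1).astype(int)
    hist = np.bincount(q.ravel(), minlength=levels)
    cdf = hist.cumsum() / q.size
    return cdf[q] * (levels - 1)

raw = np.array([[10.0, 200.0, 30.0], [40.0, 50.0, 60.0]])
bad = np.zeros_like(raw, dtype=bool)
bad[0, 1] = True                     # one dead pixel to be replaced
corrected = pixel_correct(raw, gain=np.ones_like(raw),
                          offset=np.zeros_like(raw), bad_mask=bad)
print(corrected[0, 1])               # (10 + 30) / 2 = 20.0
```

The per-pixel gain and offset maps would in practice come from a calibrated non-uniformity correction (NUC) table stored in the camera.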
An overview of the XSW detector and sensor functionality specifications are given in the following tables.

S2.1 Parameter Specification Unit Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.1.01 Detector type InGaAs FPA; ROIC with CTIA topology (D)
S 2.1.02 Spectral Response 0.9-1.7 μm (T)
S 2.1.03 Number of pixels 640x512 (D)
S 2.1.04 Pixel pitch 20 μm (D)
S 2.1.05 Dark Current 0.8 x 10^6 electrons/s (T)
S 2.1.06 Integration Capacitor High Gain 2 fF (D)
S 2.1.07 Integration Capacitor Low Gain 85 fF (D)
S 2.1.08 Pixel operability > 99 % (T)
S 2.1.09 Cooling TE1 (D)
XSW Detector Specifications

S2.2 Parameter Specification Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.2.01 Frame rate 25Hz (D)
S 2.2.02 Exposure time range 1-40000 µs (A)
S 2.2.03 Gain (in High Gain mode) 1.28 electron/ADU (D)
Gain (in Low Gain mode) 16.2 electron/ADU (D)
S 2.2.04 Core Read Noise Low Gain 400 electrons rms (T)
Core Read Noise High Gain 120 electrons rms (T)
S 2.2.05 Readout mode Integrate Then Read (ITR)
Integrate While Read (IWR) (D)
S 2.2.06 On-board image processing o Imaging correction (fixed NUC for XSW-GigE, TrueNUC for all others),
o Auto-Gain and Offset
o Auto-Exposure (except for XSW-GigE)
o Histogram Equalization
o Trigger possibilities (A)
XSW Sensor Specifications

The optical lens parameters are specified in the following table
S 2.6 Spectral Band Specifications Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.6.01 SWIR (0.9-1.7μm) o Focal length: 25 mm
o F-number: 1.8 (D)
XSW Optical lens specifications

From these specifications, it was decided to use the SWIRECON 25mm lens together with the XSW. The specifications of this lens are summarized in the following table.

Parameter Specification
Manufacturer OpticoElectron
Product Number SWIRECON 25
Focal Length 25.0 mm
Aperture-based f-number f/1.80
Waveband 0.6 – 1.7 µm
Assembly Weight 92.5g
Dimensions Diameter 32mm; length 49mm
SWIR SWIRECON 25mm Lens Specifications

The electrical interface specifications of the XSW are summarized in the following tables.

S 2.7 Sensor / Gimbal Video Data output specification Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.7.01 XSW-640 SWIR PAL
Connector: BNC (D)
XSW Video data output specification

S 2.8 Sensor / Gimbal Power Supply Power Consumption Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.8.01 XSW-640 (SWIR) 12 V ± 10%
Hirose HR10-7R-4SA(73) 3(*) - 10(**)
(*) without TEC
(**) TEC at 100% (T)
XSW Power consumption specification

S 2.9 Sensor / Gimbal Sensor Control Specifications Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.9.01 XSW-640 (SWIR) serial control RS-232
Connector: RJ-12 (A)
XSW Sensor Control specification

XSW Mechanical Specifications

The mechanical specifications of the XSW are summarized in the following table.

S 2.10 Mechanical size/weight Specification Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.10.01 XSW size 45x45x68.1 mm3 (XSW-640-PAL) (D)
SWIR lens size 25 mm lens: ≤50 mm(length) / ≤42 mm (diameter) (D)
Total size SWIR 45x45x110 mm3 (D)
S 2.10.02 XSW weight ≤150 g (XSW-640-PAL) (D)
SWIR lens weight 25 mm lens: ≤100 g (D)
Total weight SWIR ≤300 g (XSW + optical front + lens) (D)
XSW Mechanical specifications

Mechanical drawing of the XSW (without lens)

Mechanical drawing of the XSW (with lens)

XSW Environmental Specifications
The environmental specifications of the XSW are summarized in the following table.
S 2.11 Environmental Parameter Specification Verification method
S 2.11.01 XSW Operating case temperature -40 to 70°C (T)
S 2.11.02 XSW Storage temperature -45 to 85°C (T)
XSW Environmental specifications

LWIR Sensor design overview

Functional diagram of XTM core

The figure above shows the functional diagram of the XTM core. A 640x480 bolometer detector with 17 μm pitch from ULIS is used in the XTM. After ADC conversion, the image data is further processed in the FPGA of the XTM. The first step is pixel correction, which corrects fixed-pattern non-uniformity and photo-response non-uniformity and replaces bad pixels. In the second step, the resulting image is enhanced using the following algorithms:
• Auto gain and offset: this algorithm stretches and/or shifts the image histogram.
• XIE: a filter is applied in order to smooth or sharpen the image.
• Histogram Equalization: this algorithm is used to enhance the contrast in the image.
These algorithms are configurable by the user via the serial control interface of the module. Different parameters can be modified in order to obtain optimal performance. In the default mode all algorithms are enabled.

An overview of the XTM detector and sensor functionality specifications are given in the following tables.

S2.3 Parameter Specifications Unit Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.3.01 Detector Type a-Si micro-bolometer (D)
S 2.3.02 Spectral Response 8-14 μm (D)
S 2.3.03 Number of pixels 640x480 (D)
S 2.3.04 Pixel pitch 17 μm (D)
S 2.3.05 NETD 50 mK (@ 30°C with F/1 lens, typical value) (T)
S 2.3.06 Pixel Operability > 99 % (T)
S 2.3.07 Cooling No (D)
XTM Detector Specifications

S2.4 Parameter Specifications Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.4.01 Frame rate 25 Hz (D)
S 2.4.02 Shutter Yes (D)
S 2.4.03 Exposure time range 1 – 80 µs (A)
S 2.4.04 On-board image processing o NUC
o Auto-Gain and Offset
o XIE (sharpening or smoothing)
o Histogram Equalization (A)
XTM Sensor Specifications

The optical lens parameters are specified in the following table.

S 2.6 Spectral Band Optical lens Specifications Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.6.02 LWIR o Focal length: 25 mm
o F-number: 1.2 (D)

Optical lens specifications

From these specifications, it was decided to use the GASIR FL25mm lens together with the XTM. The specifications of this lens are summarized in the following table.

Parameter Specification
Manufacturer Umicore
Product Number 12026_100
Focal Length 25.0 mm
Aperture-based f-number f/1.20
Waveband 8-12 μm
Assembly Weight 40g
Dimensions Diameter 31.5mm; length 24mm
LWIR Gasir FL25mm Lens Specifications

The electrical interface specifications of the XTM are summarized in the following tables

S 2.7 Sensor Video Data output specification Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.7.02 XTM-640 LWIR PAL (D)
XTM Video data output specification

S 2.8 Sensor Power Supply [V] Power Consumption [W] Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.8.02 XTM-640 (LWIR) 12 2.3 (T)
XTM Power consumption specification

S 2.9 Sensor Sensor Control Specifications Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.9.02 XTM-640 (LWIR) serial control RS-232 (A)
XTM Sensor Control specification

The mechanical specifications of the XTM are summarized in the following table.

S 2.10 Mechanical size/weight Specification Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.10.03 XTM size 45x45x44.6 mm3 (XTM-640-PAL) (D)
LWIR lens size 25 mm lens
e.g.: 24 mm(length) / 31.5mm (diameter) (D)
Total size LWIR 45x45x80 mm3 (D)
S 2.10.04 XTM weight 99 g (XTM-640-PAL) (D)
LWIR lens weight 25 mm lens: 40 g (D)
Total weight LWIR 149 g (XTM + optical front + lens) (D)
XTM Mechanical specifications

The mechanical drawings with and without lens are shown in the following figures.

Mechanical drawing of the XTM (without lens)

Mechanical drawing of the XTM (with lens)

The environmental specifications of the XTM are summarized in the following table.
S 2.11 Environmental Parameter Specification Verification method
D = Design ; T=Test ;
A =Analysis or Demo
S 2.11.03 XTM Ambient operating temperature -40 to 60°C (T)
S 2.11.04 XTM Storage temperature -45 to 85°C (T)
XTM Environmental specifications

Hyper spectral
Three hyperspectral sensor options were presented and assessed in the specification phase. The main criteria for the hyperspectral sensor selection for the SUNNY demonstration mission were:
• Type of targets to be detected, and
• The payload capacity of the UAV available to carry the hyperspectral sensor in the demonstration mission.
Two of the specified sensor options met the payload capacity constraint. One covers the VNIR spectral region of 400-1000 nm and the second the eNIR region of 600-1640 nm. Both sensor systems were developed in the SUNNY project. The systems are otherwise similar, except that the VNIR system employs a CMOS detector array and the eNIR system a visInGaAs detector array. The optics of each sensor is also optimized for its respective spectral region.
As a result of successful development and testing, SPECIM is in the process of making both sensor systems commercially available. The VNIR system is commercialized as the AisaKESTREL10 system and the eNIR system as the AisaKESTREL16. These commercial names are also used in this report when referring to the two systems.
The spectral coverage of KESTREL16 provides a higher capability to distinguish and identify the specified targets of interest in the SUNNY mission scenarios, based on the targets' chemical characteristics. In particular, this refers to the potential detection of various boat materials, such as wood, plastics, rubber and metal, based on their different spectral signatures in the KESTREL16 region. Therefore the AisaKESTREL16 system was chosen as the hyperspectral payload on the SUNNY Tier 2 UAV.
AisaKESTREL16 works in push-broom mode. It acquires one line image on the target surface at a time, in such a way that the full spectral data is collected for each pixel in the line exactly simultaneously. This requires constant movement of the camera during image acquisition. In a typical mission the push-broom imager makes an overpass flight across the area of interest and collects the image data by looking straight down.
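The push-broom acquisition principle can be sketched as stacking successive cross-track line frames into a cube while the aircraft moves forward. `read_line` below is a hypothetical stand-in for the camera interface, not SPECIM's actual API.

```python
import numpy as np

def acquire_pushbroom(read_line, n_lines):
    """Assemble a hyperspectral cube from a push-broom sensor: each call to
    read_line() returns one cross-track line of shape (pixels, bands); the
    along-track dimension is built up by the aircraft's forward motion."""
    return np.stack([read_line() for _ in range(n_lines)])

# Toy stand-in for the camera: 640 cross-track pixels, 4 spectral bands
rng = np.random.default_rng(0)
cube = acquire_pushbroom(lambda: rng.random((640, 4)), n_lines=100)
print(cube.shape)   # (100, 640, 4) -> lines x pixels x bands
```

Because each line is time-stamped, the GNSS/IMU record lets every line (and hence every pixel) be georeferenced after the flight.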

The following figure shows the diagram of the SUNNY hyperspectral system.

Hyperspectral system diagram

The onboard system consists of the AisaKESTREL16 camera, a GNSS/IMU sensor, a data acquisition and control electronics unit (DPU), a real-time processing unit, and a wireless link.

The GNSS/IMU sensor is mounted so that it precisely follows the movements of the hyperspectral camera.

The onboard data acquisition and control electronics unit (DPU) consists of the following components:
- Computer motherboard (PC),
- SSD on a swappable slot,
- Framelink Express frame grabber (Camera Link),
- SPECIM Control Board (SCB), and
- Power regulator.

The CameraLink interface was chosen for data acquisition as it guarantees sufficient transfer capacity even at the maximum data rates from the camera.

Data is stored on an SSD, the only hard drive option that provides the required speed, reliability and robustness. The SSD is exchangeable via a carriage slot. The storage capacity is 480 GB, which is sufficient for approximately 2.5 hours of data collection at the highest image rate.
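The quoted recording time can be sanity-checked from the system specifications given later (640 spatial pixels, 2.75 nm sampling over 600-1640 nm giving roughly 378 bands, 14-bit samples stored in 2 bytes, up to 100 lines/s). The band count and sustained frame rate are derived assumptions, not figures stated in this paragraph.

```python
# Back-of-envelope data rate and recording time for the hyperspectral payload
pixels = 640            # cross-track spatial pixels
bands = 378             # ~ (1640 - 600) nm / 2.75 nm sampling
bytes_per_sample = 2    # 14-bit data stored in 16-bit words
frame_rate = 100        # lines per second (maximum)

rate = pixels * bands * bytes_per_sample * frame_rate   # bytes per second
hours = 480e9 / rate / 3600                             # 480 GB SSD
print(round(rate / 1e6, 1), "MB/s ->", round(hours, 2), "hours")
```

This lands at roughly 48 MB/s and about 2.8 hours, consistent with the "approximately 2.5 hours" quoted above once file-system overhead is allowed for.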

The SCB handles
• Synchronization between image data and GNSS/IMU data acquisition with 0.1 ms accuracy, and
• Shutter control in the hyperspectral camera.
The DC converter (regulator) has an input range of 12-19 V DC. This makes it possible to power the hyperspectral system from various (unstable) sources, such as a battery.
For the GNSS/IMU sensor, a new sensor, the xNAV500 from Oxford Technical Solutions in the UK, was chosen. It is the first lightweight (380 g) GNSS/IMU sensor on the commercial market that provides nearly the same high accuracy as previous sensors in the 2-3 kg range. With a differential GPS service available, the xNAV500 can provide 90 cm position accuracy from 1000 m altitude with a single antenna.

The payload DPU is connected to the onboard radio link through an Ethernet port.
The following SPECIM software is installed on the DPU:
• Lumo Recorder software to control the camera parameters and data acquisition and recording from the camera.
• DAQServer software is a connection manager software for the GroundStation application.
Each data capture session, such as a flight line, creates a dataset on the system computer's SSD. The dataset consists of a Capture folder and a Metadata folder. The raw image data is saved in a file in the Capture folder in 16-bit BIL format (the so-called ENVI format), together with an ENVI header file. The header file contains the information needed to read the image data file. The structure of the header file is described in the KESTREL system manual. GNSS/IMU data is collected in its own file in the Capture folder. No processing is done on the data before storage in the DPU.
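A minimal reader for the 16-bit BIL layout described above (band-interleaved by line: each line stores all bands before the next line begins). In practice the dimensions and data type would be parsed from the ENVI header file; here they are passed in directly, and the file path and round-trip demo are purely illustrative.

```python
import os
import tempfile
import numpy as np

def read_envi_bil(data_path, lines, samples, bands, dtype=np.uint16):
    """Read 16-bit band-interleaved-by-line (BIL) data: the file is laid
    out as (lines, bands, samples); reorder to (lines, samples, bands)."""
    raw = np.fromfile(data_path, dtype=dtype)
    return raw.reshape(lines, bands, samples).transpose(0, 2, 1)

# Round-trip check with a tiny synthetic cube (lines x samples x bands)
cube = np.arange(2 * 3 * 4, dtype=np.uint16).reshape(2, 3, 4)
path = os.path.join(tempfile.gettempdir(), "sunny_demo.bil")
np.ascontiguousarray(cube.transpose(0, 2, 1)).tofile(path)   # write in BIL order
print(np.array_equal(read_envi_bil(path, 2, 3, 4), cube))    # True
```

The BIL interleave suits push-broom sensors because each acquired line can be appended to the file as soon as it is read out, with no buffering of whole bands.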
In addition to the data acquisition software Lumo Recorder, which runs on the system DPU, SPECIM developed another software tool, GroundStation (GS), to operate the Lumo Recorder on the onboard DPU from a ground computer. GS communicates with the onboard DPU through the connection manager software, DAQServer, installed on the payload DPU. The ground computer on which GS is installed must have a network adapter port with the TCP/IP protocol installed. GS is intended for operating and testing the hyperspectral system as a stand-alone payload. It has no role in the integrated SUNNY system, other than that the same control functionalities are also available in the remote control DLL.

The GS software features include:
• Control of camera parameters (integration time, frame rate, binning).
• Control of start and stop of data recording, either manually or automatically based on a pre-defined flight plan. Data recording typically starts within 0.1 s of the start command, and in the worst case within no more than 1 s.
• Reception of diagnostic data from the camera system:
o Position and attitude data, GPS time and diagnostic information about the GNSS/IMU performance,
o Waterfall display image from Lumo Recorder software as a jpg image. Currently this image is not associated with positional data, but that feature can be added if required.
o Spectral profile in a single image pixel.
• Flight plan maps, with defined start and end points for automated data recording, can be created in Google Earth and uploaded to GS. GS displays the progress of the flight and data collection based on the GPS position data received from the aircraft.
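The automated start/stop of recording at pre-defined flight-plan points could be approximated as below. This is a hedged sketch, not SPECIM's actual implementation: the 50 m trigger radius and the simple equirectangular distance test are assumptions for illustration.

```python
import math

def within_m(lat1, lon1, lat2, lon2, threshold_m):
    """Rough equirectangular distance test, adequate for the short
    distances involved in a recording trigger."""
    r = 6371000.0  # mean Earth radius in metres
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return r * math.hypot(x, y) <= threshold_m

def recording_state(recording, pos, start, end, threshold_m=50.0):
    """Start recording when the aircraft reaches the flight-line start
    point, stop when it reaches the end point; otherwise keep state."""
    lat, lon = pos
    if not recording and within_m(lat, lon, *start, threshold_m):
        return True
    if recording and within_m(lat, lon, *end, threshold_m):
        return False
    return recording
```

Calling this on each GPS fix received from the aircraft yields the desired start/stop behaviour along a flight line.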

The payload system shall be powered before the launch of the aircraft. The payload DPU has been configured so that both the LUMO Recorder and DAQServer software automatically start at power-up and are ready to communicate with GroundStation.

The following tables summarize the optical, electrical, mechanical and environmental specifications of the AisaKESTREL16 hyperspectral system.
S 3.1 Parameter AisaKESTREL16
S 3.1.01 Spectral Range 600-1640nm
S 3.1.02 Spectral Sampling 2.75 nm
S 3.1.03 F/# F2.4
S 3.1.04 Smile/Keystone <0.25 / <0.5 pixels
S 3.1.05 Polarization sensitivity <±2%
S 3.1.06 SNR (peak) 800
S 3.1.07 Spatial resolution 640 pixels
S 3.1.08 Frame rate Up to 100 Hz
S 3.1.09 Integration time Adjustable within frame period
S 3.1.10 FOV 40 degrees
S 3.1.11 Electro-mechanical shutter Yes
Optical Specifications

S 3.2 Parameter AisaKESTREL16
S 3.2.01 Data interface Cameralink 14-bit
S 3.2.02 Remote control interface Ethernet
S 3.2.03 Data streaming interface Ethernet
S 3.2.04 Input supply voltage range 12 – 19 V DC
Power consumption
S 3.2.06 AisaKESTREL16 sensor max 46 W
S 3.2.07 DPU max 32 W
S 3.2.08 GNSS/IMU sensor max 6 W
S 3.2.10 Link max 1 W
Electrical Interface Specifications

Main Size & Weight specifications are given in the table below.
S 3.3 Hardware | Push Broom Hyperspectral Camera (AisaKESTREL16) | GPS/IMU | PC and Control Electronics (DPU)
S 3.3.01 Weight | 2.3 kg | 0.4 kg | 2 kg with cables
S 3.3.02 Dimensions | 99 x 215 x 240 mm with front lens | 120 x 70 x 40 mm | 164 x 154 x <101 mm
S 3.3.03 Housing | Al | Al | Al
S 3.3.04 Adjustments by user | Front lens focus | - | -
Mechanical Specifications of AisaKESTREL16

Both the AisaKESTREL10 and AisaKESTREL16 systems were designed according to the RTCA DO-160G standard, with respect to the conditions specified in the following table.
S 3.4 Conditions Section Description of Test Conducted Method
S 3.4.01 Low temperature 4.5.2 Category A4, operating +5°C, ground survival -15°C Test (exception to the standard: operational low temperature)
S 3.4.02 High temperature 4.5.4 Category A4, operating +40°C, ground survival +40°C Test
S 3.4.03 Altitude 4.6.1 Category A4, 15 000 ft / 4600 m Analysis
S 3.4.04 Overpressure 4.6.3 -15 000 ft (170 kPa) Analysis
S 3.4.05 Temperature variation 5.0 Category C Test
S 3.4.06 Humidity 6.0 Category A standard humidity environment, max 95% relative humidity, non-condensing Analysis
S 3.4.07 Operational shock and crash safety 7.0 Category B; operational: terminal peak sawtooth 6 g / 11 ms; crash safety impulse: terminal peak sawtooth 20 g / 11 ms; crash safety sustained: static load equivalent of 20 g in every direction. Test (the item under test does not have to remain functional in the crash hazard test; a dummy can be used)
S 3.4.08 Vibration 8.0 Category S, fixed-wing reciprocating aircraft below 5700 kg (12 500 lbs), zone 1 (fuselage), vibration curve M; Category U2, helicopters (reciprocating and turbojet engines), zone 1, vibration curves F and F1. Test (xNAV500 operational vibration condition is limited to 1 g RMS)
Environmental Specifications of AisaKESTREL16

Radar System Realization And Installation
MetaSensing developed the MiniSAR system as a compact radar sensor that combines high-quality radar performance with flexible system characteristics. It is the result of several years of MetaSensing's experience in creating and developing SAR system solutions. The sensor operates at X-band and can be mounted on a wide range of moving platforms, including the ANTEX, the platform used for the demonstration of the SUNNY concept. High Resolution Imaging (HRI) and Moving Target Indication (MTI) capabilities permit all-weather airborne observation and surveillance. The following figure shows a typical example of a high-resolution radar image acquired by MetaSensing with a previous version of its X-band airborne radar sensor.

Example of SAR image acquired by MetaSensing's MiniSAR.

In the following diagram the black arrows indicate control and data connections, the blue arrows represent RF connections and the red lines show the DC power distribution lines.

The MiniSAR PCU is the brain of the radar system. It takes care of controlling and synchronizing the operation of all sub-systems, generating the waveform, processing the data, storing results and providing the outputs. The main constituents of the control unit are a processor computer, an arbitrary waveform generator (AWG) and an analog-to-digital converter (ADC).

To comply with the payload specification of the ANTEX, the lighter of the two possible SAR PCU enclosure solutions (standard vs. ruggedized), i.e. the standard one, was chosen. The main physical characteristics of the MiniSAR PCU are summarized in the following table.

Parameter Value
Weight 2.9 kg
Height 80 mm
Depth 250 mm
Width 200 mm
Physical characteristics of the standard MiniSAR PCU

The radar front-end (RFE) is the radio frequency module of the radar, taking care of transmitting and receiving microwave signals. It accepts the IF waveform as input from the MiniSAR PCU and up-converts it to the desired operating frequency (9.6 GHz) before sending it to the connected RF antenna. The radar echoes received by the antenna are conveyed back to the RF unit and down-converted: the resulting IF signal is delivered to the SAR control unit for digitization, possible storage and, above all, target detection and image cropping processing.
A photograph of the actual RFE is displayed in the following figure and its physical specifications are given in the following table. Note that this module is a more powerful version (5 W HPA) than the one originally designed (2 W HPA). A more powerful amplifier module allows a higher SNR, translating into improved detection of smaller targets, also at longer ranges.
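As a rough check of the benefit of the larger amplifier, the SNR improvement follows directly from the transmit-power ratio, since received echo power (and hence SNR, all else equal) scales linearly with transmitted power. This is a simplified view that ignores any other link-budget changes between the two HPA versions:

```python
import math

def snr_gain_db(p_new_w, p_old_w):
    """Transmit-power ratio in dB; under the linear-scaling assumption
    this equals the SNR improvement of the new HPA over the old one."""
    return 10.0 * math.log10(p_new_w / p_old_w)

# 5 W HPA versus the originally designed 2 W HPA:
# snr_gain_db(5, 2) is about 4 dB
```

An approximately 4 dB gain is consistent with the improved detection of smaller targets at longer ranges mentioned above.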

The actual MiniSAR RFE installed in the ANTEX

Parameter Value
Weight 4 kg
Height 65 mm
Depth 200 mm
Width 180 mm
Physical characteristics of the MiniSAR RFE for the SUNNY demonstrator. Dimensions do not include connectors and cabling

The GPS-IMU is used by the radar to derive the position of the sensor and the attitude of the aircraft when emitting its waveforms. The MiniSAR sensor is equipped with its own GPS-IMU, shown in the following figure. The physical characteristics of the GPS-IMU are given in the following table. A dedicated firmware interfaces the SAR processing and control unit with the navigation unit.

The actual MiniSAR navigation unit installed in the ANTEX

A GPS antenna is installed on the roof of the ANTEX to receive GPS satellite signals. The one used by the MiniSAR is shown in the following figure. It is an active antenna designed to operate at the GPS L1 and L2 frequencies, 1575.42 and 1227.60 MHz, and across the L-band from 1525 to 1560 MHz. The GPS receiver provides the necessary power through the antenna RF connector. A coaxial cable with a male TNC connector is used. The antenna is aircraft-certified for navigation.

The actual MiniSAR GPS antenna installed in the ANTEX

Parameter Value
Weight (including GPS antenna) 2.4 kg
Height 89 mm
Depth 127 mm
Width 106 mm
Physical characteristics of the MiniSAR GPS-IMU

The MiniSAR RF antenna is a flat panel based on microstrip array technology, which offers good electromagnetic performance with limited dimensions and weight. An image of the radar RF antenna is provided in the following figure. The physical characteristics and performance of the antenna are described in the following table.

The MiniSAR RF antenna installed in the ANTEX

Parameter Value
Frequency 9.5 – 9.7 GHz
Gain 26 dBi
Azimuth Half Power Beamwidth (HPBW) 4.4 deg
Elevation HPBW 10.6 deg
Sidelobe level -13 dB
Polarization V
Dimensions 450 x 225 x 15 mm
Connector N female
Weight 1.3 kg
Physical characteristics of the MiniSAR RFA
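For context, the 4.4-degree azimuth beamwidth fixes the cross-range footprint that the real antenna illuminates at a given range; SAR processing then sharpens the resolution far below this footprint by synthesizing a much longer aperture along the flight path. A small sketch of the footprint calculation (the 5 km example range is an assumption for illustration, not a figure from the project):

```python
import math

def azimuth_footprint_m(range_m, hpbw_deg):
    """Cross-range extent illuminated by the real beam at a given slant
    range, using the small-angle approximation footprint = R * theta."""
    return range_m * math.radians(hpbw_deg)

# With the 4.4 deg HPBW of the MiniSAR RFA at an assumed 5 km range,
# the real beam illuminates a cross-range strip of roughly 380 m.
```

This illuminated strip is what bounds the synthetic aperture length available for azimuth compression.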

The pan-and-tilt unit (PTU), displayed in the following figure, is a device that corrects for unwanted aircraft attitude variations, according to the navigation data coming from the GPS-IMU and elaborated in real time by the MiniSAR PCU. In this way, the desired area can be imaged with high precision and reliability, making sure that the illuminated scene is the intended one and is not affected by the unstable motion of the mounting platform. Additionally, the PTU is the element by which antenna scanning is performed, so that the area to be monitored can be chosen according to the user requirements.
The connector on the base of the PTU is used to power and control the unit. The connector halfway up the PTU, visible in the figure, is left unused.

The MiniSAR PTU with flat panel microstrip RFA mounted on it

The key parameters of the Pan-Tilt Unit are summarized in the following table.

Parameter Value
Weight 4.6 kg
Height 200 mm
Depth 180 mm
Width 450 mm
Rotating radius 620 mm
Physical characteristics of the MiniSAR PTU.

Potential Impact:
The SUNNY project was carefully formulated to have widespread technical, social, economic and policy impacts to the EU:
• Technical: SUNNY developed groundbreaking sensor technology that can be interfaced with existing border control systems already employed widely within the EU. By combining arrays of multiple sensors on multiple UAVs, many of the limitations of the existing state-of-the-art in border monitoring can be overcome. The performance of the system was quantified in real-life scenarios by our end-users to give a firm base for future technical developments.
• Social: Border control is required for the correct functioning of the EU and indeed for its neighboring states. However, existing systems are ineffective due to the long lengths of the border and the fact that a large proportion of it is over the sea or in inhospitable terrain. The loss of life (over 17,000 since 1990) of immigrants attempting to enter the EU has a pronounced social cost to society. The EU employment market, especially in times of economic stress, needs to be seen to be fair to the EU citizen if the free market is to survive. Clearly this involves controls at borders to prevent illegal entry for work.
• Economic: Europe has a significant industry working in the sensor and UAV markets, and security and area management offer a huge marketplace for European business; our industrial partners are all major EU world-players in these markets. However, sensor hardware is increasingly being provided by cheap imports from China and India. In order to compete, EU companies need a technology advantage. SUNNY provides higher 'added value' to existing security systems, allowing the development of new multi-sensor products and giving European producers a substantial advantage in the security marketplace.
• Policy: In recent years, it has become increasingly clear to the European Council that Europe needs a transnational security policy alongside national security policies where the assets to be protected are of European importance. The technology that SUNNY developed will be a critical input into workable security solutions and is timely with respect to these key EU policy priorities.

The primary technical impacts of this project will be on UAVs and integrated sensor systems. However, there are significant spin-offs into other areas.

UAV Integrated Sensor Systems: This project has developed innovative and critical underpinning technologies that will take the European UAV civil application industry to the next level. The global UAV industry is currently dominated by the United States, and Europe is also increasingly lagging behind the emerging Asia-Pacific industry. The development, integration and demonstration of the SUNNY system provide a timely boost not only for the industry partners of the consortium but also for European companies beyond the consortium to compete with the rest of the world.

Because UAVs are not burdened with the physiological limitations of human pilots, they can be designed for maximized on-station times. Indeed, solar-cell-powered UAVs have repeatedly set endurance records for heavier-than-air flight.

The issue now is not the UAV technology itself, but the autonomous operation of the aircraft, since manual operation via radio links from the ground is not feasible over such flight times. This requires the availability of on-board sensors (radar, IR, visual) of the kind that SUNNY has developed. Furthermore, the autonomous data integration and decision software is ideal for guiding UAVs in their long-term missions.

Furthermore, this project introduces the use of commercial standards and digital broadcasting techniques, improving the communication and coordination between several UAVs and the corresponding base stations. A further contribution was the development of an efficient antenna solution for this type of mission.

Spin-Off Technologies: Other technical advantages lie mainly in matured technical know-how that has raised the Technology Readiness Levels (TRLs) of several heterogeneous technologies, such as data analysis, on-board real-time data processing, video sensor network technologies, image recognition, data fusion, and fast (near real-time) detection and tracking of vessels.

Technology Impact in the Longer Term: SUNNY was developed as an open platform; a number of dissemination events were and will be run, and papers published in scientific journals, to keep other researchers in the EU informed of these technology developments.
The outcome from this project will greatly assist in the detection, identification and interception of those attempting to enter the EU illegally. In particular, we see added value in the following areas of paramount importance to the EU:
1. Reduce the number of illegal immigrants who enter the EU border undetected and subsequently incur great social and economic cost.
2. Prevent cross-border crimes such as terrorism, human trafficking, drug smuggling and illicit arms trafficking.

Unmanned Aerial Vehicles (UAVs) with airborne sensors will improve border security management, reducing dependence on tools that have not, on their own, proven effective (human agents, border control systems, video cameras, ground sensors, physical barriers, land vehicles or manned aircraft), while achieving a decrease in the illegal incursions that cause tremendous social and economic problems.
The SUNNY project proposes innovative sensor solutions for airborne surveillance of maritime borders for the enhancement of EU border security and defence against incursions and illegal migration. Furthermore, as part of WP1, the SUNNY project considered the impact of border surveillance activities from societal and ethical perspectives. These solutions will help the different agencies to share and combine information and resources, which in turn will help them to act in concert rather than as independent organizations. At the same time, the sharing and combination of information will help all border security teams to build a more complete picture of the situation from the fragments of data available to them. As a result, they will be better able to coordinate and distribute resources for border surveillance, increasing the efficiency and effectiveness of these operations. All of that translates into a more secure Europe.
Unmanned Aircraft Systems (UAS) have up to now been used primarily in the military domain, and this use has increased tremendously in the last decade. This project contributes to the introduction of Unmanned Aerial Vehicles in the civil domain, enabling improvements in the effectiveness of civil applications such as surveillance and monitoring of assets, currently performed by manned aircraft and other, less effective methods at much higher cost.

The three largest markets for UAV sensors are expected to be: visual cameras for viewing live footage; Electro-Optic/Infra-Red (EO/IR) for day-and-night surveillance and thermal imaging; and RADAR, usually of two types, Synthetic Aperture Radar (SAR) for large-area scans and/or Moving Target Indication (MTI) RADAR. The evolution of UAV remote sensor requirements stems from changes in military and homeland security missions. In the military market, new capability requirements are arising from changes in military doctrine: the shift to net-centric warfare (NCW), the rising prevalence of low-intensity conflicts (LIC), the need for persistence and stand-off engagement, and reduced tolerance for loss of life. UAVs typically perform better in these environments and can deliver more than legacy manned assets. For example, conflicts in urban environments increase the need to minimise collateral damage and the need for positive identification to discern combatants from civilians. This has become a priority for forces, driving the need for UAVs equipped with better remote sensors, such as visual cameras and EO sensors.

On the civil side, increasing homeland security means an increased demand for maritime and coastal surveillance operations, driving the need for RADAR systems. Increased demand for border monitoring drives the need for EO payloads, both for medium-altitude long-endurance (MALE) and for tactical UAVs. Synthetic Aperture RADAR (SAR) is one of the fastest-changing market segments: SAR systems that until recently could only be mounted on aircraft such as the Boeing 707 are being miniaturised and installed on tactical UAVs (TUAVs). In fact, EADS has developed mini SARs weighing as little as 4 kg. Nevertheless, SAR systems are still too expensive to be widely used in civilian applications and are therefore referred to as 'sensors for the rich'. SUNNY offers a way to drive down the costs of these systems; with its emphasis on commercial standards and digital broadcasting techniques, SUNNY should enable these cost reductions to be achieved.

It is important to note that SUNNY sensor deliverables can also be useful in a lot of other domains apart from UAVs. These include:
• Advanced multi-band and hyperspectral systems, especially compact and robust ones, are extremely useful in:
o Remote sensing
o Vegetation and geological analysis (mining)
o Sorting applications in the food industry and recycling
• Image fusion systems can be used for:
o Critical infrastructure protection (gas terminals, power plants etc.)
o Disaster prevention and monitoring (forest, city fires, industrial plants)
o People and traffic monitoring
• Miniature cameras and SAR systems with low power and low weight are extremely useful for process monitoring and in machine vision applications. Applications include:
o Autonomous manufacturing systems
o Automatic vehicle driving systems
o Security in-building
o Flight planning, autonomous landing systems, etc.

Efficient dissemination is considered a fundamental activity of the project, since dissemination activities contribute to the success of the project in short and long term.
SUNNY dissemination activities were mainly focused on:
• Press releases
• Producing dissemination material
• Dissemination to academic and industry community
• Attendance at events where SUNNY goals, design principles and results were presented.
• Direct contacts with potentially interested (international and national) institutions
• Animation
• Video

The SUNNY Consortium, DG ENTR and DG Home collaborated on a press release for the project.
Amongst others, the press release was reported in the following publications:
1. Defense File
2. Cargo Security Intelligence, Volume 1, Number 1, March 2014
3. Homeland Security
4. Professional Security
5. GeoConnexion

The consortium produced an information leaflet on the project, which was reviewed by DG ENTR and DG Home. The leaflet was available to all members of the consortium and was distributed at events they attended.

SUNNY has produced an official overview PowerPoint presentation to be used wherever the consortium presented the project. This presentation was continuously updated as the project progressed.
The Project has produced several conference and journal papers.
Early in the project an animation was produced to communicate the SUNNY concept.

During the final phase of the project a professional project video was made to illustrate the results of our work.
One proof of the success of SUNNY is that the users of the results are planning to use the system in real operations after the completion of the project. Furthermore, all participants aim to commercially exploit the project results.

● BMT's Defence Division will exploit the full SUNNY Base Station System, and BMT Reality Studios will directly exploit the VR aspects of the system to create a fully VR mission control room for multiple applications, including border security, asset management and environmental monitoring.

● ALTUS-LSA aims to exploit multi-layer UAV solutions in the border security management domain, as well as in other domains such as the detection and monitoring of environmental issues (we are already quite active in the Air Emissions Monitoring), precision agriculture (also active) and emergency/crisis management.

● Leonardo Aircraft will exploit the definition of the interfaces to be used for accessing the various actors/platforms involved, and will use the PC2Lab (Product Capability and Concept Lab) simulation laboratory, aimed at defining and validating concepts for aeronautical systems, thereby creating services for capability demonstration.

● MetaSensing enhanced the MetaSAR-X sensor, making it ready for use by the governmental agencies responsible for maritime patrolling and surveillance, as well as for Search and Rescue activities. The sensor will also be promoted to UAV manufacturers, so that they can offer a sensor suite with their integrated platform to their clients.

● Xenics will exploit the thermal/infrared sensors developed in SUNNY.

● SPECIM developed the hyperspectral sensor system AisaKESTREL in two versions, AisaKESTREL10 and AisaKESTREL16. SPECIM is marketing and selling these to research institutes, universities and UAV suppliers worldwide. Approximately 10 systems have been delivered to customers before the completion of the SUNNY Project.

● INESC TEC is a research institute looking for industrial partners and new projects to exploit the SUNNY results: (i) A closed on-board data processing solution; (ii) A gimbal software tracking solution; (iii) Real-time hyperspectral data processing software; (iv) A robust wireless mesh networking solution; (v) A multi-hop routing solution; and (vi) Knowledge gained through extensive hands-on experience during the integration of the SUNNY communications solution in all UAV platforms.

● Vitrociset developed an integrated capability linked to sensor payload, sensor processing and data fusion, situational awareness and decision-making. The company has engaged in projects using the SUNNY results to develop a “swarming” concept, thereby enabling the cooperative behaviour of a team of mini/micro UAVs for managing complex missions in high dynamic and unpredictable environments. These will be exploited both commercially and in new projects.

● CNIT developed the ISAR software for a better refocusing of moving targets that appear defocused in SAR images. This paves the way to a more efficient imaging of targets, thus leading to potential advantages in the field of target classification and recognition. Coast guards and other maritime border authorities, as well as First Responders, are target markets for these results.

● Tecnalia developed three pre-commercial products in SUNNY: (i) An enhanced SCRAB-II UAV platform; (ii) A communication system embedded into UAV platforms; and (iii) A gimbal for two cameras and a retraction system. Finalisation of the products will take place in 2018, with the first commercial contracts expected in 2019.

● TTI has developed a communications platform (hardware and software) orientated to work on UAVs, hence providing a highly robust solution. TTI’s commercial department has also actively developed networking and sales actions tailored to appeal to UAV manufacturers, UAV service providers and end-users with special communications requirements.

● NCSRD has primarily been involved in the telecommunications part of SUNNY, specifically establishing communication among UAVs, as well as between UAVs and the SUNNY Base Station. NCSRD, being an R&D institution, will exploit the knowledge gained in SUNNY to acquire new projects.

● CINAV is a potential user of the SUNNY System, with the project results demonstrating a huge potential to be integrated as part of the existing legacy systems and to improve the current operational system with an increased use of UAVs.

● KEMEA’s ambition is to have the SBS (Sunny Base Station) prototypes up and running for a relevant period of time after the conclusion of the pilots, so that results obtained in a real operational environment can be reported back to relevant First Responder networks and the consortium via KEMEA.

List of Websites:

Luke Speller
Lead Senior Scientist
BMT Group Ltd.
1 Waldegrave Rd.
TW11 8LZ