
Smart fisheries technologies for an efficient, compliant and environmentally friendly fishing sector

Periodic Reporting for period 3 - SMARTFISH (Smart fisheries technologies for an efficient, compliant and environmentally friendly fishing sector)

Reporting period: 2021-01-01 to 2021-12-31

The goal of the SMARTFISH project is to develop, test and promote a number of high-tech systems that will optimize resource efficiency, improve automatic data collection, provide evidence of compliance with fishery regulations and reduce the ecological impact of the sector on the marine environment. SMARTFISH exploits and further develops existing technological innovations in machine vision, camera technology, data processing, machine learning, artificial intelligence, big data analysis, smartphones/tablets, LED technology, acoustics and ROV technology.

The SMARTFISH systems, once fully implemented in the fishing sector and taken up by the industry, will 1) assist commercial fishers in making informed decisions during the pre-catch, catching and post-catch phases of the extraction process; 2) provide new data for stock assessment from commercial fishing and improve the quality and quantity of data that comes from traditional assessment surveys; and 3) permit the automatic collection of catch data to ensure compliance with fisheries management systems.
We developed a system for pre-catch size and species recognition for purse seine fisheries, based on optical and hydroacoustic technologies. We tested and identified the most suitable sensor types for the sensor system “SeinePreCog” and developed its housing. We also completed the development of acoustic algorithms for fish size estimation and species recognition, and tested the performance of a 3D camera for fish size estimation named “UTOFIA”. In addition, we completed testing of a size discrimination algorithm for anchovy by including acoustic and biological data from fishing trawls that targeted large anchovy. By doing this, we were able to fit an expanded target strength versus length relationship for anchovy sizes ranging from 4 cm up to 16 cm.
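To illustrate the kind of relationship fitted here, the sketch below performs a least-squares fit of a target strength (TS) versus length regression of the conventional log-linear form TS = m·log10(L) + b to paired acoustic and biological observations. This is a minimal sketch only: the data values are invented for demonstration, and the assumption that a model of exactly this form was fitted in the project is ours.

    # Illustrative sketch only: fits a log-linear target strength (TS) vs length
    # relationship of the form TS = m * log10(L) + b, the conventional model in
    # fisheries acoustics. The data points below are made up for demonstration.
    import numpy as np

    # Hypothetical paired observations: fish length (cm) and measured TS (dB)
    lengths_cm = np.array([4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0])
    ts_db = np.array([-62.1, -58.4, -55.9, -53.8, -52.2, -50.7, -49.5])

    # Least-squares fit of TS = m * log10(L) + b
    X = np.column_stack([np.log10(lengths_cm), np.ones_like(lengths_cm)])
    (m, b), *_ = np.linalg.lstsq(X, ts_db, rcond=None)
    print(f"Fitted relationship: TS = {m:.1f} * log10(L) + {b:.1f} dB")

    # Predict TS across the 4-16 cm anchovy size range covered by the trawl samples
    for L in (4, 8, 16):
        print(f"L = {L:2d} cm -> TS = {m * np.log10(L) + b:.1f} dB")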
We then developed the first prototype of a cable-based real-time camera system, dry tested it, and prepared it for onboard testing. We adapted the UTOFIA 3D camera, whose performance had been tested in our work on the pre-catch size and species system, so that it could also be fitted on a trawl. We also developed and tested software to view and analyse the data delivered over this cable, independently of the embedded 3D smart camera. In combination with this, we developed and tested an operational concept we named “FishFinder”, which delivered high-quality images even in turbid water and could document both Nephrops burrows and Norway lobster, and we carried out final tests of the cable-based 2D real-time monitoring system (RTM) we had named “TrawlMonitor” in a series of on-land and at-sea experiments. Finally, the underwater footage from the Nephrops scanner was processed using photogrammetry to provide a 3D reconstruction of the seabed. This reconstruction is orthographically projected to provide a digital elevation map and a colour map in the same coordinate system.
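The orthographic projection step can be illustrated with a minimal sketch that grids a reconstructed point cloud into a digital elevation map and a colour map sharing one coordinate system. The input format (an N x 6 array of x, y, z, r, g, b points), the grid resolution and the per-cell aggregation rule are assumptions for illustration, not the project's actual pipeline.

    # Minimal sketch, not the project's actual pipeline: project a photogrammetry
    # point cloud (x, y, z plus RGB colour per point) orthographically onto a
    # regular x-y grid, producing a digital elevation map and a colour map that
    # share the same coordinate system.
    import numpy as np

    def rasterize(points_xyzrgb, cell_size=0.01):
        """points_xyzrgb: (N, 6) array of x, y, z, r, g, b; cell_size in metres."""
        xyz, rgb = points_xyzrgb[:, :3], points_xyzrgb[:, 3:]
        x0, y0 = xyz[:, 0].min(), xyz[:, 1].min()
        cols = ((xyz[:, 0] - x0) / cell_size).astype(int)
        rows = ((xyz[:, 1] - y0) / cell_size).astype(int)
        h, w = rows.max() + 1, cols.max() + 1

        dem = np.full((h, w), np.nan)    # elevation (z) per grid cell
        colour = np.zeros((h, w, 3))     # mean RGB per grid cell
        counts = np.zeros((h, w))

        for (r, c), z, col in zip(zip(rows, cols), xyz[:, 2], rgb):
            # keep the highest point per cell as the elevation estimate
            if np.isnan(dem[r, c]) or z > dem[r, c]:
                dem[r, c] = z
            colour[r, c] += col
            counts[r, c] += 1

        nonzero = counts > 0
        colour[nonzero] = colour[nonzero] / counts[nonzero][:, None]
        return dem, colour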
Another line of work focused on the development of a system that uses LED technology to optimise the catching performance of trawl fishing gear. Based on the reaction of fish to light, we integrated a programmable LED light pod with an acoustic modem. This is an important development: previous laboratory experiments had shown that fish behaviour can be changed with artificial light, and sea trials had demonstrated this in a trawl gear as well. With this new addition, it became possible to control the light settings in real time from the wheelhouse. The resulting system has since been tested successfully at sea.
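As a purely hypothetical illustration of the real-time control concept described above, the sketch below encodes a light-setting command into a compact byte message of the kind that could be relayed over a low-bandwidth acoustic modem link. The packet layout, field names and value ranges are invented for illustration and do not describe the actual SMARTFISH protocol or hardware interface.

    # Purely hypothetical sketch: the packet layout and fields are invented for
    # illustration and are not the SMARTFISH acoustic-modem protocol.
    import struct

    def encode_light_command(pod_id: int, intensity_pct: int, colour_idx: int,
                             strobe_hz: float) -> bytes:
        """Pack a light-setting command into a compact 7-byte message."""
        if not 0 <= intensity_pct <= 100:
            raise ValueError("intensity must be 0-100 %")
        return struct.pack(">BBBf", pod_id, intensity_pct, colour_idx, strobe_hz)

    def decode_light_command(msg: bytes) -> dict:
        pod_id, intensity_pct, colour_idx, strobe_hz = struct.unpack(">BBBf", msg)
        return {"pod_id": pod_id, "intensity_pct": intensity_pct,
                "colour_idx": colour_idx, "strobe_hz": strobe_hz}

    # Example: set pod 1 to 75 % intensity, colour preset 2, 0.5 Hz strobe
    msg = encode_light_command(1, 75, 2, 0.5)
    print(len(msg), "bytes:", msg.hex())
    print(decode_light_command(msg))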
We also continued our work on the 3D machine vision system for catch analysis on on-board conveyor belts – the “CatchScanner”. At that stage it had early code for weight estimation and species identification covering only a few species; this has since been further developed and tested with extended species identification and weight estimation algorithms. The “CatchSnap”, a versatile, handheld 3D machine vision unit for inspecting catch samples on smaller fishing vessels, has been further developed with new imaging procedures and sampling methodologies. Finally, the “CatchMonitor” – a system for automatic monitoring and analysis using CCTV cameras, used on larger vessels – had an early prototype with initial code for species identification, which has since been upgraded and tested with segmentation, species identification and count estimation algorithms on additional datasets.
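To make the segmentation-and-count stage concrete, the sketch below segments bright objects on a darker conveyor-belt image and counts them using classical image processing (Otsu thresholding and connected components). This is only a conceptual stand-in: the actual CatchScanner and CatchMonitor algorithms are not described in detail here, and the image path and area threshold are placeholders.

    # Conceptual stand-in only, not the project's actual machine-vision models:
    # segment bright objects on a darker conveyor-belt image and count them.
    import cv2
    import numpy as np

    def segment_and_count(image_path: str, min_area_px: int = 500):
        grey = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        if grey is None:
            raise FileNotFoundError(image_path)

        # Otsu threshold separates fish from the (assumed darker) belt background
        _, mask = cv2.threshold(grey, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

        # Remove small speckle before labelling connected components
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

        n_labels, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
        # Label 0 is the background; keep components above a minimum area
        areas = [stats[i, cv2.CC_STAT_AREA] for i in range(1, n_labels)
                 if stats[i, cv2.CC_STAT_AREA] >= min_area_px]
        return len(areas), areas

    # count, areas = segment_and_count("conveyor_frame.png")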
Finally, we continued our work on “FishData” - a hardware and software infrastructure for acquisition, analysis and presentation of data from onboard catch monitoring systems and other relevant data sources. We had defined the scope, requirements and architecture of the systems, including several components of the onboard and onshore infrastructures – in particular, modules for secure acquisition, transfer, pre-processing and storage of data from fishing vessels – and developed the prototype onboard and onshore hardware and software infrastructures. We then developed a prototype fisheries information system built upon this infrastructure, to demonstrate its utility and potential use cases. The system, which takes the form of a web portal, provides information in both visual and programmatic form about catch efficiency and catch composition in fisheries, as well as forecasts of marine environmental conditions.
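Because the portal exposes data in programmatic as well as visual form, a client could retrieve it over a simple HTTP interface. The sketch below shows what such a client might look like; the endpoint URL, query parameters and response fields are hypothetical placeholders used only to illustrate the idea, not the actual FishData API.

    # Hypothetical client sketch: the endpoint, parameters and response fields
    # are placeholders, not the actual FishData API, which is not documented here.
    import json
    import urllib.parse
    import urllib.request

    def fetch_catch_composition(base_url: str, vessel_id: str, date: str) -> dict:
        """Request catch composition data for one vessel and day (illustrative)."""
        query = urllib.parse.urlencode({"vessel": vessel_id, "date": date})
        url = f"{base_url}/api/catch-composition?{query}"
        with urllib.request.urlopen(url, timeout=10) as resp:
            return json.load(resp)

    # Example (placeholder URL and identifiers):
    # data = fetch_catch_composition("https://fishdata.example.org", "NO-1234", "2021-06-15")
    # for species in data.get("species", []):
    #     print(species["name"], species["weight_kg"])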

Each of the systems is tested, demonstrated and promoted in at least one regional sea and within appropriate commercial fisheries and systems. For the technologies being tested in the Norwegian and Barents seas, extensive planning, travelling, engineering, mechanical and electrical work, data collection, testing, troubleshooting and reporting have been carried out. For the testing in the Mediterranean and Black seas, logistic arrangements have been made for the test trials to be conducted in the region, and pictures have been taken of the fish species in the seas around Turkey; the samples include fish and invertebrate species that have been measured and photographed with a cell phone camera for the “CatchSnap” technology. West of Scotland, the main outcome of the testing work in the first two reporting periods was the practical experience gained by staff working in demersal and shellfish fisheries in the case study area with the technologies developed elsewhere in the project. In the last reporting period, the focus there has been on “CatchSnap” and “CatchMonitor”, to evaluate their suitability and feasibility for use in fisheries in the region. For the southern North Sea, Celtic Sea and Bay of Biscay, we have developed and evaluated the automated image analysis algorithms used to assess the performance of different light technologies and the effect of the lights on the behaviour of fish during the catching process, and we have conducted and completed testing and demonstration of the defined technologies. For Kattegat and Skagerrak fisheries, we have carried out the practical test and demonstration of “FishFinder” and “TrawlMonitor” for the stakeholder group in the region, and we are planning further tests and demonstrations in 2022.
The work is moving beyond the state of the art, as evidenced by the progress so far, and we are progressing towards achieving the potential impacts of the project. A number of delays occurred because of the Covid-19 pandemic; however, given the extra year added to the project, we consider the testing phase more likely to be a success.
Figure captions:
WP2 - images of Nephrops
WP3 - testing light technology
WP4 - CatchSnap calibration board