CORDIS - EU research results

An AR cloud and digital twins solution for industry and construction 4.0

Periodic Reporting for period 2 - ARtwin (An AR cloud and digital twins solution for industry and construction 4.0)

Reporting period: 2021-04-01 to 2022-12-31

ARtwin provides European Industry and Construction 4.0 with an AR Cloud platform, deployed on a private distant and/or edge cloud, that offers three key services: (i) accurate and robust 3D registration for any AR device in large-scale and dynamic environments, allowing relevant information to be presented to workers at the right time and place; (ii) reduction of the gap between the physical and digital worlds by continuously maintaining the Digital Twin/BIM model based on vision sensors available in the factory or on construction sites; and (iii) display of complex 3D augmentations on any AR device by rendering them remotely in the cloud with ultra-low latency. During the project, the partners developed sparse and dense mapping services covering an area of 250 m² with two devices (a HoloLens 2 and an iPad). Relocalization against the ARtwin map reconstruction achieved a perceived 3D registration accuracy better than 5 cm in position and 2° in orientation. Localization, sparse mapping and dense mapping met all project targets, and we employed them to create digital twins of the pilot sites. The ARtwin platform was validated in operational environments through two use cases in Industry 4.0 and one use case in Construction 4.0. Faster planning and adaptation of changes with fewer iterations, depending on the scenario, accelerated the production-line re-planning task by an estimated 50%. Most pilot participants indicated that the navigation functionality, after alignment of AR content using the spatial computing services, increased productivity by an estimated 10%.
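The position and orientation targets above (under 5 cm and under 2°) are the standard way of scoring a relocalization result against ground truth: a translation distance plus the geodesic angle between rotations. A minimal numpy sketch of such an evaluation, not the project's actual benchmarking code, with an illustrative pose 3 cm and 1° off:

```python
import numpy as np

def pose_errors(R_est, t_est, R_gt, t_gt):
    """Translation error (metres) and rotation error (degrees) between
    an estimated camera pose and the ground-truth pose."""
    t_err = float(np.linalg.norm(t_est - t_gt))
    # The angle of the relative rotation is the geodesic distance on SO(3).
    R_rel = R_est @ R_gt.T
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    r_err = float(np.degrees(np.arccos(cos_angle)))
    return t_err, r_err

# Hypothetical pose: 3 cm off in x, rotated 1 degree about the z axis.
theta = np.radians(1.0)
R_est = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0, 0.0, 1.0]])
t_err, r_err = pose_errors(R_est, np.array([0.03, 0.0, 0.0]),
                           np.eye(3), np.zeros(3))
print(t_err < 0.05 and r_err < 2.0)  # prints True: within the targets
```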
The work in the frame of ARtwin kicked off with a well-informed analysis of the initial requirements of the project's use cases through a user-driven approach. In parallel, technical aspects were investigated by describing the state of the art in 3D maps for camera relocalization and 3D dense reconstruction, setting the premises for the project's global map service and its specifications. Building on the initial use case and service specifications, the ARtwin platform follows a platform-of-platforms structure. The ARtwin AR Cloud platform comprises five interconnected service components for spatial computing: (i) the mapping service, a vision pipeline that creates a local sparse map in real time from images captured by vision sensors embedded in AR devices; (ii) the map update service, which fuses the local sparse maps produced by each AR device into a large-scale global map; (iii) the localization service, vision algorithms that process the images captured by the AR device's vision sensors and use the global map to estimate the device's position and orientation at runtime; (iv) the dense mapping and semantic segmentation services, vision algorithms that process the images captured by AR devices and stored in the global map, together with simulation tools to verify the proposed algorithms and retrain segmentation networks; and (v) the remote rendering service, which renders complex 3D models at the edge for display to the end user. The first three services have been integrated into an all-in-one 5G AR edge through various implementations (flight-case and outdoor telecom cabinet form factors) including GPU, CPU, and storage. The demonstrator for the AR-enabled factory worker use case was executed in the Siemens Digital Automation Lab in Karlsruhe.
The demonstrator for the dynamic production line planning use case was executed in the Siemens Gerätewerk factory in Erlangen. The demonstrator for the construction use case (better constructions with AR-enabled workers detecting and resolving defects during construction) was realized on a construction site in Rennes. A tailored dissemination and communication strategy was implemented to promote the results of ARtwin, including highly rated scientific publications (e.g. a best paper award at IEEE CVPR 2022), promotional materials, a web portal and social media accounts, videos, newsletters and articles, and mutually beneficial synergies with relevant initiatives and networks. A comprehensive analysis of the ARtwin market was conducted. The ARtwin business plan and the IPR & exploitation agreement have been prepared, paving the way for the successful exploitation of ARtwin results beyond the lifespan of the project.
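The map update service described above must fuse each device's local sparse map into the global map, which at its core is a rigid alignment between matched 3D landmarks. A minimal numpy sketch of that alignment step using the Kabsch/Procrustes method, assuming landmark correspondences are already known; this is illustrative only, and a real pipeline would add robust estimation (e.g. RANSAC) against noisy or wrong matches:

```python
import numpy as np

def align_local_map(local_pts, global_pts):
    """Find the rotation R and translation t that best map local-map
    landmarks onto their matched global-map landmarks (least squares)."""
    mu_l = local_pts.mean(axis=0)
    mu_g = global_pts.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (local_pts - mu_l).T @ (global_pts - mu_g)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_g - R @ mu_l
    return R, t
```

Once R and t are recovered, every point of the local sparse map can be transformed as `p @ R.T + t` before insertion into the global map.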
Although localization and 3D mapping algorithms based on SLAM techniques are now industrialized in many AR devices, this remains a very active research area, as hard challenges persist in large-scale and dynamic environments. Augmented reality that works anywhere, anytime and on any device is far from solved, yet it is essential to the adoption of this type of technology, whether for B2B or B2C uses. While most recent research implements machine learning approaches that can hardly be generalized to any type of environment (learning is required for each usage environment), very little research focuses on distributing vision pipelines into the cloud to address both the large-scale and the dynamic-environment challenges with a crowd-mapping approach that maintains an up-to-date global map. The ARtwin project provided a sovereign AR cloud platform implementing a crowd-mapping approach at the scale of a factory or a construction site, compatible with any AR device. A set of innovative services based on the most recent state-of-the-art work has been deployed onto this platform to provide end users with robust and seamless AR applications anywhere in the factory or construction site. Moreover, the 3D maps used for AR device relocalization are reused to update the 3D mock-ups of the digital twin based on the latest observations captured by vision sensors embedded in AR devices. This CAD/BIM update service is able to segment the 3D map built from AR device observations using a supervised machine learning approach, which requires building annotated learning databases specific to the execution environment. To ease this process, the ARtwin project implemented a solution based on semantic dense reconstruction to automatically generate large amounts of annotated 2D observations from viewpoints controlled by virtual agents.
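Generating annotated 2D observations from a semantic reconstruction boils down to projecting labelled 3D geometry through a virtual camera, so that every rendered pixel inherits the label of the surface it sees. A toy numpy sketch of that idea for a sparse labelled point set, assuming a standard pinhole intrinsic matrix K and a world-to-camera pose (R, t); the project's actual tooling renders dense meshes and is not reproduced here:

```python
import numpy as np

def render_labels(points, labels, K, R, t, width, height):
    """Project semantically labelled 3D points through a virtual pinhole
    camera and return a sparse 2D label image (-1 = unlabelled pixel)."""
    cam = points @ R.T + t                      # world -> camera frame
    in_front = cam[:, 2] > 0                    # drop points behind the camera
    uv_h = cam[in_front] @ K.T                  # homogeneous pixel coordinates
    uv = (uv_h[:, :2] / uv_h[:, 2:3]).round().astype(int)
    lbl = labels[in_front]
    img = np.full((height, width), -1, dtype=int)
    inside = ((uv[:, 0] >= 0) & (uv[:, 0] < width) &
              (uv[:, 1] >= 0) & (uv[:, 1] < height))
    img[uv[inside, 1], uv[inside, 0]] = lbl[inside]
    return img
```

Sweeping the virtual camera pose over many viewpoints, as the virtual agents do, yields an arbitrarily large set of image/label pairs for retraining the segmentation networks.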
Finally, an innovative remote rendering service, which leverages a depth map to enhance the quality of experience on networks without ultra-low latency, allows very complex CAD/BIM models to be displayed on low-resource AR devices by moving this costly computation to remote GPU servers. The use of 5G provided the required low-latency connectivity, ensuring a high quality of experience. The results of the ARtwin project pave the way for a new generation of AR technologies that work anywhere, anytime and on any device. The impact on Industry and Construction 4.0 could be decisive, as the use of AR could be democratized across a wide range of tasks, leading to significant gains in productivity and quality, and thus improving the competitiveness of European companies. Moreover, although the project mainly targets industry and construction, its results could have a significant impact on many other professional fields (e.g. aeronautics, shipyards, energy, military, and health) as well as on many mass-market applications (e.g. entertainment, cultural heritage, retailing, and social networks), while preserving data confidentiality on a sovereign European platform.
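The depth-map trick for masking network latency is commonly a reprojection warp: each pixel of the last frame received from the server is unprojected using its depth, moved by the pose change accumulated since that frame was rendered, and projected back. A minimal numpy sketch of computing the warped pixel positions, assuming a pinhole intrinsic matrix K and positive depths; it is illustrative only, and a real renderer would also resample colour and fill disocclusions:

```python
import numpy as np

def reproject(depth, K, R_delta, t_delta):
    """For every pixel of a rendered frame with known depth, return its new
    (u, v) position after a small camera pose change (R_delta, t_delta)."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).astype(float)
    rays = pix @ np.linalg.inv(K).T           # unproject pixels to camera rays
    pts = rays * depth.reshape(-1, 1)         # 3D points at the rendered depth
    pts_new = pts @ R_delta.T + t_delta       # apply the accumulated motion
    uv_h = pts_new @ K.T
    uv = uv_h[:, :2] / uv_h[:, 2:3]           # back to pixel coordinates
    return uv.reshape(h, w, 2)
```

With zero motion the warp is the identity; as latency grows, the warp keeps the augmentation visually locked to the scene until the next server frame arrives.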