
An AR cloud and digital twins solution for industry and construction 4.0

Periodic Reporting for period 1 - ARtwin (An AR cloud and digital twins solution for industry and construction 4.0)

Reporting period: 2019-10-01 to 2021-03-31

Industry and Construction 4.0 have high expectations of AR technologies in terms of productivity gains and quality improvement. Numerous proofs of concept demonstrate significant returns on investment, but difficulties arise when it comes to large-scale deployment in operational, dynamic environments exposed to variable lighting conditions. Unfortunately, early-stage ARCloud implementations do not fully meet Industry and Construction 4.0 requirements, and European players may face a lock-in situation if no sovereign solution is offered to them.
In this context, ARtwin provides European Industry and Construction 4.0 with an ARCloud platform that meets their needs. Deployed on a private remote and/or edge cloud, the platform ensures the privacy of information and offers three key services: (i) accurate and robust 3D registration for any AR device in large-scale and dynamic environments, allowing relevant information to be presented to workers at the right time and place, (ii) reduction of the gap between the physical and digital worlds by continuously maintaining the Digital Twin/BIM model based on vision sensors available in the factory or on construction sites, and (iii) display of complex 3D augmentations on any AR device by rendering them remotely in the cloud with ultra-low latency.
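As a rough illustration of the first of these services, the minimal sketch below shows how an AR client could query a cloud relocalization endpoint with a single camera frame and receive a 6-DoF pose in the shared map frame. The endpoint name, message fields and transport are assumptions made for this sketch, not the actual ARtwin interfaces.

```python
# Minimal sketch of an AR client querying a cloud relocalization service.
# All endpoint names, fields and types are hypothetical illustrations,
# not the actual ARtwin interfaces.
import base64
from dataclasses import dataclass

import requests  # plain HTTP is assumed here purely for the sketch


@dataclass
class Pose6DoF:
    """Camera pose in the global map frame: translation (m) and quaternion."""
    position: tuple        # (x, y, z)
    orientation: tuple     # (qx, qy, qz, qw)
    confidence: float      # 0..1, how reliable the relocalization is


def relocalize(server_url: str, jpeg_bytes: bytes, intrinsics: dict) -> Pose6DoF | None:
    """Send one camera frame to the cloud and return the estimated pose, if any."""
    payload = {
        "image": base64.b64encode(jpeg_bytes).decode("ascii"),
        "intrinsics": intrinsics,   # fx, fy, cx, cy of the AR device camera
    }
    resp = requests.post(f"{server_url}/relocalize", json=payload, timeout=0.5)
    resp.raise_for_status()
    data = resp.json()
    if data.get("status") != "ok":
        return None  # not enough matches against the global 3D map
    return Pose6DoF(
        position=tuple(data["position"]),
        orientation=tuple(data["orientation"]),
        confidence=data["confidence"],
    )
```

A pose expressed in the shared map frame is what allows any device to anchor augmentations at the right place, regardless of where its local tracking session started.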
The ARtwin platform and services will be validated in operational environments through two use cases in Industry 4.0 and one use case in Construction 4.0. The format of the 3D map stored in the cloud will be submitted to standardization bodies to prevent a lock-in situation with a few vendors and to encourage an ecosystem with a diverse range of solution providers.
The work performed in the frame of the H2020 ARtwin project kicked off with a well-informed analysis and documentation of the initial requirements of the project’s use cases, namely the field experimentations of the pilots in the SIEMENS factories and on an EIFFAGE construction site. Following a user-driven approach, the use cases were elaborated based on specific user stories describing the workflows to be followed, the system-level requirements (i.e. functional, design and implementation), as well as the non-functional requirements (i.e. safety, performance, security, and software quality guidelines). In parallel, the technical aspects of the project were investigated by describing the state of the art in 3D maps for camera relocalisation and 3D dense reconstruction, laying the groundwork for the project’s global map service and its specifications.
Building on the initial use case and service specifications, the structure of the ARtwin platform was put in place; it follows a platform-of-platforms approach comprising several interconnected platforms. On top of that, tailored roles, permissions, and blueprints were identified to ensure the concrete operation of the platform.
The ARtwin ARCloud platform comprises four interconnected service components. The architecture and a preliminary demonstration of the implementation of these services have already been defined, namely: (i) the map update service, consisting of vision pipelines that create and update either sparse or dense 3D maps based on images captured by vision sensors embedded in AR devices, (ii) the localization service, consisting of vision algorithms processing the images captured by vision sensors embedded in AR devices, (iii) the dense mapping and semantic segmentation services and their components, consisting of vision algorithms processing the images captured by vision sensors embedded in AR devices, as well as the simulation tools used to verify the proposed algorithms and retrain the segmentation networks, and (iv) the remote rendering service.
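To make the interplay of these four components more concrete, the outline below sketches them as minimal service interfaces sharing keyframes captured by AR devices. All class and method names are illustrative assumptions derived from the description above, not the project’s actual code.

```python
# Illustrative outline of the four ARCloud service components described above.
# Names and signatures are assumptions made for this sketch, not ARtwin code.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Keyframe:
    image: bytes                       # encoded camera frame from an AR device
    intrinsics: dict                   # camera calibration (fx, fy, cx, cy)
    device_pose: tuple | None = None   # local tracking pose, if available


class MapUpdateService(ABC):
    """(i) Vision pipelines that create and update sparse or dense 3D maps."""
    @abstractmethod
    def ingest(self, keyframes: list[Keyframe]) -> None: ...


class LocalizationService(ABC):
    """(ii) Estimates the 6-DoF pose of a device frame in the global map."""
    @abstractmethod
    def relocalize(self, keyframe: Keyframe) -> tuple | None: ...


class DenseMappingService(ABC):
    """(iii) Dense reconstruction and semantic segmentation of the scene."""
    @abstractmethod
    def reconstruct(self, keyframes: list[Keyframe]) -> object: ...

    @abstractmethod
    def segment(self, reconstruction: object) -> dict: ...


class RemoteRenderingService(ABC):
    """(iv) Renders complex CAD/BIM models in the cloud for thin AR clients."""
    @abstractmethod
    def render(self, pose: tuple, model_id: str) -> bytes: ...
```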
In parallel, preparatory work has taken place towards the elaboration of an evidence-based business model and plan. Building upon preliminary market research and supported by dedicated business validation workshops, the ARtwin consortium settled on its offerings and value propositions, investigated alternative configurations of the value chain and sketched a draft business model.
Last but not least, a tailored dissemination and communication strategy was implemented to promote the results of ARtwin, including promotional materials, a web portal and social media presence, the publication of newsletters and articles, and mutually beneficial synergies with relevant initiatives and networks.
Although localisation and 3D mapping algorithms based on SLAM techniques are now industrialised in many AR devices, this remains a very active research area, as hard challenges persist in addressing large-scale and dynamic environments. Augmented reality that works anywhere, anytime and on any device is far from being solved, yet it is essential to the adoption of this type of technology, whether for B2B or B2C uses. While most recent research implements machine learning approaches that can hardly be generalised to any type of environment (learning is required for each usage environment), very little work focuses on distributing vision pipelines into the cloud to address both the large-scale and the dynamic-environment challenges with a crowd-mapping approach that maintains an up-to-date global map.
The ARtwin project aims at providing a sovereign AR cloud platform implementing a crowd-mapping approach at the scale of a factory or a construction site, compatible with any AR device. A set of innovative services based on the most recent state-of-the-art work will be deployed on this platform to provide end users with robust and seamless AR applications anywhere in the factory or on the construction site. Moreover, the 3D maps used for AR device relocalization will be reused to update the 3D mock-ups of the digital twin based on the latest observations captured by vision sensors embedded in AR devices. This CAD/BIM update service is able to segment the 3D map built from AR device observations using a supervised machine learning approach, which requires building annotated training databases specific to the execution environment. To ease this process, the ARtwin project will implement a solution based on semantic dense reconstruction coupled with a machine-learning-based approach that improves the realism of the rendering of the 3D point clouds, in order to automatically generate a large amount of annotated 2D observations controlled by virtual agents.

Finally, an innovative remote rendering service will allow the display of very complex CAD/BIM models on resource-constrained AR devices by moving this costly computation to remote GPU servers (a rough sketch of such a pose-streaming loop is given at the end of this section). The use of 5G will provide the required low-latency connectivity ensuring a high quality of experience.

The results of the ARtwin project pave the way for a new generation of AR technologies that work anywhere, anytime and on any device. The impact on Industry and Construction 4.0 could be decisive, as the use of AR could be democratised across a wide range of tasks, leading to significant gains in productivity and quality and thus improving the competitiveness of European companies. Moreover, although the project mainly targets industry and construction, its results could have a significant impact on many other professional fields (e.g. aeronautics, shipyards, energy, military, and health) as well as on many mass-market applications (e.g. entertainment, cultural heritage, retailing, and social networks), while preserving data confidentiality on a sovereign European platform.
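The remote rendering service mentioned above could, in its simplest form, boil down to a loop in which the device streams its latest pose and displays the encoded frame returned by the GPU server, as in the following sketch. The transport, framing and function names are assumptions made for illustration only, not the project’s actual protocol.

```python
# Minimal sketch of a remote rendering loop: the AR device streams its latest
# pose to a GPU server and receives an encoded frame back for compositing.
# The transport, framing and function names are assumptions for illustration.
import socket
import struct
import time


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("server closed the connection")
        buf += chunk
    return buf


def remote_render_loop(server_addr, get_current_pose, display_frame, fps: int = 60):
    """Stream poses and display remotely rendered frames until interrupted."""
    frame_period = 1.0 / fps
    with socket.create_connection(server_addr) as sock:
        while True:
            t0 = time.monotonic()

            # 1. Send the latest 6-DoF pose: 3 floats position + 4 floats quaternion.
            pose = get_current_pose()
            sock.sendall(struct.pack("!7f", *pose))

            # 2. Receive a length-prefixed, encoded frame rendered on the GPU server.
            (length,) = struct.unpack("!I", _recv_exact(sock, 4))
            frame = _recv_exact(sock, length)

            # 3. Composite the frame over the camera view on the device.
            display_frame(frame)

            # Keep close to the target frame rate; the end-to-end latency budget
            # is what the low-latency (e.g. 5G) link is expected to provide.
            time.sleep(max(0.0, frame_period - (time.monotonic() - t0)))
```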