Assembly Planning and SIMulation of an Aircraft Final Assembly Line

Periodic Reporting for period 1 - SIMFAL (Assembly Planning and SIMulation of an Aircraft Final Assembly Line)

Reporting period: 2017-02-01 to 2018-07-31

The main objective of the SIMFAL project is to analyse, plan and optimize automated assembly tasks for cabin and cargo interior parts in workplaces where a human workforce coexists and collaborates with automation systems, enhancing the overall system (human and automation).
To do that, the project will set up a simulation environment based on VR and AR to display and evaluate alternative process scenarios. On the one hand, a VR environment will be developed for simulating the assembly of the interior of an aircraft. The resulting worker operational procedures will be enhanced with a simulation of the coexistence of the worker and a robot during aircraft assembly processes. The results will feed an AR system, which will be used to assist workers during the assembly process by visualizing and monitoring current information in real time. Together, these two tools will make it possible to evaluate different assembly alternatives and choose the best one in terms of productivity and worker health and safety (ergonomic conditions). As a result, real collaboration between workers and machines can become a reality in the near future. SIMFAL will also facilitate the adaptability of both workers and machines to unpredicted situations: when such a situation arises, the platform will provide the instructions needed to guide the workers and give them the required support.
WP1: All results of this WP1 are described in “D1.1 - SIMFAL Specifications Document”. A summary of specifications is:
Working environment: 3D files of the Cabin & Cargo demonstrator
Virtual automated systems description and specifications (3D model, size, kinematics, how it is going to move)
Current assembly procedure definition (tasks, subtasks, planning, tools, etc.)
Tasks performed in the cabin & cargo, identifying which are carried out in non-ergonomic conditions and which are the most promising candidates to be performed by automated systems
Worker profiles definition (age, gender, corpulence, expertise)
WP2: This work package focuses on the virtual reality simulator. Partial results of this WP2 are described in “D2.1 - Simulation of working process environment” and “D2.2 - Simulation of assembly planning with VR and monitoring of robot/worker”. A summary of these results is:
Models of the simulator
The different 3D models have been imported into the simulation: the aircraft fuselage (cabin and cargo), the mechatronic systems (robots and platforms) and the newly designed parts (sidewalls, ceilings, hatracks, etc.).
Interaction with the simulator
VIVE is a virtual reality platform developed by HTC and Valve for full immersion in virtual worlds. Using a headset and wireless controllers, users can explore and interact with VR experiences in a highly immersive way.
Movement virtual reconstruction and evaluation
To support the ergonomics assessment, a tool for automatic analysis and visualization of motion data was developed to integrate data captured from the mocap system into the VR platform. This ergonomics tool incorporates the Ovako Working Analysis System (OWAS), which evaluates physical stress during a job task, and allows visualizing the worker's movements and the risk of the current posture by calculating it instantly and colour-coding the avatar.
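The posture-to-colour mapping can be sketched as follows. This is a deliberately simplified illustration, not the full OWAS lookup table: the real method maps each combination of back, arms, legs and load codes to an action category, whereas the aggregate score and thresholds below are hypothetical.

```python
# Simplified OWAS-style sketch: map posture codes to an action category
# (1 = normal posture, 4 = corrective action needed immediately) and a
# display colour for the avatar. Thresholds are illustrative only.

def action_category(back: int, arms: int, legs: int, load: int) -> int:
    """Return a coarse OWAS-like action category from posture codes."""
    score = back + arms + legs + load  # crude aggregate; the real OWAS
    if score <= 5:                     # method uses a full lookup table
        return 1
    if score <= 8:
        return 2
    if score <= 11:
        return 3
    return 4

# Colour encoding applied to the worker avatar in the VR scene.
CATEGORY_COLOUR = {1: "green", 2: "yellow", 3: "orange", 4: "red"}

def avatar_colour(back: int, arms: int, legs: int, load: int) -> str:
    return CATEGORY_COLOUR[action_category(back, arms, legs, load)]
```

With this encoding, a relaxed posture (all codes at 1) renders the avatar green, while a highly loaded, awkward posture pushes it towards red.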
System evaluation based on agents
The virtual environment now includes a timing functionality, called “agents”, which records every part that moves in the scene.
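The recording behaviour of the “agents” functionality can be sketched as a timestamped event log. The class and field names below are hypothetical stand-ins, not the project's actual implementation (which runs inside Unity):

```python
import time

class AgentRecorder:
    """Minimal sketch of the 'agents' timing functionality: log every
    part movement in the scene together with a timestamp."""

    def __init__(self):
        self.events = []  # chronological list of movement events

    def record_move(self, part_name, position):
        """Store one movement event for a scene part."""
        self.events.append({"t": time.time(),
                            "part": part_name,
                            "pos": position})

    def parts_moved(self):
        """Return the set of all parts that have moved so far."""
        return {e["part"] for e in self.events}
```

Replaying such a log afterwards makes it possible to reconstruct when each panel or storage unit was handled during an assembly run.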
Task Manager
To define and control the tasks that the different actors of the simulation should perform in order to assemble the panels and storage units, a task management system has been developed in Unity.
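The core idea of such a task manager is releasing a task to its actor (worker or robot) only once its prerequisites are finished. The sketch below illustrates this in Python with an invented API; the actual SIMFAL system is implemented in Unity and its interfaces are not described in the report.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    actor: str                     # e.g. "worker" or "robot"
    depends_on: list = field(default_factory=list)
    done: bool = False

class TaskManager:
    """Release tasks to actors only when their prerequisites are done."""

    def __init__(self, tasks):
        self.tasks = {t.name: t for t in tasks}

    def ready(self, actor):
        """Names of this actor's tasks whose dependencies are complete."""
        return [t.name for t in self.tasks.values()
                if t.actor == actor and not t.done
                and all(self.tasks[d].done for d in t.depends_on)]

    def complete(self, name):
        self.tasks[name].done = True
```

For example, if a robot must fit a bracket before a worker can mount a panel, the panel task only appears in the worker's ready list after the robot task is marked complete.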
WP3: A study of traditional assembly procedures for cabin interior parts in a FAL was carried out during a two-day stay at the Airbus plant in Hamburg. An A320 aircraft was available for the study, and motion data from five workers with different characteristics carrying out the assembly work was captured with the mocap system. The 18-neuron configuration of the MoCap was selected for the whole body; each neuron contains a gyroscope and an accelerometer and supports a response framerate of 120 fps. The motion data was captured at 10 frames per second, saved to a JavaScript Object Notation (JSON) file and then loaded into the ergonomics tool for analysis. Partial results of this WP3 are described in the “Internal Report IR2 – Ergonomics Study”.
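Loading such a JSON capture is straightforward; the field names (`fps`, `frames`) below are assumptions for illustration, as the report does not publish the actual file schema.

```python
import json

def load_mocap(text, expected_fps=10):
    """Parse a mocap capture from JSON text and return (frames, duration_s).
    Assumes a top-level 'frames' list and an optional 'fps' field
    (hypothetical schema, not the project's actual format)."""
    data = json.loads(text)
    frames = data["frames"]
    fps = data.get("fps", expected_fps)
    return frames, len(frames) / fps

# Build a small synthetic capture: 25 frames recorded at 10 fps.
sample = json.dumps({"fps": 10,
                     "frames": [{"t": i, "joints": {}} for i in range(25)]})
frames, duration = load_mocap(sample)  # 25 frames / 10 fps -> 2.5 s
```

A 10 fps recording rate keeps the files small while still giving the ergonomics tool enough samples to classify postures over a task.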
WP4: For the AR application development, we will use Unity in conjunction with the HoloToolkit for the Microsoft HoloLens device. HoloToolkit is a collection of tools provided to simplify the development of applications deployed on the HoloLens device. The toolkit integrates with Unity, as it comes as a package with its own menu.
WP6: A middleware is needed to integrate the solution with the different IT components. A general data communication scheme has been developed: every mechatronic device acts as an OPC UA server connected to an OPC UA client that manages all the data generated by each device.
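The server-per-device, single-aggregating-client pattern can be sketched as follows. This uses plain Python objects as stand-ins; a real deployment would use an actual OPC UA stack (e.g. open62541 or a Python OPC UA library), and the node identifiers shown are hypothetical.

```python
class DeviceServer:
    """Stand-in for the OPC UA server exposed by one mechatronic device.
    Nodes map an identifier string to the last written value."""

    def __init__(self, name):
        self.name = name
        self.nodes = {}

    def write(self, node_id, value):
        self.nodes[node_id] = value

    def read(self, node_id):
        return self.nodes[node_id]

class MiddlewareClient:
    """Single client that aggregates the data of every device server."""

    def __init__(self, servers):
        self.servers = {s.name: s for s in servers}

    def snapshot(self):
        """Collect the current node values from all connected devices."""
        return {name: dict(s.nodes) for name, s in self.servers.items()}
```

The aggregating client gives the rest of the platform (VR simulator, AR tool) one place to query the state of every robot and platform instead of talking to each device individually.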
The SIMFAL project will deliver several results beyond the state of the art. Some of them (classified by technology) are:
Workload distribution: SIMFAL proposes developing and evaluating different workload distribution approaches between human and machine, finding the optimal solution in terms of productivity by adapting the automated system to the specific worker and taking into account other influencing factors such as ergonomics and worker satisfaction.
Virtual Reality simulator: The VR simulator will be used to design and validate the different coexistence workload distributions in SIMFAL, and will also generate the information used by the AR guiding tool.
Augmented Reality tool: The system will propose alternative actions when unexpected situations arise, using the information obtained in the analysis sessions carried out with Virtual Reality.
Computer Vision algorithms: Newly available approaches for detecting objects in industry will be integrated into the platform to guarantee that the information presented to the worker is correct and properly integrated with the surrounding environment.
Concerning the potential socio-economic and societal impacts:
Industrial and financial impacts: Once physically implemented at the manufacturing stage, the SIMFAL results will contribute to faster (around 25% time saving) and more cost-efficient (10% to 30% lower assembly cost) aircraft assembly processes. Other industrial benefits are also worth mentioning, such as a reduction of the time needed for assembly task design at the engineering stage (up to 50%) and lower non-quality costs (estimated at around -10%).
Environmental impacts: In the long term, SIMFAL will reduce CO2 emissions and industrial waste thanks to process optimisation.
Impacts on health and safety of workers: The SIMFAL simulation tools will allow simulating different scenarios and visualising the processes, making it possible to evaluate the result of assigning some of the non-ergonomic operations to robots. In addition, the AR-based simulation software will allow capturing the best work practices, which will benefit workers through better working postures and more efficient work (the same result with less effort and less hazard). As a consequence, the costs associated with occupational diseases linked to postural issues will be reduced.