Training is a key issue in any task. Surgical training is even more critical, since inexperience can lead to fatal consequences. Cadavers let trainees gain experience in general tasks, but they are an expensive resource and lack reproducibility. Moreover, dead tissue is usually harder, and arteries or nerves do not react. Virtual Reality (VR) can solve these problems. VR has succeeded in areas such as flight simulation, architecture and chemistry, but the complexity of human interaction has prevented surgical simulation from maturing. New VR devices simulate the sense of touch, and high-performance computing (HPC) enables the real-time use of accurate Finite Element models derived from real pathologies. This project proposes a VR simulator, focused mainly on maxillo-facial surgery. It will enable the user to hold real surgical tools attached to force-feedback devices that simulate the haptic sense.
At the technical level, the objectives are:
- To implement parallel numerical algorithms for FEM model processing.
- To integrate the haptic devices to simulate the force feedback of the surgery intervention.
- To design a surgery training scenario.
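As an illustration of the force-feedback objective above, the sketch below shows penalty-based force rendering, a common approach in haptic simulation: when the tool tip penetrates the tissue surface, a spring force proportional to the penetration depth pushes it back. The function name and the stiffness value are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

# Assumed tissue stiffness for the penalty model (N/m); real devices run
# this computation in a ~1 kHz haptic loop.
STIFFNESS = 800.0

def contact_force(tool_pos, surface_point, surface_normal, k=STIFFNESS):
    """Return the penalty force pushing the tool tip out of the tissue.

    Penetration depth is measured along the outward surface normal; if the
    tip is outside the tissue, no force is applied.
    """
    depth = np.dot(surface_point - tool_pos, surface_normal)
    if depth <= 0.0:                       # tool tip is outside the tissue
        return np.zeros(3)
    return k * depth * surface_normal      # spring force along the normal

# Example: tool tip 2 mm below a horizontal surface at z = 0
tool = np.array([0.0, 0.0, -0.002])
force = contact_force(tool, np.zeros(3), np.array([0.0, 0.0, 1.0]))
# force points upwards, pushing the tool back out of the tissue
```

In a full simulator the force would also include damping and friction terms; the spring term alone already conveys the basic resistance of the tissue.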
At the business level, the objectives are the following:
- To reduce the costs implied by the use of cadavers.
- To improve the quality of the training by means of automatic evaluation and repeatability.
- To be able to train on specific interventions in which cadavers are not useful.
- To provide a flexible tool that can be dynamically updated to cover many surgical subjects.
Virtual Reality (VR) environments can help surgical training overcome these constraints. The real feeling can be fairly simulated, and the possibility of working with models derived from real pathology images, together with the ability to redo actions, can complement the trainees' skills. This project proposes to create a VR visualisation system to simulate the interaction of tissues and surgical tools, mainly in specific areas such as maxillo-facial surgery. The proposed simulator will enable the user to hold real surgical tools to touch, probe, grasp, cut and suture anatomical structures. The surgical tools will be attached to force-feedback devices, so the user can feel the tissues during interaction. Simulating soft-tissue deformation and bone interaction can greatly help surgeons improve their techniques and reduce the aesthetic impact on the patient's face. To increase the reliability of the models, very large real medical images will be used. To provide real-time interaction, which is crucial for surgical training, a parallel computing visualisation kernel will be used in the project. Soft and hard tissues will be modelled using Finite Element (FEM)-like techniques. These numerical techniques are very computing intensive, but extensive work has been done to reduce the processing latency by means of parallel computing. Surgical tools will be fitted with spatial positioning devices to provide 3D interaction. Moreover, the system can be extended with stereoscopic images which, combined with the proper glasses, will provide the immersive feeling required for complete training. The hardware required to implement the VR surgical training system comprises a large-screen monitor, a high-performance PC-based cluster for medical image rendering, and one or two haptic devices. The system will use a surface rendering visualisation algorithm. The consortium's experience in efficient medical image processing has been proven in previous projects.
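As a minimal illustration of the FEM-like modelling mentioned above, the sketch below assembles and solves the linear system K u = f for a tiny 1D elastic bar, fixed at one end and loaded at the other. All material parameters are assumed values chosen for the example; a real-time tissue simulator solves far larger, sparse 3D systems, which is what motivates the parallel computing kernel.

```python
import numpy as np

# Toy 1D linear FEM: axial deformation of a bar of soft, tissue-like
# material. Assumed parameters (not from the project):
n_elems = 10           # elements along the bar
L = 0.1                # bar length (m)
E = 5e3                # Young's modulus (Pa), soft-tissue order of magnitude
A = 1e-4               # cross-sectional area (m^2)
h = L / n_elems        # element length
n_nodes = n_elems + 1

# Assemble the global stiffness matrix from 2x2 element matrices
K = np.zeros((n_nodes, n_nodes))
ke = (E * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_elems):
    K[e:e + 2, e:e + 2] += ke

# Load vector: 0.5 N pulling on the free end
f = np.zeros(n_nodes)
f[-1] = 0.5

# Fix node 0 (clamped end) by reducing the system, then solve K u = f
u = np.concatenate(([0.0], np.linalg.solve(K[1:, 1:], f[1:])))
# The tip displacement u[-1] equals the analytic result F*L/(E*A)
```

A dense solver is enough at this size; production simulators use sparse, often parallel, solvers for the same step, since the number of nodes in a 3D anatomical mesh is orders of magnitude larger.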
A VR surgical simulator for maxillo-facial surgery with high-quality realistic graphics and haptic feedback, at a cost under 45 kEuro.
MR1. (PM8): The surgical case to be used for testing must be completed.
MR2. (PM12): A first prototype of the parallel system for the simulation of tissue deformation and the prototype for the haptic feedback must be ready.
MR3. (PM14): The MR2 prototype should be completed.
The project has led to the implementation of a general-purpose VR surgery simulator that is customisable to any patient's anatomy, provided the appropriate medical images are available. The system runs in real time on high-performance standard PCs and does not have very expensive hardware requirements. It provides haptic feedback for training motor skills and uses advanced methods, such as Boundary Element Methods, for a realistic simulation of organ deformation. The camera position can be controlled by 3D localisers. The project delivered three modules: one for obtaining models of the organs from medical images, one for building up scenarios, and one for simulating the intervention. The project has been disseminated through 5 events and 6 publications. The end user has tested the system on several cases and has reacted positively to its future adoption. Navimetric plans both to rent the simulation facilities for laparoscopic surgery courses and to set up complete installations.
Funding Scheme: ACM - Preparatory, accompanying and support measures