Periodic Reporting for period 1 - SMARTsurg (SMart weArable Robotic Teleoperated Surgery)
Reporting period: 2017-01-01 to 2018-06-30
The main potential socio-economic impact of this technology is improved patient outcomes through robot-assisted surgery, whose uptake is currently low compared to open and laparoscopic surgery.
Objective 1: To develop a dexterous, adaptable anthropomorphic surgical instrument
Objective 2: To develop a framework for providing haptic feedback from the surgical instrument to the surgeon
Objective 3: To develop strategies for dynamic active constraints construction and their guaranteed satisfaction
Objective 4: To develop advanced cognition and perception abilities to achieve the real-time and on-the-fly reconstruction of the operation area
Objective 5: To validate SMARTsurg project results in realistic scenarios involving procedures in different surgical domains
Design and definition of SMARTsurg Quality Assessment Plan (CERTH)
Project Reference Manual and Quality Assessment Plan
Preliminary Ethics and Safety Manual for SMARTsurg Technology
Ethics and Safety Manual for SMARTsurg Technology
Use Cases, Framework Requirements and Specifications elicited through 29 interviews with surgeons
The conceptual architecture of the SMARTsurg system
Surgical workflow recognition and context-aware system and Graphical User Interface
State-of-the-art analysis on 3D reconstruction in endoscopic applications.
Market research for endoscopic cameras with 3D functionalities
The modification of the quasi-dense stereo 3D reconstruction method to achieve real-time performance
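To illustrate the stereo-correspondence principle underlying such 3D reconstruction (this is not the project's quasi-dense method itself), the following Python sketch implements a naive sum-of-absolute-differences block matcher; the function name and parameters are illustrative assumptions:

```python
import numpy as np

def block_match_disparity(left, right, block=5, max_disp=8):
    """Naive block-matching stereo: for each left-image pixel, find the
    horizontal shift d that minimises the sum of absolute differences
    (SAD) between a patch in the left image and the shifted patch in
    the right image. The winning shift is the disparity."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [
                np.abs(patch - right[y - half:y + half + 1,
                                     x - d - half:x - d + half + 1]).sum()
                for d in range(max_disp)
            ]
            disp[y, x] = int(np.argmin(costs))
    return disp
```

Quasi-dense methods avoid this exhaustive per-pixel search by propagating matches outward from a sparse set of reliable seed correspondences, which is what makes real-time performance attainable.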
Setting up of an arthroscopic tower and knee model, provided by TheMIS, for the duration of the project.
Employing publicly available datasets of pre-operative image data (CT scans, MRIs) for creating corresponding 3D models.
Employing CT image data (provided by TheMIS) of the knee model for creating the corresponding 3D models of the volumes
Development of exoskeleton masters using different materials and sensing methodologies
Development of a kinematic model of the new exoskeleton (fingers and wrist)
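The kind of kinematic model referred to can be illustrated with a minimal planar forward-kinematics sketch for a serial finger chain — a simplification of the actual finger-plus-wrist model, with illustrative names:

```python
import numpy as np

def finger_fk(joint_angles, link_lengths):
    """Planar forward kinematics for a serial chain: accumulate joint
    angles along the chain, then sum the link vectors to locate the
    fingertip in the base frame."""
    theta = np.cumsum(joint_angles)           # absolute link orientations
    x = np.sum(link_lengths * np.cos(theta))  # fingertip x
    y = np.sum(link_lengths * np.sin(theta))  # fingertip y
    return np.array([x, y])
```

The full exoskeleton model extends the same chain-composition idea to three dimensions and to multiple fingers sharing a wrist frame.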
Setup of SMARTsurg teleoperation system
Design of a BRL laparoscopic phantom
Design of a first prototype of anthropomorphic tool based on user requirements
Design of a clip-on tool for mounting various surgical tools on the robot arm
A module to segment abdominal organs and tumours has been developed in 3D Slicer
Implementation and evaluation of selected distance measures for defining a safe region during operation.
A software package with four state-of-the-art force-generation methods for active constraints.
A methodology for using point clouds for approximating the constraint surfaces
A method to utilize point clouds of a kidney and surrounding vessels.
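A minimal sketch of how a point cloud can act as a constraint surface: find the nearest cloud point to the tool tip and generate a repulsive force whenever the tip enters a safety margin. This is illustrative only — the gains, units and function names are assumptions, not the project's force-generation methods:

```python
import numpy as np

def constraint_force(tool_tip, cloud, safe_dist=5.0, k_gain=0.8):
    """Repulsive active-constraint force from a point-cloud surface:
    zero outside safe_dist, and proportional to the penetration of the
    safety margin (pointing away from the nearest cloud point) inside it."""
    diffs = cloud - tool_tip                    # vectors tip -> cloud points
    d2 = np.einsum('ij,ij->i', diffs, diffs)    # squared distances
    i = int(np.argmin(d2))                      # nearest cloud point
    dist = float(np.sqrt(d2[i]))
    if dist >= safe_dist:
        return np.zeros(3)
    direction = -diffs[i] / (dist + 1e-9)       # away from the surface point
    return k_gain * (safe_dist - dist) * direction
```

For a dense anatomical cloud (e.g. a kidney and its vessels), the brute-force nearest-neighbour search would typically be replaced by a spatial index such as a k-d tree.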
Force measurement tests of the Haption Virtuose 6D Desktop
Design and analysis of a variable stiffness fingertip haptic device (FHD).
Sensor-less force sensing of instrument grasping
Modelling and control of the instrument positioning through the flexible fulcrum point.
Exploration of haptics in suturing in a 3D virtual reality environment
Development of a protocol extraction framework for Robot-assisted MIS procedures
Design of the related GUI to interact with the protocol definition/verification system
Context aware system framework for surgical training with ontology
System for automatic generation of rules to learn surgical workflow model
Integration of the endoscopic system to the remaining SMARTsurg Platform
Preliminary design of a gantry that will hold 2 slave robot arms and the camera holder
A basic bilateral master-slave teleoperation scheme using the ROS packages.
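The basic idea of such a bilateral scheme can be sketched outside ROS as a simple position-forward / force-feedback loop. This is a toy single-axis model under assumed gains, not the project's ROS implementation:

```python
import numpy as np

def bilateral_step(xm, xs, fs_env, kp=5.0, kf=1.0, dt=0.01):
    """One control step of a basic bilateral teleoperation scheme:
    the slave moves toward the master position (position channel),
    while the environment force measured at the slave is reflected
    back to the master (force channel)."""
    xs_new = xs + kp * (xm - xs) * dt   # slave tracks the master
    f_master = -kf * fs_env             # reflected force felt by the surgeon
    return xs_new, f_master
```

In the ROS setting, the master pose and the reflected force would each travel over their own topic, with the same two-channel structure.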
Evaluation of the use of active constraints.
Evaluation and comparison of components related to 3D imaging and visualisation
Logo/ Templates of deliverables, Leaflet and poster were produced
SMARTsurg newsletter (two issues) and two promotional videos
Project website has been created
SMARTsurg Twitter and LinkedIn pages were created
Internal mailing lists have been setup
Project repositories have been setup
Partners disseminated SMARTsurg in workshops, conferences and journals.
A prototype of a wearable exoskeleton controller for intuitive motion control of the instrument
An algorithm to estimate the tooltip-tissue interaction force at the surgical site without the use of sensors. Current RAMIS systems cannot measure tooltip-tissue interaction forces.
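The principle behind such 'sensorless' force estimation can be sketched as attributing the residual between measured and modelled joint torques to an external tip force via the manipulator Jacobian. This is a generic textbook formulation under assumed names; the project's actual algorithm may differ:

```python
import numpy as np

def estimate_tip_force(J, tau_measured, tau_model):
    """Estimate the tool-tip force without a tip sensor: the residual
    torque (measured minus model-predicted gravity/dynamics torque) is
    assumed to come from an external tip force f satisfying
    tau_ext = J^T f, solved here in the least-squares sense."""
    tau_ext = tau_measured - tau_model
    f, *_ = np.linalg.lstsq(J.T, tau_ext, rcond=None)
    return f
```

The accuracy of the estimate hinges on the quality of the dynamic model used to compute `tau_model`, since any unmodelled friction or inertia is misattributed to the tip force.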
A novel fingertip haptic device has been designed which provides tactile and force feedback to the wearer.
Design of a methodology for enforcing active constraints on point-cloud-approximated surfaces, guaranteeing that the constraints are not violated.
An unprecedentedly detailed workflow of seven surgical procedures that can benefit from robot-assisted surgery, active constraint (AC) implementation and augmented reality
A context-aware software framework for intelligent surgical training system which uses the knowledge representation, computer vision and semantic web technologies
The Graphical User Interface has been devised, identifying the required modules (buttons). The advancement is its modular structure and ease of use.
Advancement of the state of the art in the optimization of human arm-robot interaction by adaptation to arm impedance.
Implementation of state-of-the-art modules for the vision system, using VTK and ROS for communication. Modules include camera calibration.
Definition of AC models (volumes, mesh) based on preoperative medical images. The advancement is the definition of AC models by ‘brushing’ the surface on 3D virtual images.
Inductive logic programming to construct a surgical process model based on video annotations and to automatically analyse the surgical workflow
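The flavour of such rule learning can be conveyed with a much-simplified Python stand-in that induces allowed phase transitions from annotated procedure runs and flags unseen transitions in a new run. This is an illustrative toy, not actual inductive logic programming, and all names are assumptions:

```python
from collections import defaultdict

def learn_transitions(annotated_runs):
    """Induce the set of allowed phase-to-phase transitions from
    annotated procedure runs (each run is an ordered list of phases)."""
    allowed = defaultdict(set)
    for run in annotated_runs:
        for a, b in zip(run, run[1:]):
            allowed[a].add(b)
    return allowed

def violations(run, allowed):
    """Flag transitions in a new run that were never observed in
    training, i.e. candidate workflow deviations."""
    return [(a, b) for a, b in zip(run, run[1:]) if b not in allowed[a]]
```

A genuine ILP system would instead induce relational rules over the video annotations, but the learn-rules-then-check-new-runs structure is the same.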
Expected results until the end of the project and potential impacts
Extra degrees of freedom for the 3-fingered surgical instrument. Some surgical procedures cannot currently be conducted robotically due to the limitations of current RAMIS instruments.
The wearable exoskeleton to control the motion of the 3-fingered surgical instrument. This provides intuitive control and allows more ergonomic postures compared to the current state of the art.
The ‘sensorless’ force-estimation algorithm for haptic feedback, which can reduce the forces exerted on patient tissue and enable active constraints to be conveyed to the surgeon.
The SMARTsurg system will be equipped with the necessary dynamic active constraint enforcement capabilities to prevent the robotic tool from damaging adjacent tissue.
Online recognition of the surgical workflow at different granularity levels (surgical phases, steps and actions) to allow context awareness, decision support and learning of the procedural workflow.