Accurate knowledge of human functional anatomy, in physiological and pathological conditions, is important in many medical fields, as well as in industrial areas such as biomedical engineering. The 3D aspects of functional anatomy are particularly important, but they are exceedingly difficult to present in a meaningful way. Thus, the visualisation and virtual manipulation of anatomical objects represent a challenge on which to test innovative techniques. The present proposal concerns the visualisation of, and interaction with, data related to musculo-skeletal structures by means of multi-modal and multi-sensorial interfaces. The project will create a user-friendly visualisation and interaction environment in which all the information is presented through a set of representation-interaction pairs inspired by medical imaging modalities familiar to biomedical professionals, enabling them to use their previous experience to enhanced effect.
The project will develop and validate a new representation & interaction paradigm for virtual medical objects using multi-modal and multi-sensorial interfaces. Instead of pursuing virtual REALISM, which would be unfamiliar to most professionals, this new representation paradigm will pursue clinical RELEVANCE. It will rely on multiple synchronised views, each associated with specialised interaction modalities based on multi-body tracking, speech recognition and synthesis, and proprioceptive haptic interfaces. Some views will convey conventional medical imaging modalities (X-ray, CT, MRI, endoscopy), while others will involve unconventional representations of the organs. For each of them, the multi-sensorial interaction that is most effective in the application context will be implemented, again considering the type of interaction the medical professional would expect in the real world.
At the outset, a review of the relevant literature will be performed on visualisation strategies for virtual medical objects, haptic interfaces, speech interfaces, and multi-body tracking, and combined with user requirements collected through interviews and surveys of medical professionals. This activity will produce the description of the demonstrators and of the specific metrics to be used to evaluate the project results. The technology experts will deploy at the test site the hardware/software technologies for multi-body tracking, speech interfaces, haptic interfaces and autostereoscopic display.
These technologies will be integrated with existing VR and HPC facilities and used to develop three demonstration applications in the area of musculo-skeletal pathologies. Using well-defined metrics, these applications will be used to compare the proposed representation & interaction paradigm with the conventional interfaces currently available in the same clinical contexts. Once the proposed interface paradigm has been quantitatively evaluated with respect to relevant clinical applications, a new group of activities will begin to evaluate its acceptability among medical professionals. Some partners are developing an application framework aimed at the rapid development of biomedical applications. The MULTISENSE project will add another conceptual layer supporting multi-sensory interfaces. The resulting software environment will be called the Multi-modal Display and Interaction (MDI) application framework. In order to reduce this project's dependency on other activities, the Multi-sense software library will be designed to also work independently of the visualisation environment under development.
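The decoupling described above — a multi-sensory layer that drives synchronised views but does not depend on any particular visualisation environment — can be sketched as follows. This is a minimal illustration only: all names (DisplayBackend, MultiSenseSession, ConsoleBackend) are hypothetical and do not reflect the project's actual API.

```python
from abc import ABC, abstractmethod


class DisplayBackend(ABC):
    """Abstract rendering target; the VR environment would supply a real one."""

    @abstractmethod
    def show(self, view_name: str, payload: dict) -> None:
        ...


class ConsoleBackend(DisplayBackend):
    """Stand-in backend, letting the library run without the VR environment."""

    def __init__(self) -> None:
        self.log: list[tuple[str, dict]] = []

    def show(self, view_name: str, payload: dict) -> None:
        self.log.append((view_name, payload))


class MultiSenseSession:
    """Hypothetical MDI-layer facade: one event is routed to every
    registered (synchronised) view via whichever backend is plugged in."""

    def __init__(self, backend: DisplayBackend) -> None:
        self.backend = backend
        self.views: list[str] = []

    def register_view(self, name: str) -> None:
        self.views.append(name)

    def broadcast(self, payload: dict) -> None:
        # Keep the views synchronised: every view receives the same event.
        for view in self.views:
            self.backend.show(view, payload)


backend = ConsoleBackend()
session = MultiSenseSession(backend)
session.register_view("ct_slice")
session.register_view("surface_model")
session.broadcast({"probe_position": (10.0, 4.2, -3.1)})
print(len(backend.log))  # prints 2: one delivery per synchronised view
```

The design point is only that the session layer talks to an abstract backend, so swapping the stand-in for a full visualisation environment requires no change to the library itself.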
Milestones:
M1. Identification of a general clinical problem to be considered representative;
M2. Deployment and integration at the test site of the various multi-sensorial technologies;
M3. Realisation of demonstrators to be used in the validation study;
M4. Availability of all data to demonstrate conclusively the validity of the MDI paradigm, plus all the technical detail required to design an effective application framework;
M5. Complete application release.
Funding scheme: CSC - Cost-sharing contracts
Partner locations: Salford (M5 4WT), Luton (LU1 3JU), Oxford (OX2 0JB)