
OPENINTERFACE

Exploitable results

In OpenInterface (OI), we studied multimodal interaction on mobile phones as well as on an augmented table at home, for example navigating a map to look for a restaurant. Given the vast range of possible interaction modalities, tools for integrating and combining those modalities become a real challenge, which the OI project addressed. Focusing on engineering multimodal interaction, the key results of the project are:

- The OI framework for the rapid development of multimodal interaction: the open source framework, available and fully documented at http://www.oi-project.org, consists of an underlying platform, the OpenInterface platform, and the OpenInterface Interaction Development Environment (OIDE). The key aspect of the underlying component-based platform is that it handles distributed heterogeneous components built with different technologies (Java, C++, Matlab, Python, .NET), allowing existing interaction modalities written in different languages to be integrated. The OIDE adds development tools offering access to interaction capabilities at multiple levels of abstraction: it includes a component repository, a graphical construction tool, and debugging and logging tools. Evaluations of the OIDE itself have also been conducted. A minimal sketch of this component-assembly idea is given after this list.

- The multimodal hub and multimodal browser: going beyond prototyping multimodal interaction on mobile phones with the OI framework (which runs on a PC), the multimodal hub and browser implement a new protocol (a standard extension) and run directly on mobile phones. There are J2ME and Android implementations as well as an open source Python implementation. These tools let us reuse OI components directly when developing multimodal interaction on mobile phones.

These tools have been used to explore a variety of multimodal interactions based on a large set of innovative devices, in the context of test beds (four versions of test beds for a multimodal game and for navigation in a large information space) and of two validators running entirely on mobile phones. This experience also led us to collect experimental data on multimodal interaction during the in-laboratory and in-field evaluations that we carried out.
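To illustrate the component-assembly style described above, the following Python sketch wires two hypothetical modality components (speech and touch) into a single map-navigation application through a trivial fusion handler. All class and method names here (Component, connect, emit, and so on) are assumptions made for illustration only; they do not reflect the actual OpenInterface platform or OIDE APIs, which additionally handle distributed components written in different languages.

    # Illustrative sketch only: names are hypothetical, not the OI API.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List


    @dataclass
    class Component:
        """A generic interaction component exposing named output ports."""
        name: str
        _subscribers: Dict[str, List[Callable]] = field(default_factory=dict)

        def connect(self, port: str, handler: Callable) -> None:
            # Wire an output port of this component to a downstream handler.
            self._subscribers.setdefault(port, []).append(handler)

        def emit(self, port: str, payload) -> None:
            # Push an event (e.g. a recognised speech command) downstream.
            for handler in self._subscribers.get(port, []):
                handler(payload)


    # Hypothetical components mirroring the kinds of modalities discussed above.
    speech = Component("speech_recognizer")   # e.g. wrapping a Java recogniser
    touch = Component("touch_gestures")       # e.g. wrapping a C++ gesture library
    map_app = Component("map_navigation")     # the task-level application


    def fuse_into_map(event) -> None:
        # Stand-in for multimodal fusion: forward any modality event
        # to the map application as a navigation command.
        print(f"{map_app.name} received: {event}")


    # Assemble the pipeline: both modalities feed the same application.
    speech.connect("command", fuse_into_map)
    touch.connect("gesture", fuse_into_map)

    # Simulate runtime events from the two modalities.
    speech.emit("command", {"verb": "zoom", "target": "restaurant district"})
    touch.emit("gesture", {"type": "pan", "dx": 12, "dy": -4})

Everything here runs in one Python process for brevity; in the real platform the components may live in separate processes and languages, with the platform handling the communication between them.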
