CORDIS - EU research results

Gestures in Tangible User Interfaces

Periodic Reporting for period 1 - GETUI (Gestures in Tangible User Interfaces)

Reporting period: 2015-11-01 to 2017-10-31

Coordinated by the Luxembourg Institute of Science and Technology (LIST) and funded by the EU programme Horizon 2020, the GETUI project applies user studies to investigate the use of gestures on Tangible User Interfaces (TUIs) in the context of technology-based assessment (TBA) of collaborative and complex problem solving skills. TUIs provide tangible representations of digital information, allowing users to control data by means of physical artifacts or objects. Gesture studies and TUI-based collaboration and assessment are closely connected: gesturing is a natural form of communication, and communication plays a crucial role in collaborative problem solving activities. Collaborative problem solving is defined as the capacity to recognise the perspective of other people in a group, to participate, to contribute knowledge, to recognise the need for contributions, and to build knowledge and understanding as a member of a collaborative setting.

Most research in technology-based assessment of collaborative problem solving skills has dealt with improving the assessment of traditional skills. However, the focus should shift to so-called “21st Century skills”: complex problem solving (CPS), creativity, critical thinking, learning to learn, and decision-making. ICT can be both the solution for assessing these skills and part of the problem, as there is a lack of scientific and practical knowledge about, on the one hand, adapting existing assessment models and, on the other hand, creating appropriate, authentic problem solving situations.

The main objective of GETUI is to explore the gestural performance of users interacting with physical objects on a TUI during a collaborative problem solving task, and to investigate its impact on 21st century skills. The physical objects are round and usually made of cardboard or laser-printed. Our TUI, a Multitaction product, is realized as a tangible tabletop (75 x 120 cm) that provides visual feedback in real time and displays effects with pictures and animations. Empirical user studies with 60 participants from public schools in Luxembourg assess collaborative complex problem solving and reasoning skills and yield a multimodal corpus of speech and gesture. This corpus will be annotated and, on that basis, a gesture taxonomy will be defined. Gestures are categorized as physical or free-hand gestures (mainly pointing or iconic) and manipulative gestures (rotating or tracing the objects on the TUI); the former concern human-human interaction, while the latter concern human-computer interaction. Finally, the outcomes of the study will indicate which interaction design aspects of TUIs are effective for measuring collaborative problem solving and will contribute to the development of assessment models and methods.
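The two-level categorization above (physical vs. manipulative, with pointing/iconic and rotating/tracing subtypes) can be sketched as a small annotation schema. This is an illustrative sketch only: the category names come from the text, but the class and field names are assumptions, not the project's actual annotation format.

```python
# Illustrative sketch of the gesture taxonomy described above.
# The Category/Subtype names follow the text; the dataclass layout
# (field names, timing in seconds) is a hypothetical annotation record.
from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    PHYSICAL = "physical"          # free-hand gestures: human-human interaction
    MANIPULATIVE = "manipulative"  # on-object gestures: human-computer interaction


class Subtype(Enum):
    POINTING = "pointing"  # physical
    ICONIC = "iconic"      # physical
    ROTATING = "rotating"  # manipulative
    TRACING = "tracing"    # manipulative


SUBTYPE_TO_CATEGORY = {
    Subtype.POINTING: Category.PHYSICAL,
    Subtype.ICONIC: Category.PHYSICAL,
    Subtype.ROTATING: Category.MANIPULATIVE,
    Subtype.TRACING: Category.MANIPULATIVE,
}


@dataclass
class GestureAnnotation:
    """One annotated gesture event in a multimodal corpus."""
    start_s: float    # start time in seconds
    end_s: float      # end time in seconds
    subtype: Subtype
    speaker: str      # participant identifier

    @property
    def category(self) -> Category:
        # The top-level category is derived from the subtype.
        return SUBTYPE_TO_CATEGORY[self.subtype]
```

Deriving the category from the subtype keeps each annotated event consistent with the taxonomy by construction.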
In the first six months of the project we prepared the planned user studies and conducted a trial user study with 15 participants. The preparation included, among other things, the implementation of a simulation-based micro-world scenario on the TUI. The scenario is about building a power grid: a group of three pupils (aged 15-18) is provided with three types of physical, laser-printed, round objects representing industrial facilities that produce electricity: a wind park, a solar park, and a fossil fuel power plant. Placing the objects on the TUI changes two parameters: i) electricity generation and ii) CO2 emission. The pupils are then asked questions, such as which facility generates the most power, which emits the most CO2, etc. In addition, they are asked to reach a target goal, e.g. 5.5 GW of electricity while emitting at most 2 million metric tons of CO2. In this case they discuss more and perform more gestures, resulting in more active collaboration. Apart from the design and development of this scenario, organisational activities were carried out, such as contacting public schools, recruiting participants, and preparing verbal/written instructions as well as pre- and post-test questionnaires. The pre-study, which took place in March, provided us with useful feedback; accordingly, we adapted the scenario and the instructions for the upcoming evaluation study.
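The micro-world logic described above (placed objects change electricity generation and CO2 emission, and a target goal must be reached) can be sketched as follows. The per-facility output and emission figures here are illustrative placeholders, not the values used in the actual GETUI scenario; only the example target (5.5 GW, at most 2 million metric tons of CO2) comes from the text.

```python
# Minimal sketch of the power-grid micro-world logic. The per-object
# figures below are assumed placeholders, not the scenario's real values.
FACILITIES = {
    # name: (electricity in GW per object, CO2 in Mt per object)
    "wind_park": (1.0, 0.0),
    "solar_park": (0.5, 0.0),
    "fossil_plant": (2.0, 1.5),
}


def grid_state(placed):
    """Return (total GW, total CO2 in Mt) for the objects on the tabletop."""
    gw = sum(FACILITIES[name][0] for name in placed)
    co2 = sum(FACILITIES[name][1] for name in placed)
    return gw, co2


def goal_reached(placed, target_gw=5.5, max_co2=2.0):
    """Check the example target: at least 5.5 GW with at most 2 Mt of CO2."""
    gw, co2 = grid_state(placed)
    return gw >= target_gw and co2 <= max_co2
```

With these placeholder figures, four wind parks plus one fossil fuel plant would meet the target, while three fossil fuel plants would reach the power goal but exceed the emission limit.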
As for the results achieved so far, we have four papers accepted at the following conferences and workshops: the International Gesture Conference, the International Conference on Interaction Design and Children, the Multimodal Corpora Workshop at the Language Resources and Evaluation Conference, and the Workshop on Movement and Computing. Moreover, a Master's student worked within GETUI for three months in the field of gesture recognition using the Kinect 2.0 depth-sensing camera. Finally, the PI has significantly extended her network, both through cooperation with employees of the host institution and public schools in Luxembourg and at the EU level (she is now a member of the COST European Network for Research Evaluation in the Social Sciences and the Humanities).
GETUI is innovative in that its approach addresses both the lack of tools and methods for assessing collaborative problem solving and interaction through gestures. The coupling of gestures with TUIs for collaborative problem solving assessment is a new direction in gesture studies. Many fields use gestures in their applications, such as telecommunications, entertainment, and healthcare, making the analysis and evaluation of gestures important both financially and socially. By filling this gap in scientific knowledge, GETUI’s results will be applicable in the international large-scale educational Programme for International Student Assessment (PISA), as well as in future projects and international collaborations. GETUI will also have an impact at a European level by creating new instruments for assessing 21st Century skills based on capturing behavioural indices of human-computer and human-human interactions in collaborative TBA situations. GETUI will contribute to the following fields: collaborative problem solving, gesture annotation, gesture evaluation, interaction design, localisation, multimodality, speech-gesture alignment, TBA, and TUIs.