B-reactable: multimodal tabletop system for collaborative physiology monitoring and training

Final Report Summary - B-REACTABLE (B-reactable: multimodal tabletop system for collaborative physiology monitoring and training)

Rapid development in biomedical engineering, especially in the Brain-Neural-Computer Interaction (BNCI) area, provides a solid technological base for new applications aimed at improving health and quality of life. In this project, which combined research and training components, Dr. Väljamäe, together with PhD student Sebastian Mealla, designed, validated and optimized a novel multimodal system, B-Reactable, linking a tangible musical tabletop interface with BNCI technology for collaborative physiology monitoring and training in future health and professional applications. This interdisciplinary research project builds on joint pilot work with Prof. Jorda, Universitat Pompeu Fabra, Barcelona, in 2010-11. In the envisioned B-Reactable applications, users will explicitly or implicitly learn to monitor and control their physiological signals using tangible objects and, hence, to understand and influence their cognitive or emotional states.

During the first year of the project, Dr. Väljamäe worked at the Department of Higher Nervous Activity and Psychophysiology, Saint Petersburg State University, collaborated with the Russian Academy of Sciences (Sechenov Institute of Evolutionary Physiology and Biochemistry) and continued to collaborate with Sebastian Mealla and Sergi Jorda at MTG-UPF. The project started with a thorough, published review of the available real-time EEG sonifications. At the same time, additional experiments were performed using the 2011 methodology; a between-groups analysis showed that the experience provided by B-Reactable is stronger than that of the conventional tangible interface (Reactable), and that fake physiology-based feedback significantly reduces “physiopuck experiences”. Finally, a new study using alpha-theta neurofeedback showed that a model-based approach to the sonification of EEG features can be used successfully. It was therefore decided to deploy the model-based sonification approach in the second-year B-Reactable prototype. As an EEG device, a 21-channel Mitsar amplifier from a Saint Petersburg company was selected and tested. Apart from the research results, the secondment period in Russia enriched the applicant's set of physiology-processing tools (e.g. WinEEG) and helped establish good connections with leading neuroscience labs in Moscow and Saint Petersburg.
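The alpha-theta neurofeedback approach mentioned above rests on two generic steps: estimating EEG power in the alpha (roughly 8-12 Hz) and theta (roughly 4-8 Hz) bands, and mapping the result onto a sound parameter. The following minimal Python sketch illustrates that generic pipeline only; the function names, the naive DFT-based power estimate and the alpha/theta-to-pitch mapping are assumptions for illustration, not the model-based sonification actually deployed in the B-Reactable prototype:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Average spectral power of `signal` between f_lo and f_hi Hz (naive DFT)."""
    n = len(signal)
    total, count = 0.0, 0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            im = sum(-x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
            total += (re * re + im * im) / n
            count += 1
    return total / count if count else 0.0

def sonification_pitch(alpha, theta, base=220.0, span=440.0):
    """Map the relative alpha dominance (0..1) onto a pitch in Hz."""
    ratio = alpha / (alpha + theta + 1e-12)
    return base + span * ratio

# Synthetic 1-second EEG-like trace: strong 10 Hz (alpha) plus weak 5 Hz (theta).
fs = 128
sig = [math.sin(2 * math.pi * 10 * t / fs) + 0.3 * math.sin(2 * math.pi * 5 * t / fs)
       for t in range(fs)]
alpha = band_power(sig, fs, 8, 12)
theta = band_power(sig, fs, 4, 8)
pitch = sonification_pitch(alpha, theta)  # higher pitch when alpha dominates
```

In a real-time setting such a mapping would run on short sliding windows of the EEG stream, with the pitch (or another synthesis parameter) updated each window.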

During the second year of the project, Dr. Väljamäe worked at the Decision, Emotion and Perception lab (DEP lab), Department of Behavioural Sciences and Learning, Linköping University, Sweden, while continuing to collaborate with MTG-UPF. In addition, a collaboration on EEG sonification in neurofeedback applications was established with Tony Steffert and Dr. Simon Holland from the Music Computing Lab, Open University, UK. The B-Reactable prototype was tested with novice users and professional Reactable DJs (https://vimeo.com/100705384). In parallel, together with the project coordinator, Prof. Västfjäll, the research plan was revised and a new paradigm for collaborative neurofeedback-based compassion training, aligned with DEP lab research, was conceived. In this paradigm, real-time monitoring of EEG activity using the sonifications developed earlier is combined with the presentation of emotional videos used for pro-social behaviour training. In addition, together with Mr. Smetana, director of the Centre for Art and New Technologies (CIANT), Dr. Väljamäe established a virtual platform connecting art and science activities, the Neuro Genetic Media Lab (http://www.ciant.cz/index.php/cz/ngmlab-cz). The technical framework for collaborative NF training is currently being used to create a new audio-visual media installation by Mr. Smetana, called BrainOpera, to be presented in 2015.