Periodic Reporting for period 4 - BODY-UI (Using Embodied Cognition to Create the Next Generations of Body-based User Interfaces)
Reporting period: 2019-11-01 to 2020-04-30
The second strand of work has concerned the design and prototyping of new and better body-based input (O2/WP2). We have characterized how users placed and recalled 30 items on the hand and forearm (Bergstrom-Lehtovirta et al., 2017). We have also focused on augmented and virtual reality because these technologies use the user's body extensively and because they allow us to modify the appearance of the user's body (by changing their representation in VR, their avatar). We have developed an approach to adapt an avatar iteratively (McIntosh et al., 2020), so that users become more effective at a task over a number of iterations.
Part of the work in the project has concerned body-based output, corresponding to objective O3/WP3. This part of the work has focused in particular on electrical muscle stimulation. We have developed BodyIO, a prototype wearable sleeve of 60 electrodes for asynchronous electromyography (EMG) and electrical muscle stimulation (EMS); see Knibbe et al. (2017). BodyIO enables the creation of multi-channel, multi-electrode stimulation patterns that produce a wide range of wrist, hand, and finger gestures through a single device. BodyIO also explores the interplay of reading and writing bio-electrical signals.
EMS is not the only modality we have explored. In particular, we have devised a system called Magnetips (McIntosh et al., 2019), which enables users to use a fingertip as input to a device, in the three-dimensional space around it. To achieve this, we affixed a magnet to the fingertip and used magnetic sensing to approximate the position of the finger relative to the device.
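To give a flavour of how magnetic sensing can yield a position estimate, the sketch below inverts a point-dipole falloff model, in which field magnitude decays with the cube of distance. The constant `K` and the single-magnitude sensing setup are illustrative assumptions, not the calibration or sensor layout actually used in Magnetips.

```python
# Hypothetical sketch: estimating fingertip distance from a magnetometer
# reading using a point-dipole falloff model, |B| = K / r^3.
# K is an assumed calibration constant, not a value from the Magnetips paper.

K = 1.2e-7  # assumed dipole constant (tesla * m^3), obtained by calibration


def distance_from_field(b_magnitude):
    """Invert |B| = K / r^3 to estimate the magnet's distance r in metres."""
    return (K / b_magnitude) ** (1.0 / 3.0)


# With these assumptions, a reading of 1.2e-4 T corresponds to about 0.1 m:
print(round(distance_from_field(1.2e-4), 3))  # → 0.1
```

A full system would combine several such sensors to triangulate a 3D position; this sketch only shows the magnitude-to-distance step.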
The project has also been concerned with how to evaluate body-based user interfaces, corresponding to project objective O4 and work package WP4. The most important result here is that we have introduced a measure for tool extension in HCI by using a visual-tactile interference paradigm (Bergström et al., 2019). In this paradigm, an index of tool extension is given by response-time differences between crossmodally congruent and incongruent stimuli: tactile on the hand and visual on the tool. We use this measure to show that findings on tool extension apply to interaction with computer-based tools.
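The index described above is a simple response-time difference; the following sketch illustrates the computation under stated assumptions. The function name and the sample trial values are made up for illustration and are not the authors' analysis code.

```python
# Illustrative sketch (not the published analysis): the tool-extension index
# from the visual-tactile interference paradigm, computed as the difference
# in mean response time between incongruent and congruent trials.


def tool_extension_index(congruent_rts, incongruent_rts):
    """Return mean(incongruent) - mean(congruent) in the same time unit.

    A positive value indicates crossmodal interference, suggesting the
    tool is processed as an extension of the body.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent_rts) - mean(congruent_rts)


# Hypothetical per-trial response times in milliseconds:
congruent = [412, 398, 405, 420]
incongruent = [455, 447, 460, 450]
print(tool_extension_index(congruent, incongruent))  # → 44.25
```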
Finally, our work has led to some broader ideas about what human-computer interaction is and the role of body-based user interfaces in that field (O5 and WP5). We have developed a meta-scientific account of HCI research as problem-solving (Oulasvirta & Hornbæk, 2016). We argued that most HCI research is about three main types of problem: empirical, conceptual, and constructive. We elaborated upon Laudan’s concept of problem-solving capacity as a universal criterion for determining the progress of solutions (outcomes).
Second, we have applied the problem-solving view to the concept of interaction (Hornbæk & Oulasvirta, 2017); a field-defining concept, yet curiously confused and underdefined. We argued that attempts to directly define interaction have not produced broadly agreed results. Still, we have extracted from the literature distinct and highly developed concepts, for instance describing interaction as dialogue, transmission, optimal behavior, embodiment, and tool use. These have helped us think about the scope of interaction and what good interaction is.