Periodic Reporting for period 3 - BabyRobot (Child-Robot Communication and Collaboration: Edutainment, Behavioural Modelling and Cognitive Development in Typically Developing and Autistic Spectrum Children)
Reporting period: 2018-01-01 to 2018-12-31
The BabyRobot project is a truly interdisciplinary effort bringing together experts from diverse areas working towards common goals. At the core of the proposed research agenda is cognitive robotics, i.e. designing and building robots that have adaptive communication and collaboration skills very much like humans do. For this purpose, we build the cognitive and communicative capabilities of robots layer by layer, in direct analogy to human cognition, as shown in Figure 1. Machine learning algorithms for speech and gesture recognition, speech understanding, socio-affective state recognition, planning and discourse modeling are some of the relevant component technologies for building such capabilities. In addition, in order to negotiate the semantics of the communicated information and the environment between the human and the robot, three more tools are needed: joint attention mechanisms, establishing common ground, and sharing goals and intentions. When one or more of these tools are missing or underperforming, communication, collaboration and learning are ineffectual for humans and machines alike. Children with autism spectrum disorder (ASD) are a prime example: their core recognition and understanding functions are intact, but their joint attention, common grounding and shared intentionality mechanisms are compromised, leading to poor performance in language learning, communication and collaboration tasks. This makes the ASD population a natural choice for this line of research, in which child and robot learn and enhance their communication capabilities hand in hand.
The BabyRobot project is centered on three use cases to identify, ground, develop and evaluate the relevant set of technologies, as well as their application to child-robot interaction scenarios, specifically: 1) natural child-robot interaction scenarios that showcase the joint attention, common ground and shared intentionality modules, 2) communication skill development and learning via tactile and language games, and 3) collaboration skill development and learning via dyadic and triadic interaction, using the robot as a mediator. The target populations are typically developing (TD) and ASD children, aged 6-10 years, interacting in their native language. Use cases 2 and 3 involve longitudinal studies in collaboration with educators (and therapists for ASD children) in order to formally evaluate and measure progress in the child's communication and collaboration skills. The following languages are addressed in BabyRobot: English, Danish, Swedish and Greek.
Work has focused on researching, developing, implementing and gradually integrating parts of the above modules, in particular:
- Multi-person localization and tracking from audio-visual input
- Gesture and speech recognition
- Social, cognitive and affective state tracking from audio-visual data
- Common grounding by employing conceptual networks
- On-line adaptation and learning to optimize robot control policies on the fly based on monitoring of human behavior
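As a purely illustrative sketch of the last module above, on-line adaptation of a robot control policy can be framed as an incremental value-learning loop: the robot tries interaction strategies, monitors a scalar feedback signal derived from human behavior, and shifts its choices toward strategies that score well. All names, actions and the feedback signal below are hypothetical assumptions for illustration, not the project's actual implementation:

```python
import random

class OnlinePolicyAdapter:
    """Toy epsilon-greedy bandit: the robot adapts its choice of
    interaction strategy from a scalar engagement signal."""

    def __init__(self, actions, epsilon=0.1, step_size=0.2):
        self.actions = list(actions)
        self.epsilon = epsilon        # exploration rate
        self.step_size = step_size    # learning rate for value updates
        self.values = {a: 0.0 for a in self.actions}

    def select_action(self):
        # Explore occasionally, otherwise exploit the best-valued action.
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.values[a])

    def update(self, action, reward):
        # Incremental update toward the observed reward
        # (here standing in for a monitored engagement score).
        self.values[action] += self.step_size * (reward - self.values[action])

# Hypothetical usage: the reward is a stand-in for monitored human behavior.
adapter = OnlinePolicyAdapter(["gesture", "speech", "gaze"])
for _ in range(100):
    a = adapter.select_action()
    reward = 1.0 if a == "speech" else 0.2   # toy feedback signal
    adapter.update(a, reward)
```

Over repeated interactions, the value estimates drift toward the strategies that elicit the strongest feedback, which is the essence of optimizing a policy "on the fly".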
Furthermore, significant results have been achieved in the following directions:
1) Application and pilot testing of all three use case scenarios, together with the respective data collection protocols and procedures.
2) Establishment of the general BabyRobot architecture under a modular, extendable and reproducible framework, as depicted in Figure 3.
3) Optimizing and releasing the parts of the codebase that are technology-ready.
4) Establishing a common research approach and assessment tools to systematically study a coherent variety of targeted social and cognitive skills in children with ASD.
Research results have also been achieved in the following scientific directions:
1) Joint attention modeling in human-robot interaction
2) Conceptual networks used for creating common ground as well as semantic representations for human-robot interactive systems
3) Robust situated algorithms for behavioral analysis using multimodal low- and mid-level cues (including paralinguistic cues motivated by social signal processing)
4) Intention reading and action recognition incorporating multimodal low-, mid- and high-level cues
5) On-line adaptation and reinforcement learning algorithms for optimal decision-making in human-robot collaboration settings
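To make direction 2 above more concrete, a conceptual network can be sketched as a weighted word-association graph, where the degree of common ground between two concepts is approximated by the overlap of their semantic neighborhoods. This is a deliberately minimal toy model under assumed names; it is not the project's actual grounding mechanism:

```python
from collections import defaultdict

class ConceptualNetwork:
    """Toy conceptual network: concepts are nodes, associations are
    symmetric weighted edges."""

    def __init__(self):
        self.edges = defaultdict(dict)   # word -> {neighbor: weight}

    def associate(self, a, b, weight=1.0):
        # Associations are symmetric in this toy model.
        self.edges[a][b] = weight
        self.edges[b][a] = weight

    def neighbors(self, word):
        return set(self.edges[word])

    def grounding_score(self, word, other):
        """Jaccard overlap of the two neighborhoods: a crude proxy for
        how much common ground two concepts share."""
        na, nb = self.neighbors(word), self.neighbors(other)
        if not na and not nb:
            return 0.0
        return len(na & nb) / len(na | nb)

# Hypothetical usage: "ball" and "toy" share the associates "round" and "play".
net = ConceptualNetwork()
net.associate("ball", "round")
net.associate("ball", "play")
net.associate("ball", "bounce")
net.associate("toy", "play")
net.associate("toy", "round")
score = net.grounding_score("ball", "toy")   # 2 shared of 3 total neighbors
```

A semantic representation of this kind lets a human and a robot negotiate what a referent means by comparing how it relates to concepts they both already know.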
Contributions towards technology development, as well as towards identifying wider socio-economic implications of BabyRobot technologies, include:
1. Validating via the proposed use cases that the studied human-robot interaction scenarios are relevant for ASD and TD children for learning communicative and collaborative skills.
2. Investigating the exploitation potential of BabyRobot technologies in different edutainment applications in schools and special education.
3. Conducting a series of intervention studies, involving children with ASD, after carefully designing specific study scenarios and protocols.
Results are very promising, demonstrating the potential of these technologies and opening up a rich variety of possibilities for applying this framework in further real-world settings.