Using robots to help teach children with special educational needs has become an increasingly popular concept over the past decade. Evidence suggests, for example, that some children on the autistic spectrum are more willing to interact naturally with a robot than with a human teacher. “The underlying idea is that robots appeal to the curiosity of children, attracting their attention more,” explains CybSPEED project coordinator Manuel Graña, a professor at the University of the Basque Country in Spain. “Children quite naturally accept robots as partners in the learning process. However, the technology behind human-robot interactions involving children is really still in its infancy.”
Improving robotic pedagogical tools
The CybSPEED project, which was undertaken with the support of the Marie Skłodowska-Curie Actions programme, sought to further develop and trial new robotic technology specifically tailored to special needs education. This involved designing human-robot pedagogical games and storytelling capabilities, as well as interfaces to encourage greater interaction. For example, speech recognition was improved in the NAO robot – a small humanoid robot designed to interact with people – by applying alternative solutions not available in the standard configuration. The project team also trialled eye-tracking and virtual reality technologies. “We then set about observing how children interacted with these enhanced robots,” says Graña. “We wanted to assess the interest of children, and see how communication with the robotic teacher could be improved.”
Measuring robot-human interactions
The robots – mostly based on the commercially available NAO robot – were trialled at a number of educational institutions, based on a carefully defined set of interactions. A key challenge was to accurately capture the impact of robot-human interactions, such as storytelling, on the children. To do this, the project team obtained EEG readings from children while they were engaged in an educational activity. This involved detecting electrical activity in the brain using a comfortable wireless device. “We used non-intrusive, wireless EEG products that could easily be worn,” explains Graña. “In this way, we were able to study neural responses in the most natural setting possible. Some of our most salient results so far have come from experimental work carried out in Bulgaria and Greece with large cohorts of children.”
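The article does not detail how the EEG recordings were analysed, but a common first step in studies of attention and engagement is to estimate power in standard frequency bands (such as the 8-13 Hz alpha band) from each channel. A minimal illustrative sketch in Python with NumPy, using a synthetic one-channel signal in place of real headset data (the sampling rate and band limits here are assumptions, not values from the project):

```python
import numpy as np

fs = 256  # assumed sampling rate (Hz) of a consumer wireless EEG headset
t = np.arange(0, 4, 1 / fs)  # 4 seconds of data

# Synthetic channel: a dominant 10 Hz alpha rhythm plus Gaussian noise,
# standing in for a real recording during an educational activity.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Power spectrum via the real FFT.
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size

# Relative alpha-band (8-13 Hz) power, one common engagement proxy.
alpha = psd[(freqs >= 8) & (freqs <= 13)].sum()
total = psd[freqs > 0].sum()
relative_alpha = alpha / total
```

Because the synthetic signal is dominated by a 10 Hz component, most of its power falls inside the alpha band; on real recordings the same ratio would be tracked over time and compared across task conditions.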
Easing the educational burden
Some preliminary findings on brain-computer interactions and measuring the neural responses of children have already been published. Data is still coming in, and the project team hopes to provide more publishable results in the near future. The project has also made data sets and open-source code available, to enable third parties to further advance the work pioneered in CybSPEED. “The results of this project will be of huge benefit to teachers, children and parents,” says Graña. “Robotic helpers can ease some of the burden on teachers of children with special needs. We have also shown that interactions with robotic teachers can motivate children and advance the learning process.” The project has helped to outline the research path ahead. More work is needed, for example, to make robots more robust, and further improvements in terms of communication – via speech, gestures or other communication channels – are required. “Forms of human communication are still rather difficult to replicate in robots,” Graña adds. Nonetheless, the project represents an important stepping stone towards the effective application of robot technologies in special needs education.
CybSPEED, robotic, education, autistic, children, pedagogical, neural