CORDIS - EU research results

Children and social robots: An integrative framework


Future friends? How children relate to social robots

With social robots predicted to become increasingly part of children’s everyday life, CHILDROBOT wanted to know how children actually interact with these technologies, why, and with what effects.

Digital Economy

Social robots – robots that interact with humans, often designed to resemble animals or toys – are used in a range of settings. While they appear in care homes to help manage conditions such as dementia, they are also used with children: for education, as therapy (for example, for children with autism) or to support children receiving hospital treatment. Yet according to Jochen Peter from the University of Amsterdam, while research in this area is lively, it is often narrowly focused. “We lack a conceptual framework because the focus has often been practice-oriented rather than theory-oriented,” he explains. “Research has often looked at children’s acceptance and use of social robots, separately from the consequences.”

Drawing on research from fields as diverse as robotics, psychology and communication science, the CHILDROBOT project, funded by the European Research Council, set out to fill in the blanks. “Acknowledging that non-human actors, such as social robots, may perform social role model functions traditionally the domain of humans, our findings highlight the importance of a more transparent discussion about robots, with more responsible design,” says Peter. A key project success was validating that its research methodology, including self-reporting measures, was applicable to young children. These measures are now increasingly used by other researchers, and some of the project’s findings, especially those exploring children’s feelings of closeness with or trust in a robot, have already been published.

Theoretical grounding, experiments and follow-up surveys

CHILDROBOT’s studies focused on over 1 600 children aged eight to nine, with the team drawing on interpersonal communication theories to help explain the processes and consequences of this new type of interaction. “For instance, children’s tendency to anthropomorphise robots and their perception that robots can understand what they think partly explain how and why children form friendships with them,” notes Peter. The team conducted experiments in which one child interacted with one robot, followed up by surveys. In the experiments, held in schools and museums, robots were preprogrammed to behave and speak with children in a certain way. Afterwards, children answered questions posed by trained interviewers. Some children also interacted with a social robot at home and were surveyed online every two weeks about the experience.

Three key insights

The team found that children’s acceptance of social robots is related more to their attitudes toward robots and their own social norms than to individual personality characteristics, such as being more anxious. Additionally, interaction with robots doesn’t seem to be significantly driven by how ‘useful’ they are considered to be. Shifting the focus from prosocial behaviour toward a robot to prosocial behaviour by a robot, CHILDROBOT also demonstrated that robots can enhance children’s prosocial behaviour by acting as social role models. Moreover, communication, such as a robot’s questions or sharing of (personal) information, can influence how children perceive and relate to robots. Transparency about a robot’s machine status can reduce trust in the robot and the sense of friendship with it.

Horizon scanning

CHILDROBOT’s results could help EU regulators assess this emerging technology before it becomes mainstream. “A simple application of our results could be that robots themselves tell children about their machine status, as part of their conversations with children,” adds Peter. “Given the rapid developments in generative AI, this should be a priority research area. AI comes with opportunities, such as more personalised interactions, alongside enormous risks, such as privacy violation and disinformation.” Having found that not all human-human communication research was applicable to child-robot interaction, Peter also suggests that a fruitful line of enquiry could be whether communication with robots follows different rules.

Keywords

CHILDROBOT, social robot, anthropomorphise, children, psychology, personality, communication, AI
