Recently there has been a lot of excitement about ‘social robots’, which appear to interact naturally with people by following programmed rules triggered by human behavioural cues. There are a number of scenarios where they could even replace humans, for example as more convenient or more cost-effective options, or in hazardous environments. One potential market is shopping malls, where the eye-catching novelty of robots, combined with a range of customer-focused functions, could increase consumer engagement in this highly competitive environment. Yet many social robots still fall short of real interaction, often serving as little more than glorified touchscreens. The EU-supported MuMMER project has designed a humanoid robot, based on SoftBank’s Pepper platform, which is able to interact autonomously and naturally with members of the public. “A problem with current consumer robots is that people expect more of a conversation. But with systems not set up to deliver this, they are often disappointed. When people approach the MuMMER robot they can have a real conversation,” says Mary Ellen Foster from the University of Glasgow, the project host and coordinator.
At the cutting edge of robot-human interaction
To understand the needs and challenges of multiple stakeholders, the team worked with customers, mall management and shop owners, who all co-designed the robot’s behaviour. These stakeholders also helped evaluate the robot’s performance. “These co-design sessions helped us understand that while some of our initial ideas were compelling, such as having a robot security guard, they were impractical for our chosen hardware platform,” explains Foster. “It became clear that a combination of guidance and chat, including support for selfies, was the appropriate model.” Consequently, the team concentrated on developing a mechanism for the robot to track and identify the bodily movements of people in close proximity, in order to determine whether they intended to engage, and to respond appropriately. In addition to running a social chatbot system to enable conversation, the robot was designed to guide customers to locations in the mall. Here, the team used ‘perspective taking’: drawing on a 3D model of the mall, the robot could give accurate directions when pointing to or referring to a landmark.
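To illustrate the kind of engagement detection described above, here is a minimal sketch in Python. All names and thresholds here are hypothetical illustrations, not the project’s actual code: it assumes a perception pipeline already supplies each tracked person’s distance, approach speed and facing direction, and applies a simple heuristic to decide whether they intend to engage.

```python
from dataclasses import dataclass

@dataclass
class BodyTrack:
    """One tracked person, in a robot-centred frame (hypothetical structure)."""
    distance: float        # distance from the robot, in metres
    approach_speed: float  # positive = moving towards the robot, in m/s
    facing_offset: float   # angle between the person's facing and the robot, in radians

def intends_to_engage(track: BodyTrack,
                      max_distance: float = 2.5,
                      min_approach: float = 0.1,
                      max_facing: float = 0.5) -> bool:
    """Heuristic engagement test: close enough, facing the robot, and either
    approaching or already standing right in front of it."""
    if track.distance > max_distance:
        return False
    if track.facing_offset > max_facing:
        return False
    # A stationary but very close, facing person also counts as engaged.
    return track.approach_speed >= min_approach or track.distance < 1.2
```

A robot controller could poll this for every tracked person and greet only those for whom it returns `True`, ignoring passers-by who are merely walking past the information desk.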
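The direction-giving step can be sketched in a similar spirit. The following is a simplified illustration, not the project’s implementation: given a pose and a landmark position from a 2D map (standing in for the 3D mall model), it computes the landmark’s relative bearing and turns it into a verbal direction. Evaluated from the visitor’s pose rather than the robot’s, the same function would take the visitor’s perspective.

```python
import math

def direction_to_landmark(x: float, y: float, heading: float,
                          landmark_x: float, landmark_y: float) -> str:
    """Return a verbal direction from a pose (x, y, heading) to a landmark,
    all in the mall's map frame. Heading 0 faces +x; +y is to the left."""
    # Bearing of the landmark relative to the current heading.
    bearing = math.atan2(landmark_y - y, landmark_x - x) - heading
    # Normalise into (-pi, pi].
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))
    if abs(bearing) < math.pi / 8:
        return "ahead"
    if abs(bearing) > 7 * math.pi / 8:
        return "behind you"
    return "to your left" if bearing > 0 else "to your right"
```

In a real system the utterance would also drive a pointing gesture towards the same bearing, so that speech and gesture stay consistent.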
The MuMMER system was deployed next to the information desk in the Ideapark shopping mall in Finland for several hours a day, for 14 weeks. To evaluate success, the project team conducted a survey amongst the key stakeholders, generating a wealth of data for future developments. While the results are still being analysed, broadly speaking the respondents were positive about the robot, despite some technical challenges in language processing and computer vision. “While the final version was technically limited, albeit eventually a fluent Finnish speaker, we still believe that this approach is the best long-term solution for successful social robots in public spaces,” notes Foster. The MuMMER team blended technical expertise from across specialisms including audiovisual sensing, social signal processing, interaction management, navigation and localisation. Crucially, the consortium also included both Ideapark and SoftBank Robotics Europe. Much of the underlying code developed by MuMMER is currently available as reusable open-source software for robotic components. The whole system could be made available to partners working with the project team to further explore robot-human interaction in public spaces. The analysis of the long-term interaction is due to be completed by the end of this year.
MuMMER, shopping mall, robot, Pepper, interaction, conversation, social signal, chatbot, COVID