IST Results – feature articles
- ACTIPRET: Interpreting and Understanding Activities of Expert Operators for Teaching and Education
- CogVis: Cognitive Vision Systems
- CogViSys: Cognitive Vision Systems
- COSPAL: Cognitive Systems – Perception, Action, Learning
- CoSy: Cognitive Systems for Cognitive Assistants
- ECVision: European Research Network for Cognitive Computer Vision Systems
- Feelix Growing: Feel, interact, express: a global approach to development with interdisciplinary grounding
- LAVA: Learning for Adaptable Visual Assistants
- MACS: Multisensory Autonomous Cognitive Systems
- RobotCub: Robotic open-architecture technology for cognition, understanding and behaviours
- SENSOPAC: Sensorimotor structuring of perception and action for emergent cognition
- SPARK: Spatial-Temporal Array Computer-based structure
- VAMPIRE: Visual Active Memory Processes and Interactive REtrieval
Projects in other media
Feelix Growing: FEEL, Interact, eXpress: a Global appRoach to develOpment With INterdisciplinary Grounding
Recently, the work of Feelix Growing was featured on the TV programme Futuris on Euronews, one of the premier news services in Europe. An 8:30-minute video of the broadcast is available online.
Patients recovering from surgery or injuries may soon be able to physically play their way to a full recovery with intelligent robotic systems that generate specialized games to challenge the human body's abilities.
Henrik Hautop Lund, a robotics and artificial-intelligence professor at the University of Southern Denmark, is developing therapy tiles that guide patients through physical routines and help them heal.
Each tile is a miniature robotic system employing neural networks, and the whole set-up looks like an elaborate, electronic version of Twister. As patients step on the tiles or press them with their hands, the tiles give feedback, indicating whether the pressure is firm enough or whether the user is moving quickly enough. Individuals can play alone, or up to four patients can compete against each other. The tiles can be assembled in any configuration on the walls and floor to create an intelligent game space.
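The feedback loop described above can be sketched in a few lines. This is a hypothetical illustration only, not the tiles' actual firmware: the threshold and time-limit values are assumptions chosen for the example.

```python
# Hypothetical sketch of a therapy tile's feedback logic (illustrative only):
# each press is judged on whether it was firm enough and fast enough.

PRESSURE_THRESHOLD = 0.6   # assumed normalised pressure for a "firm" press
RESPONSE_LIMIT_S = 2.0     # assumed time budget for reaching the active tile

def tile_feedback(pressure: float, response_time_s: float) -> str:
    """Return a feedback message for one press on a tile."""
    if pressure < PRESSURE_THRESHOLD:
        return "press harder"
    if response_time_s > RESPONSE_LIMIT_S:
        return "move faster"
    return "good"
```

In a real system a learned model, rather than fixed thresholds, would adapt these criteria to each patient's ability, which is where the neural networks mentioned above come in.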
A robot with empathy sounds like the stuff of sci-fi movies, but with the aid of neural networks, European researchers associated with the ‘Feelix Growing project’ are developing robots in tune with human emotions.
The project is developing software that will lead to robots able to learn when a person is happy, sad or angry.
The learning part is achieved through the use of artificial neural networks, which are well-suited to the varied and changing inputs that ‘perceptive’ robots would be exposed to.
- Quoted from 'Emotional robots' (© Mumbai Mirror 2008)
European researchers are developing software that they hope will one day make it possible for a robot to learn when a person is sad, happy, or angry.
The work is being done by the Feelix Growing project, which involves six countries and 25 robotics experts, developmental psychologists, and neuroscientists. Along with the software, the project is developing "neural networks" of cameras and sensors that help the robots detect a person's facial expressions, voice, proximity, and other parameters to determine emotional state, according to ICT Results, a news service created to showcase European Union-funded research.
Researchers in the 3-year-old project are building demonstration robots as proofs of concept. One of the demos follows researchers around, learning from experience when to trail behind or stick close. Researchers also are working on a robot face that can simulate emotional expressions.
Some say a smile (as well as a picture) is worth a thousand words. This just might be true indeed for a group of researchers involved in a groundbreaking project: teaching robots how to interact emotionally with humans.
The researchers are creating a robot that can read human expression, detecting when a person is happy, sad, scared or angry, then reacting accordingly.
Though this may look like the sort of stuff found in the movies, it is actually a very practical endeavour, say scientists.
Dr Lola Canamero, the coordinator of the project, entitled Feelix Growing, says that for machines to take on bigger roles in society, such as domestic work, tending to the sick, the elderly and people with autism, or even just entertainment, they need to learn how to adapt to people's behaviours.
Project COSPAL: Cognitive Systems - Perception, Action, Learning
European computer scientists have developed a new kind of cognitive robot. This learning robot employs artificial neural networks to manage low-level functions based on the visual input it receives, and classical AI as a supervisory mechanism. So far, the robot can solve puzzles and other complex tasks by itself, with no additional programming. The project has been so successful that the European Union decided to fund 13 additional projects based on its results.
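The hybrid design described above can be illustrated with a minimal sketch. This is not COSPAL's actual code: the feature weighting, the tanh squashing, and the symbolic state keys are assumptions made for the example, standing in for a trained network and a real planner.

```python
# Illustrative sketch of a hybrid architecture (not COSPAL's implementation):
# a learned low-level mapping from visual features to a motor command,
# supervised by a rule-based layer that selects the current subgoal.
import math

def low_level_policy(features, weights):
    """Stand-in for the neural network: weighted sum squashed to [-1, 1]."""
    activation = sum(f * w for f, w in zip(features, weights))
    return math.tanh(activation)

def supervisor(state):
    """Stand-in for the classical-AI layer: pick the next subgoal
    from a symbolic description of the task state."""
    if not state["object_grasped"]:
        return "grasp"
    if not state["object_placed"]:
        return "place"
    return "done"
```

The point of the split is that the network layer copes with noisy, continuous sensor data, while the symbolic layer keeps the overall task sequence explicit and easy to inspect.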
Adaptive systems which learn about their environment in a similar way to a toddler exploring its surroundings could form the heart of flexible robots, road-traffic monitoring and surveillance systems. A European project called COSPAL (Cognitive Systems - Perception, Action, Learning), led by Sweden's Linköping University, is developing learning systems to make automated applications more flexible and adaptable.
- Quoted from 'Early learning centre' (© 2008 The Engineer)
Language-learning techniques designed for children are being used in a bid to break new ground by developing algorithms that enable robots to learn and understand concepts.
As part of the project by Plymouth University researchers, two robots will be built featuring software that allows them to interact with each other to exchange learned information like humans.
The work at Plymouth is the latest in a number of ambitious initiatives to apply human learning processes to robotic systems. Earlier this year (7 April) The Engineer reported how the European COSPAL project plans to help robots learn to carry out actions. The Plymouth project, meanwhile, concentrates on word meaning.
Tony Belpaeme at Plymouth said: 'Robots still don't know the meaning of things. The only techniques we have at the moment are using mathematical tricks and statistics to produce more or less sensible replies.'
Robotic technology is advancing apace. Now, European researchers have developed a new breed of cognitive robot which they claim is a lot like a puppy. They built it by combining two approaches to making robots that can think for themselves: classical rule-based artificial intelligence (AI) and artificial neural networks (ANNs).
- The Austrian magazine "Forschen & Entdecken" published a report (© Forschen & Entdecken 2007) on the project Robots@home (632KB).
Project SENSOPAC: Sensorimotor structuring of Perception and Action for emergent Cognition
The race to create more human-like robots stepped up a gear this week as scientists in Spain set about building an artificial cerebellum. The end-game of the two-year project is to implant the man-made cerebellum in a robot to make movements and interaction with humans more natural.
The cerebellum is the part of the brain that controls motor functions. Researchers hope that the work might also yield clues to treat cognitive diseases such as Parkinson's. The research, being undertaken at the Department of Computer Architecture and Technology at the University of Granada, is part of a wider European project dubbed Sensopac.
- Quoted from 'Move to create less clumsy robots' (© BBC 2007)
Movies have convinced people that humanoid robots are realistic, but European researchers working on a path-breaking project know all too well how difficult it is to build robots with even basic human abilities.
“Hollywood did a bad job for us,” says Patrick van der Smagt, the coordinator of Sensopac, an EU-funded project whose goal is to create a robotic arm, hand and brain with human-like capabilities.
“Existing robots, such as those that help assemble cars or computers, can perform repetitive actions quickly and precisely. However, they are not very intelligent or flexible and they don’t do very much sensing,” says van der Smagt.
European researchers have created a robotic hand that mimics the flexibility and sensitivity of a human hand, and is controlled by a neural-network-based program modelled on the cerebellum.
The hand developed as part of the research project Sensopac — being run by 12 groups — can grasp an egg, snap its fingers, and carry coffee.
Experts at the German Aerospace Centre (DLR) have revealed that they made a robotic "skin" out of a thin, flexible carbon that changes its resistance depending on pressure.
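A pressure-dependent resistance like the one described above is typically read out through a voltage divider. The sketch below is an assumption-laden illustration, not DLR's design: the supply voltage, divider resistor, and the inverse resistance-pressure relation are all made up for the example.

```python
# Illustrative only: one way firmware might estimate pressure from a
# resistive skin patch whose resistance drops under pressure. All constants
# and the inverse relation p ~ k / R are assumptions, not DLR's design.

V_SUPPLY = 3.3        # assumed supply voltage (volts)
R_DIVIDER = 10_000.0  # assumed fixed divider resistor (ohms)

def sensor_resistance(v_out: float) -> float:
    """Recover the skin patch's resistance from a voltage-divider reading."""
    return R_DIVIDER * (V_SUPPLY - v_out) / v_out

def pressure_estimate(v_out: float, k: float = 1e5) -> float:
    """Map resistance to a pressure estimate, assuming p ~ k / R."""
    return k / sensor_resistance(v_out)
```

A real skin would be calibrated against known loads rather than relying on an assumed formula, but the divider-plus-mapping structure is the usual pattern for this kind of sensor.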
Project SPARK: Spatial-temporal patterns for action-oriented perception in roving robots
On 26 May 2006, the Italian TV programme TG Leonardo showed the first robot prototype of the project in action, followed by an explanation of the European network and the know-how behind it.
The natural world has inspired European researchers to develop intelligent, autonomous robots that not only move like insects, but also react like them.
As part of the EU-funded spatial-temporal array computer-based structure (SPARK) project, researchers created three robots to demonstrate their electronic and mechanical innovations — one featuring the complex 'brain' algorithms and two robots housing complicated control processing systems.
'The idea was to have a computational model of an insect brain embedded into a simple wheeled robot, and the other robots have six legs to mimic insect architectures,' said Prof Paolo Arena, project co-ordinator at the University of Catania in Italy.
- Quoted from 'Insects SPARK robot idea' (© The Engineer 2008)
An EU-funded research project into intelligent robots has brought commercial benefit to Budapest-based Analogic Computers, a partner in the project.
The three-year, 1.27M euro ($2M) spatial-temporal array computer-based structure (SPARK) project developed a new architecture for artificial cognitive systems that could help robots react to changing environmental conditions. Analogic, which specializes in image processing, developed algorithms for SPARK based on human vision.
Project COSY: Cognitive Systems for Cognitive Assistants
Researchers in artificial intelligence (AI) at the University of Birmingham are participating in a €6.25m, four-year European project to develop a cognitive robot. One of the project's aims is to help throw some light on human cognition. The plan is to take the various AI systems that have so far been realised in some form or other ('natural language' systems that process human voice inputs and can use bits of our grammar, and machine vision) and create a robot that combines those cognitive abilities.
- Quoted from 'Birmingham in €6m AI project' by Harry Yeates (© 2005 Reed Business Information Limited)
"Drilling is but one of hundreds of tasks where requirements can be conveyed to the dexterous robot system by humans selecting the surface points," Skaar said. Skaar's appealingly practical approach to the robot vision problem appears to be less ambitious than other programs such as the European Union's Cognitive Systems for Cognitive Assistants. This program, also known as CoSy, seeks ways of raising the intelligence of robots from their current insect-like level to that of a preschooler.
- Quoted from 'Eyes of Mars' by Paul Sharke (© 2004 The American Society of Mechanical Engineers)
- The article in Hungarian 'Robotokkal az emberi megismerés nyomában' (© 2004 AITIA International Rt)
- The article in Czech 'Inteligentní lednička' (© 2004 eF-Futurologie)
- The article in Russian (© 2004 Конкурс Русских Инноваций)
Sloman and fellow researchers around Europe are primed to take the next step in the ongoing search for more intelligent robots, thanks to a grant of €6.25m (£4.3m) from the European Union. One of his colleagues in Birmingham, Dr Jeremy Wyatt, explains: "We think experiments so far have had limited objectives. What has not been done is to put together, in a working robot, the things that humans can do - seeing, manipulating, hearing, learning and answering questions."
- Quoted from 'Welcome to the next generation of robots' by Chris Arnot (© Guardian Newspapers Limited 2006)
"We've succeeded to an extent in getting the robot to understand in a very simple way the references to the objects, in terms of their type, some of their properties - simple things like colour," said Dr Jeremy Wyatt. "It's about talking about objects, being able to make spatial references to them, being able to understand spatial references, and essentially linking the ability to perceive an object by vision and being able to understand an utterance about the object." The aim of the four-year Cognitive Systems for Cognitive Assistants (CoSY) project is to combine natural language processing with vision, adding 'attention' to the robot's behaviour.
- Quoted from 'Robot capable of identifying objects by simple properties' by Harry Yeates in ElectronicsWeekly.com (© 2006 Reed Business Information Limited)
European researchers are making progress on piecing together a new generation of machines that are more aware of their environment and better able to interact with humans. While building robots with anything akin to human intelligence remains a far-off vision, making them more responsive would allow them to be used in a greater variety of sophisticated tasks in the manufacturing and service sectors. Such robots could also be used as home helpers and caregivers, for example.
Robotic engineers have long been trying to develop robots near to having human intelligence, and now European researchers are working towards piecing together a new generation of machines that are more aware of their environment and better able to interact with humans. This may enable scientists to make robots more responsive and allow them to be used for a number of sophisticated tasks in the manufacturing and service sectors or as home-helpers and caregivers.
- Quoted from 'Scientists putting together the next generation of cognitive robots' (© Yahoo India 2008)
Project PHRIENDS: Physical Human-Robot-Interaction - Dependability and Safety
The Italian daily newspaper La Repubblica published an article on PHRIENDS (© 2008 La Repubblica).
Project ROBOT-CUB: Robotic Open-architecture Technology for Cognition, Understanding and Behaviours
One approach that robotics researchers are using to improve the intelligence of their robots is to give them the capacity to learn. For example, learning by experience is at the heart of an international collaboration called RobotCub, which is funded by the European Union and involves 16 labs from Europe, Japan and the US. The idea of the project is to build over the next five years a child-sized humanoid robot that will get smarter as it learns by interacting with its environment, just as human children do. One research team involved in the project is led by Kerstin Dautenhahn from the University of Hertfordshire in the UK. She feels that much more needs to be done right now in terms of understanding how people respond to having robots around. "The field of human-robot interaction is in its infancy, there is still a lot of groundwork to be done," she says.
- Quoted from 'Men are from Mars, robots are from Mitsubishi' by Stephen Pincock (© 2005 The Financial Times Ltd)
- The article in German 'Humanoide wird Versuchstier der Kognitionsforschung' by Michael Vogel (© 2005 Konradin IT-Verlag GmbH)
- The article in Polish 'Od manipulacji do inteligencji' by Grzegorz Wieczerzak (© 2005 Gazeta IT)
- The article in German 'ASIMO trifft Mitglieder des Europaparlaments' (© 2005 Honda)
Robots, like children, will soon learn best from their own experiences, according to a team of EU scientists working on a new robot platform.
The team behind the EU-funded RobotCub project, which designed the iCub robot, discovered that teaching robots to understand enough to act independently is more difficult than initially believed.
The grand green Apennine Mountains fill the windows at the University of Genoa's Laboratory for Integrated Advanced Robotics, but otherwise it isn't that different from the other labs: As Europe's preeminent robotics facility and one of the world's epicenters of artificial intelligence research, it's dominated by eggheads staring at monitors. And, of course, there's an android hanging around the place.
The size and shape of a 3-year-old, RobotCub has two five-fingered hands, each of which will be covered with sensitive artificial skin made of the same stuff as the iPod's electrostatic touchwheel. It has expressive eyes, a white plastic shell that makes it look like Casper the Friendly Ghost, and a tether that runs from its back like an electronic umbilical cord into an adjacent room, where it connects to a few dozen PCs.
Researchers across Europe are becoming parents to bouncing baby robots. By teaching them to walk, open doors, shake hands, and even talk, scientists hope to figure out how human children learn to do the things we adults take for granted.
The team behind the iCub robot believes that robots, like children, learn best from experience. Like a toddler who progressively learns about his own motor skills and how to interact with the world, the iCub—the size of a 3-year-old child, with sensor-equipped hands, eyes, and ears—has touch, sight, and hearing to explore its surroundings and develop its cognitive abilities.
The iCub is the baby of RobotCub, the European Union–funded project that aims to advance research on the use of humanoid robots to understand human learning. Scientists in Europe and beyond believe humanoids can be essential tools in the study of human intelligence, which many of them argue is linked to the structure of the human body and the way it can interact with its surroundings.
- Quoted from 'Open-Source Baby' (© IEEE Spectrum Online 2008)
Researchers from across Europe are being trained in Genoa, Italy, this month in advance of taking possession of their very own iCub: a robot designed to have the physical and sensory capabilities of a two-and-a-half-year-old child.
For the researchers involved, one crucial characteristic of the new robot is that both the hardware and software are open-source and designed for easy collaboration. Whether the researchers build better cognitive architectures, learning algorithms, sensors or limbs, once their work has been proved on the European Commission-funded iCub, it can be shared and used to improve the next generation of machines.
Project ITALK: Integration and Transfer of Action and Language Knowledge in Robots
Over the next four years robotics experts will work with language development specialists who research how parents teach children to speak. Their findings could lead to the development of humanoid robots which learn, think and talk. The project is believed to be the first of its kind in the world and typical experiments with the iCub robot will include activities such as inserting objects of various shapes into the corresponding holes in a box, serialising nested cups and stacking wooden blocks.
- Quoted from 'Plan to teach baby robot to talk' (© 2008 BBC)
- Another English article on this project: 'Scientists to teach 'toddlerbot' to speak' (© 2008 The Daily Telegraph)
- A German article on this project: 'Humanoider Roboter soll wie ein Kind Sprache lernen' (© 2008 Heise Verlag)
Project COGVIS: Cognitive Vision Systems
The FP5 project CogVis has fed back into the cognitive science community by providing an annotated image database to the FP6 project PASCAL, which is now using it as part of its "challenge" to scientists to come up with ways to classify visual objects automatically.
- See 'Combined Object Categorization and Segmentation with an Implicit Shape Model' by Bastian Leibe, Ales Leonardis, Bernt Schiele (ECCV 2004 Workshop on Statistical Learning in Computer Vision)