Community Research and Development Information Service - CORDIS

STRANDS

Project reference: 600623
Funded under: FP7-ICT

Spatio-Temporal Representations and Activities For Cognitive Control in Long-Term Scenarios

From 2013-04-01 to 2017-05-31, ongoing project

Project details

Total cost: EUR 10 774 552
EU contribution: EUR 8 234 543
Coordinated in: United Kingdom
Call for proposal: FP7-ICT-2011-9
Funding scheme: CP - Collaborative project (generic)

STRANDS aims to enable a robot to achieve robust and intelligent behaviour in human environments through exploitation of long-term experience. The approach is based on understanding 3D space and how it changes over time, from milliseconds to months. The project will develop control mechanisms which yield adaptive behaviour in highly demanding, real-world security and care scenarios. The robots will be able to run for significantly longer than current systems. Long runtimes provide previously unattainable opportunities for a robot to learn about its world. Society will benefit as robots become more capable of assisting humans, a necessary advance given the demographic shifts in the health industry.

Objective

STRANDS aims to enable a robot to achieve robust and intelligent behaviour in human environments through adaptation to, and the exploitation of, long-term experience. Our approach is based on understanding 3D space and how it changes over time, from milliseconds to months. We will develop novel approaches to extract spatio-temporal structure from sensor data gathered during months of autonomous operation. Extracted structure will include recurring 3D shapes, objects, people, and models of activity. We will also develop control mechanisms which exploit these structures to yield adaptive behaviour in highly demanding, real-world security and care scenarios.

The spatio-temporal dynamics presented by such scenarios (e.g. humans moving, furniture changing position, objects (re-)appearing) are largely treated as anomalous readings by state-of-the-art robots. Errors introduced by these readings accumulate over the lifetime of such systems, preventing many of them from running for more than a few hours. By autonomously modelling spatio-temporal dynamics, our robots will be able to run for significantly longer than current systems (at least 120 days by the end of the project). Long runtimes provide previously unattainable opportunities for a robot to learn about its world. Our systems will take these opportunities, advancing long-term mapping, life-long learning about objects, person tracking, human activity recognition and self-motivated behaviour generation. The extraction of structure is key to this, as it both captures potential meaning and compresses a robot's sensor data into representations capable of storing months of experience in a manageable form.

We will integrate our advances into complete cognitive systems to be deployed and evaluated at two end-user sites: a care home for the elderly in Austria, and an office environment patrolled by a security firm in the UK. The tasks these systems will perform are impossible without long-term adaptation to spatio-temporal dynamics, yet they are tasks demanded by early adopters of cognitive robots. We will measure our progress by benchmarking these systems against detailed user requirements and a range of objective criteria, including measures of system runtime and autonomous behaviour.

STRANDS will produce a wide variety of results, from software components to an evaluation of robot assistants for care staff. These results will benefit society in a range of ways: researchers will be able to access our results as open-access papers, software and data; our methodology for creating long-running robots will encourage roboticists to tackle this unsolved problem in our field; industrialists will see how cognitive robots can play a key role in their businesses, and access prototypes for their own use; and society will benefit as robots become more capable of assisting humans, a necessary advance due to, for example, the demographic shifts in the health industry.
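To make the idea of a spatio-temporal representation concrete, the sketch below shows one simple way a map cell could model how its occupancy varies over time, so that regular changes (people present during working hours, for example) are predicted rather than treated as anomalous readings. This is an illustrative toy only, not STRANDS project software: the class names, the single daily harmonic, and the online update rule are assumptions made for the example.

# Illustrative sketch only (not STRANDS software): each map cell keeps a
# tiny time-dependent occupancy model -- a long-term mean plus one daily
# periodic component -- fitted online from timestamped observations.
import math
from collections import defaultdict

DAY = 24 * 3600.0  # assume a single dominant (daily) rhythm for the example


class SpatioTemporalCell:
    """Occupancy of one grid cell as a function of time."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0  # long-term occupancy probability
        self.re = 0.0    # running mean of observation * cos(phase)
        self.im = 0.0    # running mean of observation * sin(phase)

    def update(self, occupied, t):
        """Fold one timestamped observation (True = occupied) into the model."""
        x = 1.0 if occupied else 0.0
        phase = 2.0 * math.pi * t / DAY
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.re += (x * math.cos(phase) - self.re) / self.n
        self.im += (x * math.sin(phase) - self.im) / self.n

    def predict(self, t):
        """Predicted probability that the cell is occupied at time t."""
        phase = 2.0 * math.pi * t / DAY
        p = self.mean + 2.0 * (self.re * math.cos(phase) + self.im * math.sin(phase))
        return min(1.0, max(0.0, p))


class SpatioTemporalMap:
    """Sparse grid in which every cell has its own temporal occupancy model."""

    def __init__(self):
        self.cells = defaultdict(SpatioTemporalCell)

    def update(self, cell_id, occupied, t):
        self.cells[cell_id].update(occupied, t)

    def predict(self, cell_id, t):
        return self.cells[cell_id].predict(t)


if __name__ == "__main__":
    # A corridor cell observed hourly for a month, occupied only 09:00-17:00.
    m = SpatioTemporalMap()
    for day in range(30):
        for hour in range(24):
            m.update("corridor_7", occupied=(9 <= hour < 17), t=day * DAY + hour * 3600.0)
    print(m.predict("corridor_7", 30 * DAY + 12 * 3600.0))  # midday -> high
    print(m.predict("corridor_7", 30 * DAY + 3 * 3600.0))   # 03:00 -> low

In the same spirit, richer models (more harmonics, or per-object rather than per-cell state) can compress months of observations into a handful of parameters per cell, which is what makes month-scale experience storable and usable in practice.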

Coordinator

THE UNIVERSITY OF BIRMINGHAM
United Kingdom

EU contribution: EUR 1 601 509


Edgbaston
B15 2TT BIRMINGHAM
United Kingdom
Administrative contact: May Chung
Tel.: +441214158202
Fax: +441214146056

Participants

TECHNISCHE UNIVERSITAET WIEN
Austria

EU contribution: EUR 1 331 642


KARLSPLATZ
1040 WIEN
Austria
Administrative contact: Markus Vincze
Tel.: +431 58801376611
Fax: +431 5880137697
AKADEMIE FUR ALTERSFORSCHUNG AM HAUS DER BARMHERZIGKEIT
Austria

EU contribution: EUR 523 958


SEEBOCKGASSE
1160 WIEN
Austria
Administrative contact: Christoph Gisinger
Tel.: +43 1401991112
RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Germany

EU contribution: EUR 1 399 208


Templergraben
52062 AACHEN
Germany
Administrative contact: Ernst Schmachtenberg
Tel.: +49 241 8090490
Fax: +49 241 8092490
KUNGLIGA TEKNISKA HOEGSKOLAN
Sweden

EU contribution: EUR 1 437 427


Valhallavaegen
10044 STOCKHOLM
Sweden
Administrative contact: Friné Portal
Tel.: +46 87909227
Fax: +46 87230302
UNIVERSITY OF LEEDS
United Kingdom

EU contribution: EUR 1 049 541


WOODHOUSE LANE
LS2 9JT LEEDS
United Kingdom
Administrative contact: Martin Hamilton
Tel.: +44 113 3434090
Fax: +44 113 3430949
G4S TECHNOLOGY LIMITED
United Kingdom

EU contribution: EUR 17 600


CHALLENGE HOUSE - INTERNATIONAL DRIVE
GL20 8UQ TEWKESBURY
United Kingdom
Administrative contact: David Ella
Tel.: +44 1684277032
UNIVERSITY OF LINCOLN
United Kingdom

EU contribution: EUR 873 658


Brayford Pool
LN6 7TS LINCOLN
United Kingdom
Administrative contact: Carolyn Williams
Tel.: +44 1522 886642
Record Number: 107156 / Last updated on: 2016-04-01