Project description
Cognitive Systems and Robotics
IntellAct addresses the problem of understanding and exploiting the meaning (semantics) of manipulations in terms of objects, actions and their consequences for reproducing human actions with machines. This is in particular required for the interaction between humans and robots, in which the robot has to understand the human action and then transfer it to its own embodiment. IntellAct will provide the means for this transfer not by copying the movements of the human but by transferring the human action on a semantic level. IntellAct will demonstrate the ability to understand scene and action semantics and to execute actions with a robot in two domains: first, in a laboratory environment (exemplified by a lab on the International Space Station (ISS)), and second, in an assembly process in an industrial context.

IntellAct consists of three building blocks:
- Learning: abstract, semantic descriptions of manipulations are extracted from video sequences showing a human demonstrating the manipulations.
- Monitoring: observed manipulations are evaluated against the learned semantic models.
- Execution: based on the learned semantic models, equivalent manipulations are executed by a robot.
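To make the Monitoring block above more concrete, the sketch below shows one very simplified way such a comparison could work: a manipulation is encoded as a sequence of symbolic spatial relations between object pairs (a stripped-down take on the "semantic Event Chains" referred to further below), and an observed sequence is scored against a learned template. The object names, relations and similarity measure are illustrative assumptions, not IntellAct's actual representation.

```python
# Hypothetical, highly simplified illustration of the Monitoring block: a
# manipulation is a sequence of "columns", each column holding the pairwise
# relations 'T' (touching) / 'N' (not touching) for a fixed list of object pairs.
PAIRS = [("hand", "cup"), ("cup", "table"), ("hand", "table")]

# Learned template for "pick up the cup and put it down again" (illustrative).
LEARNED_MODEL = [          # hand-cup, cup-table, hand-table
    ("N", "T", "N"),       # cup rests on the table
    ("T", "T", "N"),       # hand grasps the cup
    ("T", "N", "N"),       # cup is lifted off the table
    ("N", "T", "N"),       # cup is put down and the hand withdrawn
]

def compress(columns):
    """Drop consecutive duplicate columns so that only relation changes remain."""
    kept = []
    for col in columns:
        if not kept or col != kept[-1]:
            kept.append(col)
    return kept

def similarity(template, observed):
    """Fraction of matching relation entries after compression (naive score)."""
    t, o = compress(template), compress(observed)
    matches = sum(a == b for ta, oa in zip(t, o) for a, b in zip(ta, oa))
    total = max(len(t), len(o)) * len(PAIRS)
    return matches / total if total else 0.0

# An observed manipulation with a redundant frame; it matches the learned model
# exactly once consecutive duplicates are removed.
observed = [
    ("N", "T", "N"),
    ("N", "T", "N"),
    ("T", "T", "N"),
    ("T", "N", "N"),
    ("N", "T", "N"),
]

print(f"similarity to learned model: {similarity(LEARNED_MODEL, observed):.2f}")
```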
The analysis of low-level observation data for semantic content (Learning) and the synthesis of concrete behaviour (Execution) constitute the major scientific challenge of IntellAct. Based on the semantic interpretation and description, enhanced with low-level trajectory data for grounding, IntellAct addresses two major application areas: first, the monitoring of human manipulations for correctness (e.g. for training or in high-risk scenarios), and second, the efficient teaching of cognitive robots to perform manipulations in a wide variety of applications.
To achieve these goals, IntellAct brings together recent methods for (1) parsing scenes into spatio-temporal graphs and so-called "semantic Event Chains", (2) probabilistic models of objects and their manipulation, (3) probabilistic rule learning, and (4) dynamic motion primitives for trainable and flexible descriptions of robotic motor behaviour. Its implementation employs a concurrent-engineering approach that includes virtual-reality-enhanced simulation as well as physical robots. The project culminates in the demonstration of a robot understanding, monitoring and reproducing human action.
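As a hint of how the execution side could look, the following is a minimal sketch of a one-dimensional discrete dynamic movement primitive in the spirit of the motion primitives mentioned above: a demonstrated trajectory is encoded as a set of forcing-term weights and can then be replayed or re-targeted to a new goal. Gains, basis functions and the toy demonstration are illustrative assumptions, not IntellAct's actual implementation.

```python
import numpy as np

class DMP1D:
    """One-dimensional discrete dynamic movement primitive (illustrative sketch)."""

    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_s=4.0):
        self.n_basis, self.alpha, self.beta, self.alpha_s = n_basis, alpha, beta, alpha_s
        # Basis-function centres placed along the exponentially decaying phase variable.
        self.centers = np.exp(-alpha_s * np.linspace(0.0, 1.0, n_basis))
        self.widths = n_basis ** 1.5 / self.centers
        self.weights = np.zeros(n_basis)

    def _features(self, s):
        psi = np.exp(-self.widths * (s - self.centers) ** 2)
        return psi * s / psi.sum()

    def fit(self, y, dt):
        """Learn forcing-term weights from a single demonstrated trajectory y."""
        self.y0, self.g, self.tau = float(y[0]), float(y[-1]), len(y) * dt
        yd, ydd = np.gradient(y, dt), np.gradient(np.gradient(y, dt), dt)
        s = np.exp(-self.alpha_s * np.arange(len(y)) * dt / self.tau)
        # Invert the transformation system to get the forcing term the
        # demonstration would require, then solve for the weights.
        f_target = self.tau ** 2 * ydd - self.alpha * (self.beta * (self.g - y) - self.tau * yd)
        X = np.array([self._features(si) for si in s]) * (self.g - self.y0)
        self.weights = np.linalg.lstsq(X, f_target, rcond=None)[0]

    def rollout(self, dt, goal=None):
        """Integrate the learned DMP forward, optionally towards a new goal."""
        g = self.g if goal is None else goal
        y, v, s, traj = self.y0, 0.0, 1.0, []
        for _ in range(int(self.tau / dt)):
            f = self._features(s) @ self.weights * (g - self.y0)
            v += (self.alpha * (self.beta * (g - y) - v) + f) * dt / self.tau
            y += v * dt / self.tau
            s += -self.alpha_s * s * dt / self.tau
            traj.append(y)
        return np.array(traj)

# Encode a toy demonstration (a smooth 0 -> 1 reach), then reproduce it and
# generalise it to a new goal position.
dt = 0.01
demo = np.sin(np.linspace(0.0, np.pi / 2, 200))
dmp = DMP1D()
dmp.fit(demo, dt)
print("end point with original goal:", round(dmp.rollout(dt)[-1], 3))
print("end point with new goal 2.0: ", round(dmp.rollout(dt, goal=2.0)[-1], 3))
```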
Fields of science (EuroSciVoc)
CORDIS classifies projects with EuroSciVoc, a multilingual taxonomy of fields of science, through a semi-automatic process based on NLP techniques. See: https://op.europa.eu/en/web/eu-vocabularies/euroscivoc.
- engineering and technology > mechanical engineering > vehicle engineering > aerospace engineering > astronautical engineering > spacecraft
- engineering and technology > electrical engineering, electronic engineering, information engineering > electronic engineering > robotics > cognitive robots
Programme(s)
Multi-annual funding programmes that define the EU’s priorities for research and innovation.
Topic(s)
Calls for proposals are divided into topics. A topic defines a specific subject or area for which applicants can submit proposals. The description of a topic comprises its specific scope and the expected impact of the funded project.
Call for proposal
Procedure for inviting applicants to submit project proposals, with the aim of receiving EU funding.
FP7-ICT-2009-6
Funding Scheme
Funding scheme (or “Type of Action”) inside a programme with common features. It specifies: the scope of what is funded; the reimbursement rate; specific evaluation criteria to qualify for funding; and the use of simplified forms of costs like lump sums.
Coordinator
5230 Odense M
Denmark