Project description
Collaborative machines recognising human actions
Today’s automated machines need the ability to recognise human actions, especially in human-robot interaction. The video and deep learning-based methods used today are not precise enough, but they can benefit from the use of 3D point clouds. For this reason, the EU-funded 3DInAction project will develop a new methodology and design a class of algorithms based on the global and local statistical properties of 3D point clouds. A 3D convolutional neural network will be used to create a multi-modal representation of human action, enabling 3D human action recognition and learning from real-world data.
Objective
Human action recognition and forecasting is an integral part of autonomous robotic systems that require human-robot interaction, as well as of other engineering problems. Action recognition is typically achieved using video data and deep learning methods. However, other tasks, e.g. classification, have shown that it is often beneficial to additionally use 3D data, namely 3D point clouds sampled on the surfaces of objects and agents in the scene. Unfortunately, existing human action recognition methods are somewhat limited, motivating the following research. In this action, we describe a new class of algorithms for 3D human action recognition and forecasting using a deep learning-based approach. Our approach is novel in that it extends a recent body of work on action recognition from the 2D to the 3D domain, which is particularly challenging due to the unstructured, unordered and permutation-invariant nature of 3D point clouds. Our algorithms use the global and local statistical properties of 3D point clouds along with a 3D convolutional neural network to devise a novel multi-modal representation of human action. This representation is inherently robust to spatial changes in the 3D domain, unlike previous works that rely on 2D projections. In practice, deep learning methods allow us to learn an inference model from real-world examples. A common methodology for action recognition includes creating an annotated dataset, training an inference model and testing its generalization. Our research objectives cover all of these tasks and suggest novel methods to tackle them. Overall, the proposed research offers a new point of view on these long-standing problems and, together with the vast body of related work in other domains, may bridge the gap towards generalizable, effective and efficient 3D human action recognition and forecasting machinery. The resulting algorithms may be used in several scientific and engineering domains, human-robot interaction among other applications.
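To make the described approach concrete, the sketch below shows one plausible way a sequence of 3D point clouds could be classified into actions: a shared per-point network produces a permutation-invariant descriptor for each frame, and a temporal convolution aggregates the frame descriptors into an action prediction. This is a minimal illustration only; the architecture, layer sizes and names (PointCloudActionNet, point_mlp, temporal) are assumptions made for the example and do not describe the project's actual design.

```python
# Minimal, hypothetical sketch of a point-cloud action classifier (PyTorch).
# Not the 3DInAction architecture; all module names and sizes are assumptions.
import torch
import torch.nn as nn


class PointCloudActionNet(nn.Module):
    def __init__(self, num_classes: int, feat_dim: int = 128):
        super().__init__()
        # Shared per-point MLP, applied to every frame independently.
        self.point_mlp = nn.Sequential(
            nn.Conv1d(3, 64, kernel_size=1), nn.ReLU(),
            nn.Conv1d(64, feat_dim, kernel_size=1), nn.ReLU(),
        )
        # Temporal convolution over the sequence of frame descriptors.
        self.temporal = nn.Sequential(
            nn.Conv1d(feat_dim, feat_dim, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, clouds: torch.Tensor) -> torch.Tensor:
        # clouds: (batch, frames, points, 3) -- an unordered point set per frame.
        b, t, n, _ = clouds.shape
        x = clouds.reshape(b * t, n, 3).transpose(1, 2)  # (b*t, 3, n)
        x = self.point_mlp(x)                            # (b*t, feat, n)
        x = x.max(dim=2).values                          # symmetric pooling -> permutation invariance
        x = x.reshape(b, t, -1).transpose(1, 2)          # (b, feat, t)
        x = self.temporal(x).squeeze(-1)                 # (b, feat)
        return self.classifier(x)                        # per-action logits


# Example: 4 sequences, 16 frames each, 1024 points per frame.
model = PointCloudActionNet(num_classes=10)
logits = model(torch.rand(4, 16, 1024, 3))
print(logits.shape)  # torch.Size([4, 10])
```

The symmetric max-pooling step is what makes each frame descriptor independent of point ordering, which is why it is applied before any temporal processing in this sketch.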
Fields of science (EuroSciVoc)
CORDIS classifies projects with EuroSciVoc, a multilingual taxonomy of fields of science, through a semi-automatic process based on NLP techniques. See: The European Science Vocabulary.
- natural sciences > computer and information sciences > software > software applications > simulation software
Keywords
Project’s keywords as indicated by the project coordinator. Not to be confused with the EuroSciVoc taxonomy (Fields of science)
Programme(s)
Multi-annual funding programmes that define the EU’s priorities for research and innovation.
- H2020-EU.1.3. - EXCELLENT SCIENCE - Marie Skłodowska-Curie Actions (MAIN PROGRAMME)
- H2020-EU.1.3.2. - Nurturing excellence by means of cross-border and cross-sector mobility
Topic(s)
Calls for proposals are divided into topics. A topic defines a specific subject or area for which applicants can submit proposals. The description of a topic comprises its specific scope and the expected impact of the funded project.
Funding Scheme
Funding scheme (or “Type of Action”) inside a programme with common features. It specifies: the scope of what is funded; the reimbursement rate; specific evaluation criteria to qualify for funding; and the use of simplified forms of costs like lump sums.
MSCA-IF - Marie Skłodowska-Curie Individual Fellowships (IF)
Call for proposal
Procedure for inviting applicants to submit project proposals, with the aim of receiving EU funding.
H2020-MSCA-IF-2019
Coordinator
32000 Haifa
Israel