Big data PPP: research addressing main technology challenges of the data economy

Research and innovation actions are expected to address cross-sector and cross-border problems or opportunities of clear industrial significance.

These will include (but are not limited to):

  • Software stacks designed to help programmers and big data practitioners take advantage of novel architectures to optimise Big Data processing tasks;
  • Distributed data and process mining, predictive analytics and visualisation serving industrial decision-support processes;
  • Real-time complex event processing over very large numbers of high-volume streams of possibly noisy or incomplete data (a minimal sketch follows this list).
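
To make the last bullet concrete, here is a minimal sketch of windowed event detection over a noisy, incomplete stream, written in Python. The stream, window size, threshold and "overload" event type are all illustrative assumptions; production complex event processing engines match patterns across many distributed streams and are far more involved.

    import random
    from collections import deque
    from statistics import median

    def detect_events(stream, window=5, threshold=80.0):
        """Emit an 'overload' event when the median of the last
        `window` valid readings exceeds `threshold`. Readings may be
        None (incomplete) or noisy; a sliding-window median gives
        cheap robustness to both."""
        recent = deque(maxlen=window)
        for t, reading in enumerate(stream):
            if reading is None:       # incomplete data: skip the gap
                continue
            recent.append(reading)
            if len(recent) == window and median(recent) > threshold:
                yield {"type": "overload", "time": t, "level": median(recent)}

    # Simulated noisy sensor stream with occasional gaps (None).
    random.seed(0)
    stream = [None if random.random() < 0.1 else 60 + 30 * random.random()
              for _ in range(200)]
    for event in detect_events(stream):
        print(event)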

All human-factors claims (e.g. usability, maintainability) about the software to be developed will need to be tested rigorously, through methodologically sound experiments with clear plans to recruit adequate numbers of experimental subjects of the required type (e.g. professional experts, as opposed to researchers or software developers).

Proposals must demonstrate access to appropriately large, complex and realistic data sets. They are expected to make the best possible use of large volumes of diverse corporate data as well as, where appropriate, open data from the European Union Open Data Portal and/or other European open data sources, including data coming from EU initiatives such as Copernicus and Galileo. Proposals should make appropriate use of, and/or contribute to, data exchange and interoperability standards.
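
As one hedged illustration of drawing on European open data, the sketch below queries a CKAN-style catalogue search API (CKAN portals conventionally expose /api/3/action/package_search) for Copernicus-related datasets. The base URL here is a placeholder, not a real endpoint; consult the documentation of the actual portal before relying on any of this.

    import json
    import urllib.parse
    import urllib.request

    # Placeholder catalogue; substitute the real portal's URL.
    BASE_URL = "https://open-data-portal.example.eu/api/3/action/package_search"

    def search_datasets(query, rows=5):
        """Return the titles of catalogue datasets matching `query`."""
        url = BASE_URL + "?" + urllib.parse.urlencode({"q": query, "rows": rows})
        with urllib.request.urlopen(url, timeout=10) as resp:
            payload = json.load(resp)
        # CKAN wraps results as {"success": ..., "result": {"results": [...]}}.
        return [ds["title"] for ds in payload["result"]["results"]]

    for title in search_datasets("Copernicus Sentinel"):
        print(title)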

The Commission considers that proposals requesting a contribution from the EU of between EUR 2 million and EUR 5 million would allow this specific challenge to be addressed appropriately. Nonetheless, this does not preclude submission and selection of proposals requesting other amounts.

Significant opportunities for value generation from (Big) Data assets are lost because the available software and IT architecture solutions are not adapted to processing, analysing and visualising data whose volume, velocity and variety are increasing rapidly. The challenge is to fundamentally improve the technology, methods, standards and processes, building on a solid scientific basis and responding to real needs.

Expected impacts include:

  • Powerful (Big) Data processing tools and methods that demonstrate their applicability in real-world settings, including the data experimentation/integration (ICT-14) and Large Scale Pilot (ICT-15) projects;
  • Demonstrated, significant increases in the speed of data throughput and access, as measured against relevant, industry-validated benchmarks (a micro-benchmark sketch follows this list);
  • Substantial increase in the definition and uptake of standards fostering data sharing, exchange and interoperability.
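
On measuring throughput: credible evaluations should use recognised suites (for example TPCx-BB) rather than ad-hoc timings, but purely to illustrate the shape of such a measurement, a minimal Python micro-benchmark might look like the following; the process() step is a hypothetical stand-in for real work.

    import time

    def process(record):
        """Stand-in for a real processing step (parsing, filtering...)."""
        return sum(record) / len(record)

    def measure_throughput(records, repeat=3):
        """Return the best observed records/second over `repeat` runs."""
        best = 0.0
        for _ in range(repeat):
            start = time.perf_counter()
            for record in records:
                process(record)
            elapsed = time.perf_counter() - start
            best = max(best, len(records) / elapsed)
        return best

    records = [(i, i + 1.0, i + 2.0) for i in range(1_000_000)]
    print(f"{measure_throughput(records):,.0f} records/s")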