Community Research and Development Information Service - CORDIS

Investigating research - how the Framework Programme works - part 2

Once the policy, the scope and the specific areas for research have been defined, the Commission's role continues. It must timetable calls and start to assess incoming submissions.

'Work programmes are updated on an annual basis. We start with detailed descriptions in some areas, but not necessarily in all others - calls are staggered. Programmes have clear objectives to deliver. We are obliged to cover all the research objectives. But we have a limited annual budget, so some areas have to be left for future years,' says Graham Stroud, Head of Unit for the Support for the Implementation of the Research Programmes.

All eligible proposals must be evaluated, which, for practical reasons, influences when calls can be issued. 'We have a building dedicated just to the evaluation of proposals, as we receive over 16,000 per year on average. Each proposal is evaluated by a minimum of three outside experts,' he says.

Large calls may need 600 or more experts evaluating proposals. 'Even the electronic submission system has a physical component, and we cannot risk the system breaking down by closing too many calls at the same time. Of course, people always submit at the last moment, even though the system allows repeated submissions of new versions of the same proposal,' he says. Mr Stroud believes that electronic submissions have boosted efficiency significantly, and are popular, rising from almost none three and a half years ago to just short of 100 per cent today. Only some older calls still allow paper submissions, but the Seventh Framework Programme (FP7) will be e-only.

'Electronic submissions improve matters in two ways. First, we can enforce quality control: submissions must have financial tables that add up, and certain fields must be filled in - though human error still generates mistakes. Secondly, it is not reliant on random factors, such as post or delivery services.' Before electronic submissions, a person would stamp each individual application with the arrival time, given by an atomic clock, so there could be no disputes over whether submissions did indeed arrive on time.

When proposals have arrived, they are screened for eligibility. 'This is simple - they must contain parts A and B (A for administration, B for scientific content), arrive on time and have at least the minimum number of eligible participants,' he says.
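The screening described above amounts to three simple checks. A minimal sketch, assuming illustrative field names and a caller-supplied participant minimum (neither is an official CORDIS structure):

```python
# Hypothetical sketch of the eligibility screening described in the article.
# Field names and data layout are illustrative assumptions, not official.

def is_eligible(proposal: dict, min_participants: int) -> bool:
    """Screen a proposal on the three stated criteria: parts A and B
    present, arrival on time, and enough eligible participants."""
    has_parts = (proposal.get("part_a") is not None
                 and proposal.get("part_b") is not None)
    on_time = proposal["arrived"] <= proposal["deadline"]
    enough = len(proposal.get("eligible_participants", [])) >= min_participants
    return has_parts and on_time and enough
```

A proposal failing any one check is ineligible and is not evaluated further.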

Only around 20 per cent of eligible applications will go on to receive funding. Three or more experts from the Commission's database of experts are chosen to evaluate each proposal. The Commission is continuously trying to improve and expand its database of experts. People can propose themselves to be on the list, and there are calls to attract experts. Research organisations are encouraged to put names forward. The database currently holds some 50,000 names, and is available to all Member States.

'Evaluating proposals is a serious process,' says Mr Stroud. 'We have to choose the best people. There are some targets for experts - for example 40 per cent women, but this is difficult to achieve in a few key areas, such as nuclear research or engineering. The problem is that good women researchers are already over-booked as most funding agencies have targets for choosing women experts.'

Experts, too, must meet minimum requirements: expertise, experience, no conflict of interest, and the right background. 'Some programmes use IT tools to show up obvious conflicts of interest,' explains Mr Stroud. 'But this is a complicated area, and requires much checking on the part of the Commission and honesty on the part of the experts. Experts must sign a conflict of interest declaration and this is part of their contract. They are not allowed to work without this.'

Each proposal is considered on its merits, and awarded marks out of five for a series of criteria. To go forward, proposals must meet a threshold mark for each criterion, usually a minimum of 3 out of 5, except for technical quality, which rises to 4 out of 5.

Proposals are marked for:

- relevance to work programme;
- quality of science - the most important single criterion;
- likely impact of project;
- management of project - especially important due to the EU component - projects have on average 10 partners in seven states; this requires active management;
- resources - are they adequate/reasonable? This may also influence funding.
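The threshold rule above can be sketched in a few lines. This is our own illustration, with shorthand criterion keys standing in for the list above; the 3-out-of-5 minimum and the stricter 4-out-of-5 for scientific quality come from the article:

```python
# Illustrative sketch of the marking thresholds: each criterion is
# scored out of 5, with a minimum of 3 to pass, except scientific
# quality, which needs 4. Criterion keys are shorthand, not official.

THRESHOLDS = {
    "relevance": 3,
    "science_quality": 4,  # the stricter 4-out-of-5 threshold
    "impact": 3,
    "management": 3,
    "resources": 3,
}

def passes_thresholds(marks: dict) -> bool:
    """True only if every criterion meets its minimum mark."""
    return all(marks[criterion] >= minimum
               for criterion, minimum in THRESHOLDS.items())
```

A single below-threshold mark on any criterion is enough to stop a proposal going forward, however strong its other scores.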

The Commission's role is one of informed neutrality, mediating and moderating discussions, but not expressing views. 'We try to help the experts come to a consensus,' says Mr Stroud. 'In most cases, they are able to reach one. If comments and marks are similar, then no further discussion is needed. However, if the experts cannot reach a consensus then extra experts are brought in. If there is still no common view, then minority views are recorded,' he says.
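The escalation Mr Stroud describes can be read as a small decision procedure. A rough sketch of our own interpretation, where the tolerance for 'similar' marks is an assumed illustrative value:

```python
# Sketch (our interpretation, not an official rule) of the consensus
# escalation: similar marks mean consensus; divergent marks mean extra
# experts are brought in; a persistent split means minority views are
# recorded. The spread tolerance of 1 mark is an assumption.

def consensus_step(marks: list, tolerance: int = 1) -> str:
    """Return the next action for one set of expert marks."""
    if max(marks) - min(marks) <= tolerance:
        return "consensus"            # marks similar: no further discussion
    if len(marks) <= 3:
        return "add_experts"          # divergent views: bring in extra experts
    return "record_minority_views"    # still no common view after reinforcement
```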

Next, panels of 10 or more evaluators perform quality control and rank the proposals in order. The panel must also separate proposals with the same marks and put them in order of preference. The panels look critically at the work of the consensus groups. For example, does a particular group give low marks because their marking style is harsh or because the proposals they looked at are of a lower quality?

Applicants for large proposals, such as Integrated Projects or Networks of Excellence, are often invited to a 'hearing'. Questions are sent to the applicants, and written replies can be sent, but most people opt to come and answer them in person. Applicants coming through the hearings are often impressed by the process. 'While unsuccessful applicants quite naturally often complain about the results of the evaluation, those who come to the hearings almost never complain, even if the proposal is finally not accepted,' says Mr Stroud.

'At the end, we have an evaluation report, made by the experts for the Commission, with a list of proposals in the priority order suggested by the experts,' says Mr Stroud. 'An Evaluation Summary Report is prepared for each proposal. These contain technical comments, criticisms and suggestions for improving the project; they are sent back to the proposers before any decisions are made, regardless of whether the proposal wins funding, and constitute neither an offer nor a rejection.'

The Evaluation Summary Report for each proposal, including scores and comments, is handed over to the Commission in ranked order. It is the Commission's job to take the evaluators' recommendations and turn them into a final list of approved proposals. However, it is very rare for the Commission to go against the advice of the experts - it follows this advice in 'probably 99 per cent of cases'. The Commission is likely to depart from the experts' list only when there are clusters of highly rated proposals covering the same area, or where similar projects are already receiving funding.

Naturally, the evaluations are also evaluated, and expert evaluators quizzed for their input. 'We appoint independent observers to be present at the evaluation sessions and report back to us on what they saw. These are not necessarily people with expertise in the particular call being evaluated - we want their views on the process, not on the proposals,' says Mr Stroud.

Some critics have said that the evaluation procedure has too many academic evaluators, and not enough industrialists, who would have a more market-oriented view. 'Industrialists should propose themselves or other industrialists as evaluation experts. We are always short of industrial experts (and women),' counters Mr Stroud.

The evaluations are presented to the Programme Committee, along with observers' reports and statistical breakdowns. Two lists are presented: approved projects in ranked order; and projects rejected because they were either ineligible or below the minimum thresholds. 'If the Commission does go against the advice of the evaluators, the Commission's reasons come under very close scrutiny from the Programme Committee,' says Mr Stroud.

The Commission will then open negotiations with the best-fit projects, 'and use the comments of the evaluators to try to improve the projects during negotiation,' according to Mr Stroud. When the Commission is ready to sign, the results of these negotiations are presented to the Programme Committee, which gives a formal opinion.

'It is a fallacy that horse-trading happens over project selection, because neither the committee nor (by and large) the Commission is in a position to challenge the views of experts over a proposal. Also, while each Member State may have its own interests, each project involves a number of Member States and Associated States,' he says.

Once the contracts are signed, projects usually report back annually with a progress report and cost statements. 'At the end of the project, final payments are held back until the final report is approved. This has to be given, with publishable abstracts, even if the results are commercially sensitive. We do not insist on the publication of confidential results, but insist on a non-commercially sensitive abstract to publish,' says the Head of Unit.

Mr Stroud is adamant that research is there for the good of all, and findings should be used. 'We want exploitation of results,' he says. 'Proposers are required to exploit the results themselves or to disseminate them so that others can.'

Source: CORDIS News interview with Graham Stroud

Record Number: 26145 / Last updated on: 2006-08-04
Category: Interviews
Provider: EC