
A research agenda on implementation research in chronic care

Final Report Summary - IMPLEMENT (A research agenda on implementation research in chronic care.)

Executive Summary:
Why the research is needed
We know more about what is effective for improving chronic care than about how to implement these effective practices and care models. We are also beginning to use methods that combine implementation with research, so that we can improve care more quickly while learning what works best, for whom, and where. The Research Agenda presents a synthesis of what a diverse group of experts across the 28 European countries considered the research most needed to speed and spread improvements to chronic care.
It will contribute to the future EU research agenda on this subject, which also seeks to finance relevant, rigorous and timely research and development. It is also of immediate use to improvers and researchers across Europe for its insights into, and discussions about, implementation, chronic care, and improvement science and practice.

Methods
A systematic approach was used to gather and prioritise experts’ views across Europe. The method a) drew on experts from a range of stakeholder groups, b) used a systematic, iterative approach to assess priorities, and c) allowed experts to revise their priorities after reflection, online debate with each other, and feedback. A summary of the method is given in an appendix to the Research Agenda (Øvretveit J, Bongers I, Bloemendaal M, Bloemendaal R, Nauta C (2015), “A Research Agenda for Implementation Research into Improving Care for People with Chronic Illnesses”, LIME/MMC Karolinska Institutet, Stockholm; summarised in Øvretveit J, Huijsman R, Bongers I, Bloemendaal M (2015), “Better implementation of improvements in chronic care: European experts’ views on future research and development”, from the project web site: www.eu-implement.eu/downloadthebook).

Main findings
Eighteen priority implementation subjects emerged from an analysis of an initial round of interviews with 25 experts. These are listed below and fully elaborated in the main report. This list was only a starting point for developing the research agenda through the iterative method described above, which enabled experts to agree on the top priorities for research into the implementation of chronic care improvements:
Rated 1st Adoption or “take-up”
Rated 2nd Measuring and evaluating implementation effectiveness
Rated 3rd Patient empowerment for implementation

Those rated as lowest priority of the 18 were:
Rated 16th Simulating implementation
Rated 17th Laws, regulations and standards
Rated 18th Scientific research methods for knowledge about implementation

Further analysis of experts’ views
The Research Agenda gives:
- The descriptions of each of the 18 subjects which experts scored and prioritised for implementation research, with examples;
- The distribution of the 389 experts by country (the top 3 were the Netherlands, Spain and Romania), by EU region (West, South and East), and by the 8 stakeholder categories (most experts were from research and from healthcare practice groups);
- The prioritisation of the research subjects by the experts;
- Details of the degree of agreement between experts and other details;
- Discussion and observations by the research team.

Project Context and Objectives:
In all European countries, the incidence and prevalence of heart failure, diabetes, asthma and other chronic health conditions (which can also include cancers) are increasing. These increasingly common conditions experienced by European citizens may not be curable and may continue for some time, often for the rest of a person’s life. In addition to the human suffering caused, this places an increasing burden on families, health care services, businesses and tax-payers.

Many older people experience two or more long-term illnesses, which makes care and support more complex (DHSS 2010). Clinical coordination is often unsatisfactory or even harmful, and is becoming more difficult. In part this is because a greater variety of services is being offered to and used by people with long-term conditions (PwLTCs), with more of them outside the health sector. In part it is because of constraints on data exchange, which hinder the better access to needed information that more electronic health records (EHRs) and greater interconnectivity could provide.

Research has discovered interventions that are effective, and some that are cost-effective, for prevention, diagnosis, treatment, care coordination and self-care – termed in this report “chronic care improvements”. By “chronic care improvements” we mean changes to support self-care, and changes that improve the care provided to patients by practitioners, service organizations and close caregivers. These improvements apply to chronic care in all fields of health care, including home care, general care, specialized and hospital care, and psychiatric care.
The “take-up” or “adoption” of these improvements by care providers and others has been slow and variable, as has the take-up of support for self-care by people with chronic illnesses. Some research exists into effective implementation strategies and methods, but it is limited and often not known or used, even by those actively implementing improvements. The focus of the study reported here was on the implementation of these improvements.

The purpose of the “IMPLEMENT” project was to discover which research is most needed in the future to speed and spread the “take up” of these improvements, and to establish a European network to carry out and apply the research.

The Research Agenda (Øvretveit et al 2015) gives the findings and analyses of the first part of the project, which was to seek experts’ views about the gaps in knowledge about implementation that are hindering the take-up of these improvements. A second part of the project was to establish, in each EU country, a network to implement chronic care improvements, as well as a cross-European network to share experience (Øvretveit J, Bongers I, Bloemendaal M, Bloemendaal R, Nauta C (2015), “A Research Agenda for Implementation Research into Improving Care for People with Chronic Illnesses”, LIME/MMC Karolinska Institutet, Stockholm; summarised in Øvretveit J, Huijsman R, Bongers I, Bloemendaal M (2015), “Better implementation of improvements in chronic care: European experts’ views on future research and development”, from the project web site: www.eu-implement.eu/downloadthebook).


Project Results:
a. Assessment and Research Agenda
1. Introduction
This is the report on the assessment of the Research Themes carried out for the IMPLEMENT project in order to produce a Research Agenda for the EU.
The aim of the assessment was first to gather, and then to prioritize, a range of experts’ opinions from across the EU concerning themes for research into the implementation of innovations in chronic care, so as to present a Research Agenda to the EU.
The assessment addressed an international Triple Helix expert population, expanded with core users of the innovations (both patients and professionals). Qualitative interviews were used to gather the experts’ opinions, and the ExpertLens method to prioritize them.

2. Methodology
2.1 Grounded Theory

IMPLEMENT has assessed the knowledge concerning the implementation of innovations in chronic care using an approach that underlies its whole methodology: Grounded Theory.

Grounded Theory (Glaser and Strauss, 1967) is an overall methodology that makes use of ‘sensitizing concepts’. These concepts are based on existing knowledge and publications, but carry no pre-defined certainties. They are filled in, and thereby tested, by accumulating ‘living knowledge’ from experts in the relevant fields through interviews. The accumulation of knowledge is therefore highly qualitative.

Grounded Theory distinguishes four phases:
1. Exploration, in which the sensitizing concepts are defined and filled by data and the outcome of qualitative interviews.
2. Specification, in which the concepts are refined or replaced by better fitting concepts.
3. Reduction, in which the concepts are being distilled to their essentials.
4. Integration, in which the core concepts are the basis for growth and deepening of the field of knowledge, using interviews.

Grounded Theory is well suited to IMPLEMENT. It is a way to accumulate knowledge without making rigid assumptions in advance that must afterwards be subjected to falsification. Instead, it is a model in which tacit knowledge grows and, through testing, becomes more explicit knowledge.

2.2 Expert population: Triple Helix Plus

The field of Implementation Science is still in its infancy. This means that much knowledge and expertise is tacit and has not yet been translated into scientific studies and papers. Publications lag one to two years behind the actual state of knowledge. IMPLEMENT therefore decided to focus on the experts, rather than on the literature.

In addition, the expertise regarding implementation in chronic care is not limited to the scientific world. On the contrary: much knowledge can be gathered from experts who work as policy makers, business people or professionals in healthcare. Nor should we forget the patients and their informal caregivers (often relatives), whose knowledge is indispensable in making innovations work in daily practice. These non-academic experts have both manifest and tacit knowledge. While the manifest knowledge may to some degree have found its way into scientific publications, the tacit knowledge can only be uncovered by speaking with its holder.

To reach all these experts and thereby meet its goals, IMPLEMENT ensures a connection to this diverse population drawn from the following categories:

1 | Triple Helix
The Triple Helix thesis is that the potential for innovation and economic development in a Knowledge Society lies in a more prominent role for the hybridization of elements from university, industry and government, to generate new institutional and social formats for the production, transfer and application of knowledge (Etzkowitz & Leydesdorff, 1997). Following this reasoning, it can be concluded that the experts are not found only in academic circles. It is inherent to implementation in healthcare that government(s), healthcare organisations and industry are very important players as well. Moreover, especially in healthcare, the boundaries between science, industry and professional knowledge are not clear-cut: research can be found in all three fields. Universities focus most on fundamental research; industry and professionals more on applied research. IMPLEMENT therefore adheres to the Triple Helix approach. This leads not only to the desired amount of relevant knowledge, but also to a fertile exchange of different viewpoints. Finally, this approach maximizes the usefulness of the results in practice.

2 | ‘Plus’: Core users
In addition to the input of these experts right from the start of the project, the other world of expertise should be explored: the patient and the professional at the bedside; in other words, the ‘end users’ of the innovation. Healthcare organizations also fall into this category.

Building the bridge between the two worlds of experts (‘Triple Helix’) and core users (‘Plus’) ensures that practical (tacit) knowledge is incorporated in the proposal. Doing so defines the difference between failure and success in implementation terms.

Beyond the Triple Helix Plus categories, we see that in practice people can have a dual role: for example, healthcare professionals who conduct research alongside their daily job, or who carry out managerial tasks for healthcare institutes.

We have therefore split the Triple Helix Plus categories into 8 categories:
1. Patient (Practice)
2. Professional (Practice)
3. Healthcare institute (Practice)
4. Government (Policy)
5. Industry (Business)
6. Research institute (Research)
7. Professional and research institute
8. Professional and healthcare institute (Practice)

Figure 1 below depicts the Triple Helix Plus expert population.


Figure 1 | The Triple Helix Plus as applied in this assessment.


2.3 Gather Research Themes: qualitative interviews

An Expert Panel of 25 prominent experts in the field of implementation of chronic care interventions was formed. Please see appendix A for a list of the Expert Panel members. Through qualitative interviews, we discovered what the Expert Panel members considered to be the research most needed to speed and spread implementation of chronic care improvements.

The members of the Expert Panel were approached through purposive sampling (Patton, 1990, 2002) and snowballing (Goodman, 1961), in which expertise in the Triple Helix Plus categories took priority over country. This is because the envisioned high level of expertise of the Expert Panel members was considered largely independent of country of origin, with regard to the basic themes the IMPLEMENT project was looking for. Because the focus of this assessment is on identifying Research Themes, the expertise of the researchers and the professionals was deemed most valuable.

After each interview, an important aspect of the method was to assess the degree of saturation of the themes, i.e. whether new themes were still being brought up. Saturation was reached for all categories in the following way:
- First, ‘research’ and ‘professional’ experts were added to the Expert Panel and interviewed until full saturation of the themes from their angle of expertise was reached.
- The ‘patient’, ‘industry’ and ‘government’ experts were interviewed at a later stage, to assess whether they would bring up themes not yet identified by the other experts. This was not the case, which means that saturation was reached here as well.

In total, 74 themes have been gathered during the interviews.
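The saturation check described above can be sketched in a few lines. This is an illustrative sketch only: the interview data and theme names below are hypothetical, not taken from the project.

```python
# Sketch of a theme-saturation check: accumulate the themes mentioned in
# successive interviews and note the first interview that adds nothing new.

def themes_until_saturation(interviews):
    """interviews: a list of theme sets, one per interview (in order).
    Returns the accumulated theme set and the 1-based index of the first
    interview that contributed no new themes (None if never saturated)."""
    seen = set()
    saturated_at = None
    for i, themes in enumerate(interviews, start=1):
        new = set(themes) - seen
        seen |= new
        if not new and saturated_at is None:
            saturated_at = i
    return seen, saturated_at

# Hypothetical interviews in one expert category.
interviews = [
    {"adoption", "reimbursement"},
    {"adoption", "patient empowerment"},
    {"reimbursement", "patient empowerment"},  # nothing new: saturation
]
themes, saturated_at = themes_until_saturation(interviews)
```

In this toy example saturation is observed at the third interview, after which interviewing in that category can stop.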

Procedure

The following procedure has been applied:

1 | Open interviews
An open interview-approach was used. Expert Panel members were not asked to focus on research about implementation of a particular chronic care intervention, but on needed implementation research for a range of chronic care interventions.
The experts were asked to answer the main research question: ‘In your opinion, on which themes is there the greatest need to carry out new research, in order to speed up and improve the implementation of innovations in chronic care?’ Sub-questions that followed from this were: which Research Themes are addressed by the experts; what is the central question of each theme; what is the explanation; and which research questions support each Research Theme?
All interviews have been recorded.

2 | Transcription
The interviews were fully transcribed.

3 | Summarizing
The researcher (checked by another researcher) made a summary based on the transcription.

4 | Summary validation
Panel members were asked to check their input by commenting on the researchers’ summary of it (member check).

5 | Open and axial coding
The subjects and/or difficulties experts experienced with regard to implementation were coded into ‘pre-Research Themes’. In total, 74 of these themes were gathered from the interviews with the Expert Panel.

6 | Selective coding
These 74 matching and overlapping pre-Research Themes were analysed, re-ordered and eventually clustered into 18 ‘meta’ Research Themes, referred to from here on as the Research Themes.


2.4 Prioritize Research Themes: ExpertLens method

In total, 496 experts throughout the EU were invited to fill in the questionnaire. They were found through the snowballing method, starting from references given by the Expert Panel members. A research population of 389 experts gave written input to IMPLEMENT via the internet.

Collecting and processing the data was done with the ExpertLens method. ExpertLens (Dalal, Khodyakov, Srinivasan, Straus, Adams, 2011) is a recently developed and validated online elicitation method that aims to close the gap between the practical need for efficiency in time and finances and the methodological challenges of eliciting the opinions of large, diverse and distributed groups. As such, it is an efficient alternative to methods such as the Nominal Group Technique (NGT), Delphi and crowdsourcing.

To score the priority of the Research Themes, a 5-point scale was chosen (extremely high priority, high priority, low priority, extremely low priority and ‘no opinion’) (Krosnick, 1999). A 5-point scale ‘forces’ respondents to indicate the priority of research from their perspective. Alternatives were evaluated and found less suitable: a 7-point scale risks not focusing enough (Matell and Jacoby, 1971), while a 3-point scale (high priority, ‘no opinion’, low priority) does not give a sufficient rank order on priority (more/less urgent) – it only indicates a dichotomous ‘yes’ or ‘no’ (urgent/not urgent) (Cox, 1980).

Providing a ‘no opinion’ answer limits bias in the results. As the group of ExpertLens participants is large, it was considered likely that several respondents would be unfamiliar with a particular Research Theme. Rather than having them ‘guess’ the priority of such themes, an ‘escape’ was provided, which increases the reliability of the results.

The ExpertLens questionnaire has been tested and evaluated amongst a small group of persons in order to get the questionnaire as clear as possible.

The ExpertLens method consists of three rounds:
• Round 1: preliminary ranking of the Research Themes by the experts
• Round 2: asynchronous and anonymous online discussion
• Round 3: final ranking by the experts

Please see appendix B-F for respectively the question tree (B), the invitation to the survey (C), the starting page of the online survey (D), an example of a Research Theme in the online survey (E) and a list of the participants to the ExpertLens survey (F). To get an impression of the online discussion in the blog round, see an extract of this discussion in appendix I.

2.5 Analyse the ExpertLens data

The online ExpertLens survey was filled in by 389 participants; see appendix G for the response rates to the three rounds and the analysis.
Analysis on the ExpertLens data was executed according to the following methods:

A | Descriptive statistics were used to describe the background composition of the expert group according to:
• Expertise: Triple Helix Plus perspective, clustered to Scientists and Non-scientists
• EU Country, clustered to European Areas
See appendix H for the number of participants per country.

B | The Research Themes were ranked on priority (mean) and the degree of consensus among the participants (variance). Where the means of Research Themes were equal, variance was used to refine the prioritization.

The ranking on priority uses the following rating for each Research Theme: a five-point scale with the options ‘Extremely high priority’ (score +2), ‘High priority’ (score +1), ‘No opinion’ (score 0), ‘Low priority’ (score -1) and ‘Extremely low priority’ (score -2); ‘No answer’ responses were excluded.
Prioritization over the whole population was established by summing the scores +2, +1, 0, -1, -2 and dividing by the number of respondents, giving each Research Theme a mean score.
See appendix J for a table with the mean and variance of the ranked Research Themes.
See appendix K for an overview of the scores per Research Theme.

C | Breakdowns of the results by European area and by the Triple Helix Plus perspective of the expert were used to deepen the prioritization of the Research Themes.

D | The normality of the data was tested with the Kolmogorov-Smirnov test. Results indicate that the dependent variables are non-normally distributed. We checked whether the results found were significant, both for the total group and with regard to the background variables (European Area, Triple Helix Plus category, Scientist/Non-scientist). None of these variables was found to be significant. This follows from the Kruskal-Wallis H test, which can be applied to non-normally distributed data (see appendix M).
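The two tests named in D can be illustrated with a short sketch. The data below are randomly generated stand-ins for priority scores (the project’s actual data are in appendix M), and availability of scipy is assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical priority scores (-2..+2 scale) for one Research Theme,
# split over the three European areas.
west = rng.integers(-2, 3, size=40)
south = rng.integers(-2, 3, size=35)
east = rng.integers(-2, 3, size=30)
scores = np.concatenate([west, south, east])

# Kolmogorov-Smirnov test against a normal distribution fitted with the
# sample's own mean and standard deviation; a small p-value would
# indicate non-normally distributed data.
ks_stat, ks_p = stats.kstest(scores, "norm", args=(scores.mean(), scores.std()))

# Kruskal-Wallis H test: a non-parametric comparison of the three area
# groups, applicable to non-normally distributed data; a large p-value
# would indicate no significant difference between the groups.
h_stat, kw_p = stats.kruskal(west, south, east)
```

The same two calls, applied per background variable, reproduce the analysis pattern described in D.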

3 Results
3.1 Research Themes
The 18 Research Themes of the IMPLEMENT survey originated from the qualitative interviews with the 25 Expert Panel members. These Research Themes were prioritized by 389 experts, surveyed through the ExpertLens method.

It was concluded that the extracted Research Themes cover different fields in chronic care. They could be clustered into the following groups (see also figure 2):
• General Implementation Issues
• Patients and participation issues
• Context issues
• Research issues.


Figure 2 | Research Themes clustered into main research fields

3.2 Prioritizing the Research Themes

3.2.1 Response rates of the ExpertLens Survey
The response rate of the IMPLEMENT ExpertLens survey on the Research Agenda was 78.4%. Given the length of the survey (due to the necessary clarification of all themes) and the limited spare time available to the participants, this can be called extremely high. It also shows the strong involvement of the experts in this field.

In total, 496 experts were invited to fill in the questionnaire. Of these 496 invitees, 389 respondents (ExpertLens participants; 78.4%) filled in one or more questions in the IMPLEMENT questionnaire. Of these 389 respondents:
• 226 (58%) used the blog round (online discussion forum) to discuss with other experts the priority given to the 18 Research Themes extracted in the first round
• 101 (26%) changed their opinion after the discussion.

The gathering of the respondents took place during two data collection periods.

For the analysis of the results of the survey, 389 respondents can be included (see also figure 3):
• 163 who participated in round 1 only (including the ones who partially filled in the survey)
• 30 who participated in round 1 and 2
• 196 who participated in all three rounds.

Respondents who partially filled in the survey were also included, so as not to lose their opinions on individual themes. However, only participants who filled in the survey completely in round 1 were invited to the blog round (round 2).

For an extensive explanation on the response, please consult appendix G.
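The percentages reported in this subsection follow directly from the counts; as a quick arithmetic check:

```python
# Reproducing the response figures reported in this section.
invited, responded = 496, 389
blog_round, changed_opinion = 226, 101

response_rate = round(100 * responded / invited, 1)       # 78.4%
blog_share = round(100 * blog_round / responded)          # 58%
changed_share = round(100 * changed_opinion / responded)  # 26%

# The three analysis groups (round 1 only; rounds 1-2; all rounds)
# sum to the full respondent count.
assert 163 + 30 + 196 == responded
```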


Figure 3 | Response available for analysis

3.2.2 Background characteristics of the ExpertLens participants

The research population consists of 389 respondents (ExpertLens participants, see appendix F) who were found through the snowballing method. As the snowballing started from research, the research perspective is best represented. This also implies that not all perspectives were filled equally well, which is especially the case for the Patient, Healthcare, Government and Industry perspectives. For these perspectives, the results of this assessment must therefore be considered indicative. The number of participants per country can be found in appendix H.

To obtain a more homogeneous picture of the prioritization by country, as some countries are represented by only one or a few respondents, we clustered all countries into three main European areas. Figures 4 and 5 show the distribution of ExpertLens participants by these European areas.


Figure 4 | Number of respondents by European Area

Figure 5 | Percentage of respondents by European Area
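Clustering respondents’ countries into the three European areas amounts to a simple lookup-and-count. The mapping below is illustrative only (the actual assignment of all EU countries to West, South and East is given in appendix H), and the respondent list is hypothetical.

```python
from collections import Counter

# Illustrative assignment of a few countries to the three European areas.
AREA = {
    "Netherlands": "West", "Germany": "West", "France": "West",
    "Spain": "South", "Italy": "South", "Greece": "South",
    "Romania": "East", "Poland": "East", "Hungary": "East",
}

def respondents_by_area(countries):
    """Count respondents per European area, given one country per respondent."""
    return Counter(AREA[country] for country in countries)

# Hypothetical respondents, one country each.
counts = respondents_by_area(["Netherlands", "Spain", "Romania", "Spain", "Poland"])
```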


Figure 6 shows the distribution of ExpertLens participants by their Triple Helix Plus perspective, as well as by their classification as scientist or non-scientist.


Figure 6 | Number of respondents per Expert Category (Triple Helix Plus and Scientist)

Respondents with a background in the ‘professional and research institute’ Triple Helix Plus category represent a large share of the research population, together with the ‘research institute’ category. The explanation lies in the networks (often (medical) universities) that were approached and in the snowballing method used to gather ExpertLens participants.

The figures hereafter show the distribution of the Triple Helix Plus categories (figure 7) and the attribution of the feature scientist or non-scientist (figure 8) across the respondent group.

Figure 7 | Percentage of respondents per Triple Helix Plus category




Figure 8 | Percentage of respondents Scientist or Non-scientist



Figure 9 | Breakdown of European Area and the Triple Helix Plus perspective of the experts combined


Figure 10 | European Area and the clustered Triple Helix Plus categories combined to Scientist or Non-scientist





Figure 11 | Diagrams of the breakdown per European area and Triple Helix Plus category

3.2.3 Rating of the Research Themes

Results on the priority of the Research Themes are based on the ExpertLens participants’ ranking on priority (mean). We used the mean to prioritize the Research Themes because the number of respondents is not equal for all themes. The variance shows the degree of consensus amongst the participants.
The ranking on priority consists of the following rating:
• 'Extremely high priority'='2'
• 'High priority'='1'
• 'Low priority'='-1'
• 'Extremely low priority'='-2'
• 'No opinion'='0'
• 'No answer': excluded
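The scoring and ranking described above (mean for priority, variance as consensus measure and tie-breaker) can be sketched as follows; the answers below are hypothetical.

```python
from statistics import mean, pvariance

# Score mapping from the rating above; 'No answer' is excluded.
SCORES = {
    "Extremely high priority": 2,
    "High priority": 1,
    "No opinion": 0,
    "Low priority": -1,
    "Extremely low priority": -2,
}

def prioritize(responses):
    """responses: {theme: list of answer strings}.
    Ranks themes by mean score (descending); equal means are refined by
    lower variance, i.e. stronger consensus ranks first."""
    stats = {}
    for theme, answers in responses.items():
        scores = [SCORES[a] for a in answers if a in SCORES]  # drops 'No answer'
        stats[theme] = (mean(scores), pvariance(scores))
    return sorted(stats, key=lambda t: (-stats[t][0], stats[t][1]))

# Hypothetical answers for three themes.
responses = {
    "Adoption or take-up": ["Extremely high priority", "High priority", "No answer"],
    "Reimbursement": ["Extremely high priority", "Extremely low priority"],
    "Pathways": ["High priority", "High priority"],
}
ranked = prioritize(responses)
```

In this example ‘Adoption or take-up’ ranks first (mean 1.5), ‘Pathways’ second (mean 1.0, zero variance) and ‘Reimbursement’ last (mean 0, high variance, i.e. least consensus).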

As reported in the executive summary, the experts rated the top priorities for research into the implementation of chronic care improvements as:
Rated 1st Adoption or “take-up”
Rated 2nd Measuring and evaluating implementation effectiveness
Rated 3rd Patient empowerment for implementation

Those rated as lowest priority of the 18 were:
Rated 16th Simulating implementation
Rated 17th Laws, regulations and standards
Rated 18th Scientific research methods for knowledge about implementation



Figure 12 | Rating of the Research Themes by the ExpertLens participants, ranked from highest to lowest priority (ranking by the mean)

Next to the mean value, the variance value has also been calculated. This value shows the degree of consensus between the respondents on the prioritization of each Research Theme. It has also been used to refine the prioritization in cases of equal means.

The following figure shows the strength in agreement for all Research Themes.


Figure 13 | Strength in agreement for all Research Themes, put in sequence by priority of the Themes.

The strength of agreement for the top 3 prioritized Research Themes (respectively ‘Adoption or “take-up”’, ‘Measuring and evaluating implementation effectiveness’ and ‘Patient empowerment for implementation’) is marked with ! in the graph.
The top 3 Research Themes by consensus are the topics with the lowest variance; in other words, the experts agree most on the priority of these themes (respectively ‘Adoption or “take-up”’, ‘Skills, education and training’ and ‘Pathways’). In the graph they are marked with *.
In the same way it is possible to show the top 3 Research Themes with the least agreement. These topics have the highest variance; in other words, the experts disagree most on the priority of these themes (respectively ‘Reimbursement’, ‘Partnership Research’ and ‘Laws, regulations and standards’). They are also marked with *.

b. IMPLEMENT EU Expert Network
The IMPLEMENT project will furthermore set up a network of relevant stakeholders following our ‘Triple Helix Plus’ categorization (research institutes, patient associations, industry, healthcare institutions, policy makers, professional associations). During the IMPLEMENT project, this network will be involved in validating and adopting the Research Agenda. The network will be set up durably, so that it can execute the Research Agenda after the project and translate the outcomes of the research into application in the chronic care field.

The IMPLEMENT Network consists of institutes and individuals in the Triple Helix Plus domain.



Figure 14 | Triple Helix Plus expert categories


1. Network and assessment

The EU Network and the assessment underpinning the Research Agenda interact in three ways.

a. The largest part of the EU Network consists of the experts who have contributed to the assessment, both as members of the Expert Panel and as members of the two ExpertLens groups. As such, they ‘feed’ the Research Agenda. In this phase of the project, membership is linked to filling in the assessment, both to achieve a good population for the assessment and to include only members who are demonstrably active and willing to contribute to the whole. Also, since members will be able to comment on the draft Research Agenda (see b.), it is only fair that they should have contributed to it.
b. The draft Research Agenda will be sent to all Network members for comments. Their feedback will be taken into account in producing the final version of the Research Agenda.
c. The Research Agenda will be disseminated within the EU Network and will thus be both the conclusion of the IMPLEMENT project and the starting point of the EU Network as it is sustained after the project.

2. EU Network in detail

The IMPLEMENT EU Network consists of experts and expert institutes on the implementation of innovations in chronic care in all EU countries. The Network provides the respondents for the IMPLEMENT survey, comments on the draft Research Agenda, and is the first body of experts among whom the final Research Agenda will be distributed.

The IMPLEMENT EU Network will also be the forum in which a ‘gap/knowledge market’ on implementation in chronic care will be organized. Every respondent to the IMPLEMENT survey will have indicated on which Research Themes he or she seeks knowledge or is willing to share it. This ‘force field’ will be combined with the Network Directory, enabling an exchange of knowledge throughout the EU.

2.1 Network members

The EU Expert Network consists of:

Consortium Members
As founders and executors of the IMPLEMENT project, the Consortium Members and their institutes/organisations will naturally be part of the IMPLEMENT Network.

Expert Panel
25 experts were found willing to join the IMPLEMENT Expert Panel. They were interviewed in depth in order to generate the Research Themes currently being tested in the ExpertLens survey. As such, through their opinions, they are deeply embedded in the IMPLEMENT methodology. The Expert Panel also serves as a body for feedback, for instance on methodological matters, and has turned out to render the Advisory Board obsolete.

ExpertLens Groups
At this moment, Round 1 of the ExpertLens survey has taken place. More than 100 approved experts have filled in the survey, which qualifies them to join the Network once they have entered their details in the Network Directory (mandatory). For the second round of the ExpertLens survey, individual experts throughout Europe will be identified.




Figure 15 | Experts in the Network per country, as derived from the ExpertLens survey


Expert Institutes
Through the Expert Panel and ExpertLens members, the first Expert Institutes have been identified. This part of the Network will be extended in the months to come, both through targeted research and as a spin-off from the second round of the ExpertLens survey. Member recruitment will continue after the survey.

Individual Experts
Any individual expert with expertise on the implementation of innovations in chronic care, whether or not connected to an expert institute, is welcome to join the Network. Member recruitment will continue after the survey.


2.2 Underlying structure


National Expert Centers
Originally named ‘National Contact Centers’, the National Expert Centers (NECs) are pivotal in forming the Network and in sustaining it after the IMPLEMENT project ends. Their function goes beyond ‘contact’: they are intended to be the core of the Expert Network, and the name ‘National Expert Center’ reflects this function better. In addition, the new name confers more status on its holder, which is an asset in its own right (see also 7. Positive Energy Balance).

The NECs consist of institutes (or experts backed by their institute) that are authorities in their field of work and in their networks. As such they will help identify and approach experts for the EU-wide survey that IMPLEMENT will organize, enrich the Network afterwards, distribute the draft Research Agenda to ‘their’ experts and, optionally, preside over the discussions and forums organized to collect feedback on the draft Research Agenda and, later, to translate the final Research Agenda into calls for Horizon 2020 and other programmes, or into implementation paragraphs within calls.

In this way, strong national networks will arise around the NECs. Finally, the IMPLEMENT Network will connect with other networks such as EIS and the UK Implementation Network (UK-IN), in which the NECs may act as ambassadors for their countries.


Triple Helix Plus
In both the Expert Panel and the ExpertLens group, all categories of the Triple Helix Plus are represented: research, healthcare institutes, industry, government (policy), patients (associations) and professionals (associations). These categories also form the underlying structure of the expert institutes and individual experts in the Network, which supports the validity both of the feedback on the draft Research Agenda and of the Network as a whole. Experts will be sought along the lines of this classification, and the experts themselves will be asked to classify themselves accordingly in the Network Directory.





Figure 16 | Current experts in the Network per Triple Helix Plus category, as derived from the ExpertLens survey.


Potential Impact:
4.2 Use and dissemination of foreground

Achmea was responsible for planning and carrying forward the dissemination and future use of the findings through work package 5.
Section A
The following final results were delivered:
• D5.1 Website (http://eu-implement.eu/)
• D5.2 Project Updates 1, 2, 3 and 4 (uploaded)
• D5.3 Publications, media exposure and presence in social media (uploaded)
• D5.4 International conference (held 31 August – 1 September 2015 in Amsterdam)
For the purpose of these end products, the following activities were carried out:
1. Internal communication
- Facilitating internal communication among the project team members;
- Dissemination of required information and planning;
- Preparation of project updates;
- IMPLEMENT website (www.eu-implement.eu) with a blog to share information and knowledge.
2. Communication within the network
- IMPLEMENT website www.eu-implement.eu (including registration options and opportunities to share information and knowledge);
- Overview of relevant upcoming events;
- Project Updates and relevant information for distribution.

3. External communication
- Communication via social media and the Achmea network;
- Mailings to IMPLEMENT members;
- Organization of the IMPLEMENT international conference, held in Amsterdam on 1 September 2015 with an international audience of 65 attendees;
- Publication of final project results through social media (Skipr, LinkedIn, etc.), the website and direct mailings to IMPLEMENT registrants and the Achmea network.


Section B

In addition to the above, a book has been produced: a ‘glossy’ version of the Research Agenda with a bolder discussion section, made specifically for broad publication: ‘Better implementation of improvements in chronic care – European experts’ views on future research and development’. A download link to the book (http://eu-implement.eu/downloadthebook/) has been forwarded to all members of the EU Expert Network and to contacts of the IMPLEMENT consortium partners. A hard-copy version of the book has been published for dissemination to decision makers.

IMPLEMENT uses this book to give the Research Agenda the momentum it needs to be adopted by its target groups: researchers, healthcare professionals and, not least, the EU, since it is the EU that should now take up these Research Themes and translate them into policy.



Figure 17 | IMPLEMENT Book cover

List of Websites:
Scientific lead:
John Ovretveit (JØ), Director of Research and Professor of Health Innovation Implementation and Evaluation, LIME/MMC, Karolinska Institutet, Tomtebodavägen 18A, Stockholm 17177, Sweden
Tel: +46 8 524 836 00
E-mail: jovretbis@aol.com

Project manager:
Mark Bloemendaal, CEO, Implementation IQ, De Meern (the Netherlands)
e-mail: mark.bloemendaal@implementation-iq.nl

KI grant administrator: janet.jeppsson@ki.se

Project website: http://eu-implement.eu/