CORDIS - EU research results
Content archived on 2024-06-18

Identifying best practices for successful facilitation of science learning through general interest television programming

Final Report Summary - SCIENCE ON TV (Identifying best practices for successful facilitation of science learning through general interest television programming)

Project context and objectives

Although much effort is being invested in science communication training, these efforts are rarely accompanied by systematic evaluation of learning outcomes; the existing evaluations are mainly anecdotal or specific to a particular programme. Standardised assessments would allow comparisons across programmes and the identification of best practices.

This project developed and piloted a tool for measuring scientists' views about, knowledge of, and skills in science communication. The instrument collects four types of data:

- background information;
- written communication skills;
- views about science communication;
- knowledge about the context of science communication.

The resulting instrument may be used as a baseline survey or as a tool for pre-post evaluation of the learning outcomes of a wide range of science communication training programmes and courses.

Several studies have described motivations and challenges for scientists who wish to speak with the media, as well as the frequency of such interactions. However, few studies have systematically examined scientists' ability to communicate with the media and the public. Nonetheless, many organisations and institutions have created training opportunities to help scientists become better at public communication. Moreover, science communication scholars are largely in agreement that bench scientists and engineers, as well as science and health regulators, would benefit from both media training and training in communicating with the public.

In the field of education, in which I was trained, learning goals are often developed based on conceptual frames. I could not find any conceptually based list of specific learning goals in the area of media training for scientists. There are many books of practical advice for scientists, which suggest what scientists interacting with the media and the public should be doing. These books, practical and useful as they may be, are not conceptually grounded and do not lend themselves to empirical evaluation. As a researcher concerned with the issue of science communication, and frequently involved in working with scientists who wish to improve their communication with non-scientific audiences, I saw the need to establish a more coherent conceptual framework for measuring scientists' communication abilities and, through that framework, measuring the success of science communication education programmes. Currently, claims for the efficacy of such training programmes are often based on anecdotes and basic self-report evaluations. Such data do not allow comparisons between interventions, nor do they provide a basis for evidence-based planning or policy regarding media training for scientists.

Work performed

Since no list of learning objectives for science communication teaching existed, my task was twofold: first, to compile such a list of potential learning goals, and then to derive a conceptually based scheme for evaluating whether they have been achieved.

This project developed such a measurement of scientists' views about, knowledge of, and skills in the public communication of science, especially in the context of science communication education programmes. Broad learning areas were identified, and items that might measure those areas were developed and tested. The development of the instrument was guided by the existing literature and by interviews with active scientists, which served to establish face validity and test/retest reliability. The instrument was piloted in various contexts: in two languages; in semester-long courses and in one- and two-day interventions; and with undergraduate and graduate science and engineering students as well as with practising scientists.

Main results

The main outcome of this project is the assessment tool and the analytical framework for understanding the results it produces. This includes the identification or creation of standard measures, so that assessment of learning can be compared across settings. Standard measures are important because they move beyond self-reports, and because they have been validated in multiple settings. The assessment tool may be used as a baseline survey with a representative sample of a particular population or as a tool for pre-post evaluation of the learning outcomes of a wide range of science communication training programmes and courses.

The tool provides a detailed scheme for analysing written communication skills, drawn from a variety of content-analysis and assessment traditions, and includes novel measures such as a jargon index. The analysis performed pointed to the importance of assessing actual performance rather than declared knowledge. It suggested potential areas for further investigation, such as the frequent mismatch between what scientists say they know about science communication and what they actually do; the limited use of narrative, analogy and metaphor, despite their prominence in books of practical advice; and a heavy emphasis on content combined with a failure to present the nature of science or its connections to everyday life.
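The report does not specify how the jargon index is computed. One plausible formulation, sketched below purely as an illustration, scores a text by the share of its words that fall outside a general-vocabulary list; the word list and example sentences here are assumptions, not the project's actual instrument, which would rely on a large frequency-ranked vocabulary of everyday language.

```python
import re

# Illustrative stand-in for a general-vocabulary list; a real jargon index
# would draw on a large frequency-ranked corpus of everyday language.
COMMON_WORDS = {
    "the", "a", "an", "of", "in", "to", "and", "is", "are", "that",
    "cell", "cells", "grow", "number", "copy", "copies", "copying",
    "themselves", "by", "dividing",
}

def jargon_index(text: str, common_words: set[str] = COMMON_WORDS) -> float:
    """Fraction of word tokens that do not appear in the common-word list."""
    tokens = re.findall(r"[a-z]+", text.lower())
    if not tokens:
        return 0.0
    jargon = [t for t in tokens if t not in common_words]
    return len(jargon) / len(tokens)

# A technical phrasing scores higher than a plain rewording of the same idea.
technical = "Somatic cells replicate via mitosis after chromosomal duplication."
plain = "Cells grow in number by dividing and copying themselves."
assert jargon_index(technical) > jargon_index(plain)
```

Such a measure would let pre- and post-training writing samples be compared on a single numeric scale, which is consistent with the report's emphasis on assessing actual performance rather than declared knowledge.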

Conclusions

If learning the discourse of science is essential to becoming a scientist, learning the discourse of public communication of science is essential for scientists engaging with the public. For this purpose, scientists may need to 'unlearn' the communication habits they have acquired as scientists, a process that is most likely to take place in sociocultural environments that value such practices. Unfortunately, the two discourses sometimes work against each other: one rewards jargon, the other penalises it; one rewards precision, the other accepts approximation; one rewards quantification, the other rewards storytelling and anecdotes. Using some of the criteria developed in this project, it may be possible to document changes in scientists' ability to shape their messages in ways that will reward them in the public arena, thus promoting informed and meaningful engagement with the public on science.

Socio-economic impact of the project

This research and its outcomes are relevant to researchers and practitioners (science journalists, media trainers, information officers) within the science communication community, as well as to scientists, government agencies and educational organisations who are involved in promoting public engagement with science.