CORDIS - EU research results

E2-CREATE: Encoding Embodied Creativity (Visual arts, performing arts, film, design)

Periodic Reporting for period 1 - E2-CREATE (Encoding Embodied Creativity: Visual arts, performing arts, film, design)

Reporting period: 2020-04-06 to 2022-04-05

Issue being Addressed

Dance represents a rich resource of bodily expertise that is exciting and challenging for other scientific and artistic domains to draw from. E2-Create addresses this challenge by providing generative approaches to facilitate the exchange between dance and computer-based art. E2-Create places a strong focus on the combination of software development and artistic creation informed by recent progress in dance digitisation, machine learning (ML), and generative art.

Importance for Society

While dance plays an important role in society, its role as an artistic form of investigation that relies on embodied creativity remains under-appreciated. Furthermore, dance, as a complex artform that is difficult to record, constitutes a challenging but rich domain for digital technology. Because of this, dance can contribute to future developments in digital technology, which in turn impacts society at large.
E2-Create highlights how tightly dance, as embodied creativity, and digital technology practices can be intertwined to achieve a high level of mutual exchange between the two for research, development, and creation. This has the potential to increase public awareness of embodied forms of knowledge and of their importance for the research and development of digital technology.

Overall Objectives

E2-Create has four main objectives:
1. Gain an understanding of principles of embodied creativity.
2. Evaluate existing ML models and computer simulations for their suitability in dance.
3. Develop new ML models and computer simulations for dance creation.
4. Disseminate project results among artists, scientists, students and the general public.
Training

Interviews were conducted with professional choreographers about choreographic methods. Motion capture recordings and interviews were conducted with professional dancers to study forms of movement representations.

Survey

Two literature surveys were conducted, one on generative approaches in the field of dance and the other on methods in computer vision and ML for motion analysis and synthesis.

Development

Two development tracks were undertaken. The first track dealt with the analysis of dance and included the development of wearable sensors and software for analysing motion capture data. The second track dealt with the creation of synthetic forms of dance through ML and computer simulation and included the development of:
- "Granular Dance", a co-creative ML tool for choreographers;
- "Puppeteering AI", an ML-based interactive artificial dancer;
- "RAMFEM", a motion-to-raw-audio translation ML system;
- "Expressive Aliens", an ML system for creating expressive movements for non-anthropomorphic morphologies;
- "Strings", a simulation for creating generative instruments based on the principle of vibrating strings; and
- "MOQUAM", a simulation for translating idiosyncratic movement qualities of robotic moving lights.
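The report does not detail the internals of these systems. As one illustration of the principle behind "Strings" (generative instruments based on vibrating strings), the following is a minimal sketch of a plucked string simulated with an explicit finite-difference scheme for the 1D wave equation; all function names and parameter values are hypothetical and not taken from the project.

```python
import numpy as np

def simulate_string(n=100, steps=200, c=1.0, dt=0.005, dx=0.01, pluck=0.3):
    """Simulate a plucked string via the 1D wave equation (explicit finite differences)."""
    r2 = (c * dt / dx) ** 2  # squared Courant number; must be <= 1 for stability
    u = np.zeros(n)
    # Initial triangular "pluck" displacement
    peak = int(pluck * n)
    u[:peak] = np.linspace(0.0, 1.0, peak)
    u[peak:] = np.linspace(1.0, 0.0, n - peak)
    u[0] = u[-1] = 0.0  # fixed string ends
    u_prev = u.copy()
    signal = []
    for _ in range(steps):
        u_next = np.zeros(n)
        # Standard leapfrog update of the interior points
        u_next[1:-1] = 2*u[1:-1] - u_prev[1:-1] + r2*(u[2:] - 2*u[1:-1] + u[:-2])
        u_prev, u = u, u_next
        signal.append(u[n // 4])  # "pickup": read displacement at one point
    return np.array(signal)

wave = simulate_string()
```

Reading the displacement at a fixed "pickup" point over time yields an audio-like signal, which is the basic mechanism a string-based generative instrument can build on.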

Creation

The concert "Strings P" was made in collaboration with a violinist and computer musician. This concert employs one acoustic and two generative instruments.
The dance installation "Artificial Intimacy" was made in collaboration with a choreographer and computer musician. This installation stages duets between a human and an artificial dancer.
The dance performance entitled "Embodied Machine" was made in collaboration with a choreographer, computer musician, costume designer, light designer, and makeup designer. This production abstracts idiosyncratic movement qualities through live motion capture, generative simulations, and sound synthesis algorithms.

Datasets

Two dance datasets were released. These datasets are exemplary because they complement quantitative data with the subjective reflections of the participating dancers or choreographers.

Development

“Granular Dance” demonstrates how ML can be combined with a sequence blending method inspired by computer music. This method extends the creative possibilities of motion synthesis beyond the state of the art (SoA).
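The actual blending method of “Granular Dance” is not described in this report. As an illustration of what a granular, computer-music-inspired sequence blend can look like, the sketch below overlap-adds windowed "grains" drawn alternately from two pose sequences, in the manner of granular audio synthesis; the function and data are hypothetical.

```python
import numpy as np

def blend_grains(seq_a, seq_b, grain=16, hop=8):
    """Granular-style blend: overlap-add windowed 'grains' drawn alternately
    from two motion sequences (frames x joint values)."""
    n = min(len(seq_a), len(seq_b))
    out = np.zeros((n, seq_a.shape[1]))
    norm = np.zeros((n, 1))
    window = np.hanning(grain)[:, None]  # smooth envelope avoids hard cuts between grains
    sources = [seq_a, seq_b]
    for i, start in enumerate(range(0, n - grain + 1, hop)):
        src = sources[i % 2]  # alternate the source sequence per grain
        out[start:start + grain] += window * src[start:start + grain]
        norm[start:start + grain] += window
    return out / np.maximum(norm, 1e-8)  # normalise by accumulated window weight

# Two toy "motion" sequences: 64 frames x 3 joint angles
a = np.sin(np.linspace(0, 4 * np.pi, 64))[:, None] * np.ones(3)
b = np.cos(np.linspace(0, 4 * np.pi, 64))[:, None] * np.ones(3)
blended = blend_grains(a, b)
```

Because each output frame is a window-weighted average of the input frames, the result stays within the range of the source motions while interleaving material from both.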
“Puppeteering AI” introduces two novel methods for a human dancer to control an artificial dancer in real time. These methods go beyond the SoA since they provide an interaction that is more intuitive than direct control of an ML model.
RAMFEM is the first ML system that translates motion data into raw audio. This approach goes beyond the SoA since it makes it possible to automatically realise motion sonification systems that reflect the idiosyncratic approaches of individual dancers.
Expressive Aliens combines reinforcement learning, expressive movement descriptors, and a physics simulation. This approach goes beyond the SoA since it can be used to create expressive movements for artificial characters with arbitrary morphologies.
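The project's actual descriptors and reward design are not published in this report. Purely as an illustration of how expressive movement descriptors can enter a reinforcement-learning reward, the toy function below scores a simulated trajectory by how closely a few simple, loosely Laban-inspired descriptors match target values; all names, descriptors, and weights are hypothetical.

```python
import numpy as np

def expressive_reward(positions, target_qualities):
    """Toy reward combining simple expressive movement descriptors.
    positions: (frames x joints x 3) trajectory from a physics simulation."""
    vel = np.diff(positions, axis=0)
    speed = np.linalg.norm(vel, axis=-1)        # per-frame, per-joint speed
    acc = np.diff(speed, axis=0)
    descriptors = {
        "energy": speed.mean(),                 # overall movement intensity
        "suddenness": np.abs(acc).mean(),       # jerkiness vs. sustainment
        "spread": positions.std(axis=1).mean(), # spatial expansiveness across joints
    }
    # Reward is highest when each descriptor hits its target quality
    return -sum((descriptors[k] - t) ** 2 for k, t in target_qualities.items())

# Random-walk stand-in for a physics-simulated trajectory: 50 frames, 8 joints
traj = np.cumsum(np.random.default_rng(0).normal(size=(50, 8, 3)) * 0.1, axis=0)
r = expressive_reward(traj, {"energy": 0.5, "suddenness": 0.1, "spread": 1.0})
```

Because the reward depends only on trajectory statistics rather than on a specific skeleton, the same scheme applies to characters with arbitrary morphologies, which is the point the text makes.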

Creation

The creative productions go beyond the SoA because they integrate the skills and aesthetic interests of professional performers into the development of simulation-based generative instruments ("Strings P") and an ML-based artificial dancer ("Artificial Intimacy"), and because they combine live motion capture, idiosyncratic choreographic principles, and simulation-based behaviours ("Embodied Machine").

Impacts

Art

E2-Create makes its main impact on the artistic fields of Dance and Technology, Creative Coding, and Generative Art.
Practitioners in Dance and Technology employed software and sensors to translate their expressivity into music and light and to choreograph and rehearse with artificial dancers.
Creative Coders employed ML models to develop generative musical instruments and interactive systems that detect or generate dance movements.
Practitioners in Generative Art were provided with generative systems that illustrate how bodily creativity can be abstracted and how ML techniques and traditional generative methods can be combined.

Science

E2-Create makes its main impact on the academic fields of Movement and Computing, Human Computer Interaction, and Computational Creativity.
Scientists in Movement and Computing were provided with motion capture recordings of professional dancers, with procedures for deriving higher level movement qualities, and with generative methods for simulating these qualities.
Scientists in Human Computer Interaction were provided with methods for establishing intuitive forms of interaction with ML models and with sensors for exploiting minute body movements as interaction modality.
Scientists in Computational Creativity were provided with an ML model that paves the way for future research on how ML benefits from the creativity employed by dancers.

Public

Through performances and process documentations, the audience learns how digital technology is adopted and developed in dance productions, and how generative methods unite technical and artistic ideas.
Through workshop showings, the audience encounters how sensors, sonification, and ML foster creative experimentation in dance, and how dance provides a context and inspiration for artists who work with ML and generative methods.
Through public panel discussions, the audience is informed about the potentials and challenges of artists employing AI for realising works, of teaching AI to artists, and of artists contributing to the development of AI.
Dance Performance "Embodied Machine"
Audiovisual Concert "Strings P"
Dance Installation "Artificial Intimacy"