
Interactive Research in Music as Sound: Transforming Digital Musicology

Periodic Reporting for period 3 - IRiMaS (Interactive Research in Music as Sound: Transforming Digital Musicology)

Reporting period: 2020-10-01 to 2022-03-31

IRiMaS is a five-year project (2017-2022) funded by a €2.5m European Research Council Advanced Grant. This ground-breaking project aims to harness the potential of twenty-first-century technology to research interactive aural approaches to music analysis, and to musicology more broadly. Building on earlier work, the project is devising generic software tools and specific case studies to develop and demonstrate the potential of working with software that facilitates direct engagement with music as sound as part of musicological research. The project significantly expands ideas and techniques previously explored in the context of electroacoustic music in the AHRC-funded project TaCEM (Technology and Creativity in Electroacoustic Music) and in earlier work on Interactive Aural Analysis. It aims to develop this approach further and make it applicable across the whole range of different musics.

In an age when technology is revolutionizing the ways in which music is made and distributed, the IRiMaS project aims correspondingly to transform approaches to musicology, moving from a primarily fixed, text-based approach to one that incorporates the interactive and the aural as integral features. It brings skills and expertise from music technology to assist in the development of research strategies and software that enable musicological research to engage more directly with sound. It will pioneer a ground-breaking approach to music research in which dynamic interaction with sound is fundamental and music’s temporal and transient nature is central to research investigations and their dissemination. This presents a significant conceptual challenge to the traditional textual bias of much musical research and leads to new, enhanced modes of musicological knowledge. Unlike much current Digital Musicology, which uses software primarily to extract quantized data, IRiMaS takes interactive engagement with sound as the foundation for research.

The project focuses on three Case Studies, developing the interactive aural approach in specific areas where prioritizing the aural is of particular significance. The Case Studies focus on Spectralism in the field of Contemporary ‘Art’ Music; Tracking the Creative Process in Free Improvisation; and Folk Songs in Performance in the field of Ethnomusicology. Work is also being undertaken on how such an interactive aural approach might assist in investigating traditionally notated Western music, giving more prominence to elements such as timbre, texture, articulation and dynamics. Each study aims to produce substantial ground-breaking musicological outcomes in the form of software packages (as is appropriate to the nature of the project) and associated articles discussing the approach taken. The Case Studies also assist in working towards the development of models and generic tools to help establish the wider adoption of this interactive approach. IRiMaS builds on the PI's previous experience in related areas, in particular his work in developing an interactive aural approach to the analysis of electroacoustic music. The project brings together researchers from a variety of musicological and technical backgrounds to help realize these ambitious aims.
The project is structured in three phases. In the first phase (M1-12), initial exploratory musicological research was undertaken and the basic foundations for the software were developed. At the start of the second phase (M13-48), three postgraduate students joined the project and work began on its three major Case Studies. A substantial part of the development of the software toolbox is also taking place during this phase. Throughout, we have continued to foster ‘blue-sky’ thinking and, as a team, have read widely and discussed innovative ideas.

We are currently midway through this second phase, and an exciting and diverse range of musicological topics is being pursued in the areas of ethnomusicology, free improvisation and contemporary spectral music. The software is still (as planned) under development, with a succession of prototypes being trialled by musicologists whose feedback to the programmers aids further development in an ongoing iterative process. The software, called TIAALS (Tools for Interactive Aural Analysis), is designed as a toolbox for use by any musicologist, including those with no special expertise in computing. A key innovative feature of this software is the way a variety of tools and functions are brought together in a coordinated environment so as to facilitate the creation of analytical presentations that seamlessly integrate audio and visual components with textual content, and to enable both analysts and their readers to play interactively with these resources as a way of generating new knowledge about the music. In this way, an interactive and aural approach to music analysis (in its broadest sense) is foregrounded.

For the most part, the algorithms used are not new; the major innovative feature is the way they are integrated into a unified and easy-to-use software environment. Another feature, included in the original proposal but becoming more significant as the project progresses, is the way that such software can foster an approach to analysis that is more participatory and interactive in terms of the relationship between analysts and their readers. The interactive aural approach, using software, makes it possible for analysts, rather than producing fixed results printed on paper, to create environments where ‘readers’ can play with the music and analytical ideas, developing their knowledge of the music through active aural engagement. To date, the TIAALS toolbox includes facilities to load sound files, segment them, make and manipulate sonograms, load video files, segment and annotate them, add textual commentary and make analytical charts. With both audio and video, a key feature is the potential for musicologists to use these tools to create comprehensive interactive analytical environments.
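TIAALS itself is a standalone graphical application and its implementation is not described in this report; the following is only a minimal Python sketch of the kind of operations just listed (loading a sound file, computing a sonogram, and defining segments for audition), using the librosa and soundfile libraries. The file name and segment labels are hypothetical.

```python
# Illustrative sketch only: not TIAALS code, just the kind of operations it exposes.
# Assumptions: librosa, soundfile and matplotlib are installed; "example.wav" is a
# hypothetical local recording.
import librosa
import librosa.display
import numpy as np
import soundfile as sf
import matplotlib.pyplot as plt

# Load the recording at its native sample rate.
y, sr = librosa.load("example.wav", sr=None)

# Compute a sonogram (log-magnitude short-time Fourier transform).
stft = librosa.stft(y, n_fft=2048, hop_length=512)
sonogram = librosa.amplitude_to_db(np.abs(stft), ref=np.max)

# Plot the sonogram for visual inspection and annotation.
fig, ax = plt.subplots(figsize=(10, 4))
librosa.display.specshow(sonogram, sr=sr, hop_length=512,
                         x_axis="time", y_axis="log", ax=ax)
ax.set_title("Sonogram of example.wav")
plt.savefig("sonogram.png")

# Define analytical segments as (label, start, end) in seconds and
# export one of them so it can be auditioned alongside the chart.
segments = [("opening gesture", 0.0, 4.5), ("response", 4.5, 9.0)]
label, start, end = segments[0]
sf.write(f"{label}.wav", y[int(start * sr):int(end * sr)], sr)
```

In TIAALS these operations are, of course, carried out graphically and interactively rather than as a script; the sketch is intended only to make the underlying operations concrete.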

The structure of the project means that its major outputs are scheduled for near the end of the project, but we have been presenting our work in progress at many major international conferences. So far, we have presented at 14 conferences and had another 9 papers accepted for conferences in 2020 (many now delayed), covering a range of musicological and technical aspects of the project, including: the 2019 International Computer Music Conference in New York; the 2019 Society for Music Analysis conference in Southampton and its Summer School; the 2019 Society for Music Theory conference in Ohio; the Analytical Approaches to World Music conferences in Birmingham (2019) and Paris (2020, postponed to 2021); the British Forum for Ethnomusicology Annual Conference (2020, now 2021); the 2019 Spectralisms conference at Ircam, Paris; the Performing Studies Network Conference 2020 (now 2021); and two papers accepted for the 2020 European Music Analysis Conference (EuroMAC) scheduled for September in Moscow.
New methodologies were central to the original IRiMaS proposal and remain so. Traditionally, the results of musicological inquiry have usually been printed on paper: verbal accounts with the addition of charts and diagrams (Hans Keller’s functional analyses and Heinrich Schenker’s graphical representations are rare exceptions). This was perhaps inevitable in the past, but technology now enables us to incorporate direct engagement with aural and visual material into both the methodology of music research and the presentation of its findings. Musicologists have increasingly been linking their texts to online audio examples, but this is far from the level of integration and interaction that we are seeking to develop. Researchers have also increasingly employed software to extract statistical data from scores or recordings, and to work with Big Data from across large repertoires, using AI and other algorithms to discover new trends and features. All this can indeed be valuable, but the originality of IRiMaS is to use computers in a different way (though we may incorporate aspects of some of these approaches): IRiMaS seeks to use software so that musical sound, and the experience of hearing this sound, can become central to the research methodology and its presentation. Musical sound is continuous, flowing and temporal, whereas scores, words and data extracted from recordings are static and discrete; whilst they may provide useful insights, isolated from sound they may also distort or limit the investigation.

So, building on earlier work relating to the analysis of electroacoustic music, IRiMaS aims to expand an innovative ‘interactive aural’ methodology for music analysis to a much wider repertoire. In doing so we are focusing especially on those musics that do not fit so comfortably with traditional Western notation: for example, improvised music, oral/aural traditions in world music, and contemporary music in which the use of extended playing techniques and a concern with timbre stretch traditional notation to its limits. These areas underpin the three main Case Studies for the project, but our research is also incorporating some traditionally notated music and investigating whether our approach might facilitate deeper consideration of aspects of such repertoire that are often neglected, such as timbre, texture, articulation and dynamics.
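To make the methodological contrast concrete, the sketch below (an illustration only, not project code) shows how a ‘quantized data’ workflow reduces a recording to a summary statistic, while an interactive aural workflow keeps the sound itself available for freely chosen, repeated listening. It assumes the librosa and sounddevice libraries and a hypothetical recording "folk_song.wav".

```python
# Illustrative contrast only (not IRiMaS code).
# Assumes librosa and sounddevice are installed; "folk_song.wav" is hypothetical.
import librosa
import numpy as np
import sounddevice as sd

y, sr = librosa.load("folk_song.wav", sr=None)

# 1) Data-extraction style: the recording is reduced to a discrete number,
#    detached from the experience of listening.
centroid = librosa.feature.spectral_centroid(y=y, sr=sr)
print(f"Mean spectral centroid: {float(np.mean(centroid)):.1f} Hz")

# 2) Interactive-aural style: any region the analyst (or reader) chooses can be
#    auditioned, as often as needed, while an interpretation is formed.
def audition(start_s: float, end_s: float) -> None:
    """Play back an arbitrary excerpt of the recording."""
    sd.play(y[int(start_s * sr):int(end_s * sr)], sr)
    sd.wait()

audition(12.0, 18.5)  # listen to a freely chosen passage
```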

Such an approach also encourages the integration of areas of music study that sometimes remain separate: for example, analysis, music history, performance studies, music psychology, world music and ethnography. Because our software tools facilitate the creation of software packages that can incorporate audio and video recordings of performances, interview recordings, interactive analytical charts in sound and so on, it is much easier to bring different areas of study into dialogue with each other. We are already seeing some examples of this.
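As a purely hypothetical illustration of how such heterogeneous materials might be bundled together (this is not the TIAALS file format), a simple manifest could group performance recordings, interviews, charts and commentary into a single analytical package:

```python
# Purely illustrative sketch: one possible way to gather heterogeneous study
# materials into one analytical "package". All file names are hypothetical.
import json

package = {
    "title": "Folk song in performance: a worked example",
    "media": [
        {"role": "performance_audio", "file": "performance_take1.wav"},
        {"role": "performance_video", "file": "performance_take1.mp4"},
        {"role": "interview", "file": "singer_interview.wav"},
    ],
    "charts": [
        {"type": "sonogram", "source": "performance_take1.wav",
         "annotations": [{"time": 32.4, "text": "ornamented cadence"}]},
    ],
    "commentary": "Analytical notes linking the interview to moments in the performance.",
}

# Write the manifest so audio, video, charts and commentary travel together.
with open("analysis_package.json", "w", encoding="utf-8") as fh:
    json.dump(package, fh, indent=2)
```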

Another innovation is that, whereas much music software requires specialist knowledge, the IRiMaS software is intended to be usable by any musicologist with basic computing skills. This is important because our ambition is that these tools will become a standard approach, not just a niche sub-discipline.

Such software also has the potential to change the working relationship between the musicologist and their audience. If the materials are interactive there is more potential, more encouragement, for readers to explore for themselves, to develop additional or alternative interpretations of the music. A leading music researcher, Kofi Agawu, has promoted the idea of music analysis as ‘a mode of composition or a mode of performance’. We aim for our interactive software to encourage a creative, performative approach to music analysis both by the researchers and their audiences. This is something we already see in some work we have completed and something that seems to be growing in importance as IRiMaS evolves.
Images accompanying the report:
- TIAALS software: linking scores to sounds to waveforms
- TIAALS: mapping audio and video materials
- TIAALS: dynamic highlighting and annotation of videos and sound
- TIAALS: new dynamic visualisations of sound
- TIAALS: creating analytical charts from sonograms and waveforms