RIMM is a trial of contemporary technologies for the generation of multimedia content and semi-immersive performance environments. It will enable interactive, multi-dimensional user control of high-performance computer technologies. The project will investigate the application of these technologies through the realisation of an original composition, a Hyper-concerto for Soprano Saxophone and Multiple Media, uniting computer music technologies, surround sound and computer graphics generation. RIMM will devise an appropriate combination of hardware and software technologies to place all elements of the composition under the direct control of the performer. The project will use technologies to track and sense the natural gestures and movements of a saxophonist and map this information to software signal processing applications and object-oriented sound and graphics generation programs.
The project will evaluate the application of new human-computer interaction (HCI), sound and graphics technologies in the context of a practical application. The sophisticated motor skills developed over many years by an instrumentalist will be applied to the control of high-performance computers and software: a technological development of an acoustic instrument to produce a hyper-instrument, enabled by new technologies and controlled by multi-parametric human interaction. Contemporary computers have made this possible through architectures with sufficient power to feed back the results of users' actions in a truly interactive manner. RIMM's aim is to apply these technologies (specifically a Silicon Graphics Origin 2000 parallel high-performance computer, Spatialisateur and jMax) in a performance context, and to determine whether this type of technology is appropriate for interactive multimedia performance and content generation.
DESCRIPTION OF WORK
The RIMM trial will be conducted over a six-month period. The software supplier (IRCAM, the Institute for Research and Coordination in Acoustics/Music) will customise its applications for use by the realisation team (the users, at the University of York) and will provide full technical support during the trial. The timetable for the project is divided into four key stages:
1.) The assessment, application and customisation of sensing technologies to provide an appropriate combination of input devices for use with Spatialisateur (Spat.) and jMax in the realisation of the project's technical and compositional goals;
2.) The customisation of Spat. to provide an appropriate Ambisonic B-format output;
3.) An intensive rehearsal period, in which the public performances of the work will be prepared and the composition and technologies assessed and modified according to their ability to provide suitable interactivity;
4.) The dissemination of outcomes and public performances of the work at SIM (Staatliches Institut für Musikforschung PK), Berlin.

RIMM will combine the following technologies: a high-performance parallel computer (HPC), and two leading-edge computer music software applications, jMax and Spat., both developed by IRCAM and applied in new versions customised for HPC by RIMM. These technologies will be combined with a variety of existing sensing technologies (including resistive, capacitive, ultrasonic and video tracking systems) to implement multi-dimensional parameter controls for sound and graphics manipulation and generation. An important aspect of the trial will be assessing whether the technology applied can produce adequate responses to performance gestures, to determine if true control intimacy can be achieved within human interactive time-scales, and to bring the life of human performance to the control of multiple-media digital art.
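The Ambisonic B-format output mentioned in stage 2 can be illustrated by the standard first-order encoding equations, which derive four channels (W, X, Y, Z) from a mono source and its direction. The sketch below is purely illustrative and uses the classic Furse-Malham weighting of the W channel; it does not represent Spat.'s actual interface or internal implementation.

```python
import math

def encode_b_format(sample, azimuth, elevation):
    """Encode one mono sample into first-order Ambisonic B-format.

    azimuth and elevation are in radians; a source straight ahead
    at ear level has azimuth = 0, elevation = 0. Illustrative only,
    not Spat.'s API.
    """
    w = sample / math.sqrt(2.0)                            # omnidirectional component
    x = sample * math.cos(azimuth) * math.cos(elevation)   # front-back
    y = sample * math.sin(azimuth) * math.cos(elevation)   # left-right
    z = sample * math.sin(elevation)                       # up-down
    return w, x, y, z

# A unit sample from straight ahead lands entirely in W and X:
w, x, y, z = encode_b_format(1.0, 0.0, 0.0)
```

Because the four channels encode direction independently of any loudspeaker layout, a B-format stream can be decoded to different surround configurations at the performance venue, which is why it suits a touring work such as this.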
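The multi-dimensional parameter control described above ultimately reduces to mapping raw sensor readings onto the value ranges expected by the sound and graphics software. A minimal sketch of such a mapping is shown below; the function name, the sensor range and the target parameter are all hypothetical examples, not jMax or Spat. objects.

```python
def map_range(value, in_lo, in_hi, out_lo, out_hi, curve=1.0):
    """Rescale a raw sensor reading into a control-parameter range.

    The reading is clamped to [in_lo, in_hi], normalised to 0..1,
    optionally shaped with a power curve (curve != 1.0 gives a
    non-linear response), and scaled into [out_lo, out_hi].
    """
    value = min(max(value, in_lo), in_hi)          # clamp to sensor bounds
    norm = (value - in_lo) / (in_hi - in_lo)       # normalise to 0..1
    return out_lo + (norm ** curve) * (out_hi - out_lo)

# Hypothetical example: an ultrasonic distance reading of 102.5 cm,
# from a sensor ranging 5-200 cm, driving a reverb amount in 0..1:
reverb = map_range(102.5, 5.0, 200.0, 0.0, 1.0)
```

The `curve` exponent matters for control intimacy: a performer's perception of many musical parameters is non-linear, so a shaped mapping often feels more responsive than a purely linear one.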
Funding Scheme: BUR - Bursaries, grants, fellowships