Periodic Reporting for period 1 - MILC (New system for Automatic Music identification of live events and cover versions)
Reporting period: 2017-09-01 to 2018-08-31
The objective of the MILC project is to set the basis for a new, innovative, world-leading music monitoring service for public venues and online platforms, investigating and developing technologies that would allow BMAT to offer a solution to these issues and thus become the only provider internationally capable of monitoring and identifying music on TV, radio, venues, live concerts and the internet. The success of MILC will establish BMAT as a reference for collective management organisations (CMOs) while contributing to the transparency, efficiency and impartiality of the management and administration of royalties. BMAT is today the world-leading and fastest-growing company in music monitoring services for CMOs, leading the international market of TV and radio monitoring with VERICAST. With MILC, BMAT could become the leading global data provider for rights distribution. To achieve this, MILC will focus on the following objectives:
1) exploring improvements to the fingerprinting technologies used in BMAT's recognition services to tackle DJ manipulation of audio,
2) reducing the complexity of the current prototype algorithms for identifying song re-interpretations and versions (covers),
3) exploring the technical feasibility of combining both in a single fingerprint-extraction algorithm suitable for BMAT's portfolio of monitoring services.
The objective of adapting BMAT’s re-interpretation identification algorithm to large-scale music retrieval (millions of songs), by reducing the extraction to local, indexable musical information of song segments instead of entire songs, has resulted in: improving the maintainability of the covers project; implementing exact start/end times for cover identifications; reducing the sample rate needed for cover detection; developing a new streaming mode for cover detection, where the input audio can be a long recording containing multiple covers (e.g. a concert recording) instead of a single song; and researching and experimenting with indexing solutions for cover detection (mainly based on locality-sensitive hashing, LSH) to improve the scalability of the retrieval.
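The LSH-based indexing mentioned above can be illustrated with a minimal random-hyperplane sketch. This is purely a sketch under assumptions: it supposes that song segments are represented as fixed-length feature vectors, and the dimensions, hash length and bucket layout are hypothetical placeholders, not BMAT's actual implementation.

```python
import numpy as np

# Illustrative random-hyperplane LSH index for segment retrieval.
# DIM and N_BITS are assumed values, not taken from the project.
rng = np.random.default_rng(0)
DIM, N_BITS = 12, 16
planes = rng.normal(size=(N_BITS, DIM))   # random hyperplanes

def lsh_key(vec):
    """Hash a segment feature vector to a compact binary bucket key."""
    return tuple((planes @ vec > 0).astype(int))

# Index: bucket key -> list of (song_id, segment_index)
index = {}

def add_segment(song_id, seg_idx, vec):
    index.setdefault(lsh_key(vec), []).append((song_id, seg_idx))

def candidates(query_vec):
    """Retrieve candidate segments sharing the query's bucket;
    only these candidates need a full (expensive) comparison."""
    return index.get(lsh_key(query_vec), [])

# Toy usage: index one segment, then query with the same vector.
v = rng.normal(size=DIM)
add_segment("song_A", 0, v)
print(candidates(v))
```

The point of the sketch is the scalability gain: a query touches only one bucket rather than every song in a multi-million-track catalogue, at the cost of some recall that real systems recover with multiple hash tables.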
As for the integration of the algorithms developed for re-interpretation identification with a modification-robust fingerprinting algorithm, we worked on a prototype of a new BMAT product that combines audio fingerprinting and cover detection. Given a collection of audio recordings, the product aims to build a graph where the nodes are the recordings and the edges show the relationships between them.
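The graph described above can be sketched as follows. This is an illustrative outline only: the `fingerprint_match` and `cover_match` predicates are hypothetical stand-ins for the fingerprinting and cover-detection components, and the catalogue records are invented toy data.

```python
from itertools import combinations

def fingerprint_match(a, b):
    """Placeholder for the fingerprinting component: True when a and b
    are the same recording (here modelled as equal fingerprint ids)."""
    return a["fp"] == b["fp"]

def cover_match(a, b):
    """Placeholder for the cover-detection component: True when a and b
    are versions of the same musical work."""
    return a["work"] == b["work"]

def build_graph(audios):
    """Nodes are audio ids; edges carry the detected relationship type."""
    edges = []
    for a, b in combinations(audios, 2):
        if fingerprint_match(a, b):
            edges.append((a["id"], b["id"], "same_recording"))
        elif cover_match(a, b):
            edges.append((a["id"], b["id"], "cover"))
    return edges

# Toy catalogue: t2 duplicates t1's recording; t3 covers the same work.
catalogue = [
    {"id": "t1", "fp": "x1", "work": "w1"},
    {"id": "t2", "fp": "x1", "work": "w1"},
    {"id": "t3", "fp": "x2", "work": "w1"},
]
print(build_graph(catalogue))
```

Such a graph makes catalogue cleaning concrete: clusters connected by "same_recording" edges flag duplicates to merge, while "cover" edges group distinct recordings of one work.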
Pre-processing the audio has shown great potential for dealing with scenarios where the background music is very quiet. Given these results, this technology has been used experimentally in all pilots carried out by BMAT since then.
Covers technology: it has already been tested in many pilots that required live/cover detection, e.g. generating concert setlists and identifying non-authorised covers on YouTube.
Integrated product (fingerprinting + covers): actively developed and currently used to clean the catalogue of one of BMAT's key clients.