
News

Metadata labelling for multimedia content

Remember that photo you took with your digital camera in Spain last summer? Now try to find it. Identifying and retrieving the right multimedia content on your PC or company server can be like panning for gold: the nuggets are few and far between. The IST project aceMedia is developing a solution.

With affordable digital cameras, camcorders and camera phones becoming ubiquitous, consumers increasingly face mountains of unorganised multimedia content. The vast majority of this data is not clearly identified, making finding and correctly annotating it an unappealing and time-consuming task. By some estimates, as much as 70 per cent of the multimedia content people produce themselves or obtain from other sources is never viewed again; instead it is left to languish in assorted folders on PC hard drives. If this content could be found and retrieved more easily, or even categorised intelligently and grouped with new content, it would undoubtedly be accessed and shared more readily.

This is the point of the aceMedia system, which uses advances in knowledge, semantics and multimedia processing technologies to support self-analysing, self-annotating and self-adapting content. “People don’t want to have to spend time managing their content manually, they just want to be able to view it whenever and however they want,” says Paola Hobson, the aceMedia project coordinator. “For that to happen, multimedia content needs to become intelligent.”

The aceMedia system, which is being developed under the IST project until December 2007, relies on content pre-processing, image recognition and knowledge analysis tools to supply metadata annotations for static and moving images, and for specific parts of those images. A photo of a crowded beach, for example, could be automatically annotated with references to the beach, sea, sky and bathers. Once the photo carries this metadata describing its content and its components, entering any of the annotated words, similar words or combinations of them into a search engine would identify the photo. In addition, by drawing on the profiles of different users, the system could return images personalised to the interests and preferences of the person doing the search.

The system is built around the concept of an Autonomous Content Entity (ACE). Each ACE consists of three technological layers: a content layer that is scalable and adapts the content to the user's device and how it is viewed; a metadata layer that carries out the semantic analysis and annotation; and an intelligence layer that provides programmability and allows the ACE to act autonomously (see the simplified sketch below). To address privacy concerns, users can also restrict who may access the content on different devices.

In the case of videos, different sections could be annotated according to what is being shown, and individual segments could easily be extracted and viewed. This 'smart' content can then be adapted by the system for display on a variety of devices. “So far we’ve developed the technology for sharing from PCs to televisions via a set-top box. We also plan to bring mobile phones into the equation, given that today they are not only content-viewing but also content-creating devices,” Matellanes says.

Trials with end users are being carried out throughout the course of the project, and have so far drawn positive feedback from both members of the general public and professionals. “People really want and need the capabilities this system offers,” Matellanes says.
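To make the idea concrete, the short Python sketch below mimics the ACE layering and metadata-driven retrieval described above. It is purely illustrative: the class, field and function names (ACE, renditions, annotations, search and so on) are invented for this example and are not taken from the aceMedia project's actual software.

# Illustrative sketch only: a toy model of the ACE concept, not aceMedia code.
from dataclasses import dataclass, field
from typing import Dict, List, Set


@dataclass
class ACE:
    """A toy 'Autonomous Content Entity': content, metadata and simple rules."""
    content_id: str
    # Content layer: renditions of the same item adapted to different devices.
    renditions: Dict[str, str] = field(default_factory=dict)
    # Metadata layer: annotations for the image and for regions within it.
    annotations: Set[str] = field(default_factory=set)
    # Intelligence layer: here reduced to a per-device access rule.
    allowed_devices: Set[str] = field(default_factory=lambda: {"pc", "tv", "phone"})

    def matches(self, query_terms: Set[str]) -> bool:
        """True if any query term appears among the annotations."""
        return bool({a.lower() for a in self.annotations} & {t.lower() for t in query_terms})

    def render_for(self, device: str) -> str:
        """Return the rendition for a device, if the access rule permits it."""
        if device not in self.allowed_devices:
            raise PermissionError(f"{self.content_id} is not shared with {device} devices")
        # Fall back to the PC rendition if no device-specific one exists.
        return self.renditions.get(device, self.renditions.get("pc", self.content_id))


def search(library: List[ACE], query: str) -> List[ACE]:
    """Naive keyword search over the metadata layer of a content library."""
    terms = set(query.lower().split())
    return [ace for ace in library if ace.matches(terms)]


if __name__ == "__main__":
    photo = ACE(
        content_id="spain_2005_042",
        renditions={"pc": "spain_2005_042_full.jpg", "phone": "spain_2005_042_small.jpg"},
        annotations={"beach", "sea", "sky", "bathers", "spain"},
        allowed_devices={"pc", "tv"},
    )
    hits = search([photo], "crowded beach in Spain")
    print([ace.content_id for ace in hits])  # -> ['spain_2005_042']
    print(hits[0].render_for("tv"))          # falls back to the PC rendition

In this toy version, a query such as "crowded beach in Spain" finds the photo through its annotations, and the access rule decides which devices may display it; the real system adds semantic analysis, user profiles and content adaptation on top of this basic pattern.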
For content providers, such a system could be a boon to business, particularly when it comes to offering content for sale over the internet. “You can’t make money out of content if people can’t find it, and the cost of annotating content manually is enormous,” notes Hobson. “Automatic annotation will help people find what they want, and they would consume more of it.”

Contact:
Paola Hobson
Motorola Labs
United Kingdom
Tel: +44 1256 484643
Email: Paola.HobsonATmotorola.com
Website: http://www.acemedia.org/aceMedia

Source: Based on information from aceMedia

Countries

United Kingdom