WEBKIT
Project ID: IST-2001-34171
Intuitive Physical Interfaces to the WWW
Total cost: EUR 3 437 049
EU contribution: EUR 2 058 965
Coordinated in: United Kingdom
Funding scheme: CSC - Cost-sharing contracts
As an alternative to GUIs, we propose a Tangible User Interface (TUI) for navigating the WWW that places priority on direct manipulation - users control the system and navigate through information by selecting and positioning physical objects, not just representations on a PC screen. This is genuinely multimodal, combining a physical element with digital representations (graphics, audio) that provide a dynamic element - showing how manipulations change the underlying data. Searching and retrieving Internet information is a topic that previous TUI work has not addressed. WebKit combines an examination of novel multimodal interfaces with innovative work in semantic web technology and information management. We will develop an innovative end-to-end system, expected to yield new IPR (e.g. physical query language paradigms) and commercial exploitation opportunities. The system will be extensively tested in schools and homes by children aged 5 to 13.
We will develop a series of Tangible User Interface (TUI) applications for information search and retrieval, validate and refine them, and identify those with the greatest exploitation potential. We will develop an end-to-end platform, specifically optimised for WWW applications, which will permit rapid development of new applications. We will trial this system extensively with pupils and teachers in at least two schools, already identified, together with a number of homes. Trials will be a significant part of the project, lasting at least 12 months. We will evaluate the results and use them to further refine the applications and the underlying platform.
An object-based interface will be created using RFID (radio frequency identification) tags, communicating with a remote server to retrieve information stored centrally or on the WWW. Users will trigger searches by selecting and positioning tagged objects on a reader device. The semantics of object-based searching, and the interrelationship of objects in creating complex queries, will be a major part of the work. Cognitive psychology, industrial design and systems engineering will be combined to create an interface that is ergonomic, robust and fun to use. A suite of backend technologies for content-based information searching and retrieval will also be integrated, to interpret the commands and queries constructed through the user interface. This will include various personalisation techniques to adapt the interface to individual needs. A focus will be the creation of a query processor, which can mediate between this multimodal interface, where meaning is expressed through objects and physical positions, and a neutral structured query, which can be interpreted by search engines and used to retrieve online information. Further work will address personalisation of the interface, through both explicit and implicit profiling and the use of agent technologies, so that the system can automatically respond to the needs of individual users. In addition, we will investigate how information retrieved from disparate sources can be ranked and presented to the user in a meaningful and relevant way.
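To illustrate the mediation the query processor performs, the sketch below maps RFID-tagged objects and the reader slots they occupy into a neutral structured query. All names here (the `Tag` class, the slot roles, `build_query`) are illustrative assumptions for this summary, not the project's actual design.

```python
# Hypothetical sketch: translating physical placements into a structured
# query. Slot names, roles and tag concepts are assumed for illustration.
from dataclasses import dataclass

@dataclass
class Tag:
    tag_id: str   # RFID tag identifier read from the object
    concept: str  # concept the physical object stands for

# Assumed reader layout: each named slot contributes a query role.
SLOT_ROLES = {
    "centre": "topic",   # main subject of the search
    "left": "filter",    # term that narrows the topic
    "right": "media",    # preferred media type (images, audio, ...)
}

def build_query(placements):
    """Turn a {slot: Tag} mapping into a neutral structured query."""
    query = {"topic": None, "filters": [], "media": []}
    for slot, tag in placements.items():
        role = SLOT_ROLES.get(slot)
        if role == "topic":
            query["topic"] = tag.concept
        elif role == "filter":
            query["filters"].append(tag.concept)
        elif role == "media":
            query["media"].append(tag.concept)
    return query

placements = {
    "centre": Tag("04A1", "dinosaurs"),
    "left": Tag("09F3", "jurassic"),
    "right": Tag("0C77", "images"),
}
print(build_query(placements))
# {'topic': 'dinosaurs', 'filters': ['jurassic'], 'media': ['images']}
```

The point of such a neutral intermediate form is that any search engine adapter can consume it, keeping the physical interface independent of any particular backend.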
We will deliver a tightly integrated TUI and search/retrieval backend working in tandem. A key milestone will be the delivery of an early prototype which can be deployed and trialled from Month 6, and which will be improved and extended during the project. Further milestones include the definition of a physical query language paradigm which can be widely disseminated and used as the basis for interoperable TUI client-server systems. Finally, we will define a range of product/service concepts that can be taken into exploitation, potentially by toy and/or multimedia manufacturers.