SmartSketches is a user-centred approach to introducing computer-based tools in the initial stages of product design and development. CAD systems now offer enough functionality to manufacture very complicated models, yet computers remain barely usable at the early stages of product design, where pencil and paper still reign. This project aims to advance the state of the art in User Interface Design and Multimodal interfaces for extended and ubiquitous design environments. We will provide ready-to-use encapsulations of interaction techniques that can be adapted to other systems through a well-defined API, released as open source.
Alternative input modalities and emerging interface technologies form the basis for a new generation of sketching applications: Calligraphic Interfaces. These remove artificial dialogue constraints, allowing designers to work with the computer much as they would with more traditional media, capturing rough shapes and ideas. We look to improve the usability of product design systems in several important directions, through a User-Centred Design Approach, from Personal Design Assistants (PDAs) to Virtual and Augmented Environments. One direction will focus on handheld PDAs for mobile input, combining pen and speech with retrieval of remote product and geometric data using sketches. Another will use sketches to create technical designs through novel input techniques in 2D and 3D, exploring multiple modalities. To create geometric information, we combine the flexibility of sketches with constraints and pattern-recognition techniques to achieve a powerful control mechanism.
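To illustrate the kind of pattern recognition this control mechanism involves, the sketch below classifies a pen stroke from simple geometric features. It is a minimal, hypothetical heuristic (names and thresholds are our own, not the project's classifier): a stroke whose endpoint span nearly equals its path length is treated as a line, while one whose endpoints nearly coincide is treated as a closed shape.

```python
import math

def classify_stroke(points):
    """Classify a pen stroke as 'line', 'closed shape', or 'open curve'
    from simple geometric features (illustrative heuristic only)."""
    # Total path length: sum of distances between consecutive samples.
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    # Straight-line distance between the stroke's endpoints.
    span = math.dist(points[0], points[-1])
    if span > 0.8 * path:   # stroke barely deviates from a straight segment
        return "line"
    if span < 0.2 * path:   # endpoints nearly coincide: a closed loop
        return "closed shape"
    return "open curve"

# A slightly wobbly horizontal stroke is still recognised as a line.
print(classify_stroke([(0, 0), (1, 0.05), (2, -0.02), (3, 0)]))  # line
```

In a real calligraphic interface such geometric cues would feed into a constraint solver (e.g. snapping near-straight strokes to exact lines), which is the combination of sketch flexibility and constraints described above.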
Current interfaces require designers to leap large conceptual gaps from their mental model to geometric shapes, dissociating the early design phase from detailed product definition. The result is longer product cycles and an inflexible separation between design and engineering, which raises manufacturing costs and time-to-market. Creating complex models and drawings with current systems is a painful task: they are hard to learn and the final product is difficult to visualise, which explains users' preference for pencil and paper. Pervasive interfaces will enable a seamless transition from interacting with PDAs to manipulating the same information on an interactive design desk or in an immersive environment, creating what we call Extended Design Environments. Creative designers often re-use data from previous projects, publications and libraries; we will develop new techniques that take advantage of their natural ability at sketching and drawing to retrieve data from project and component libraries. The research to carry out combines novel interface modalities, such as pen input, sketches, gestures and speech, with mobile and virtual environments to replace the cumbersome and unnatural input methods typical of present-day interfaces.
Briefly, we intend to work on:
1. Cognitive Task Analysis and Specification of Multimodal Interfaces;
2. Visual languages to parse sketch and Multimodal input;
3. Walk-up interfaces for virtual and augmented environments;
4. Efficient search algorithms to match sketches against multimedia databases;
5. Intelligent user interfaces;
6. Interactively building 3D models from sketches;
7. Usability studies and empirical evaluation of Multimodal interfaces.
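Item 4 above calls for matching sketches against multimedia databases. One common approach is to reduce each stroke to a fixed-length shape descriptor and rank database entries by descriptor distance; the grid-occupancy descriptor below is an illustrative stand-in (function names and the 4x4 grid are our assumptions, not the project's algorithm).

```python
import math

def occupancy_descriptor(points, grid=4):
    """Normalise a stroke into the unit square and count samples per grid
    cell, giving a crude translation/scale-invariant shape signature."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = (max(xs) - min(xs)) or 1.0   # guard against degenerate strokes
    h = (max(ys) - min(ys)) or 1.0
    hist = [0] * (grid * grid)
    for x, y in points:
        cx = min(int((x - min(xs)) / w * grid), grid - 1)
        cy = min(int((y - min(ys)) / h * grid), grid - 1)
        hist[cy * grid + cx] += 1
    return [c / len(points) for c in hist]

def rank_matches(query, library):
    """Rank named library sketches by Euclidean distance to the query."""
    qd = occupancy_descriptor(query)
    scored = sorted((math.dist(qd, occupancy_descriptor(pts)), name)
                    for name, pts in library.items())
    return [name for _, name in scored]
```

For example, a roughly diagonal query stroke ranks a diagonal library sketch above a horizontal one. Real retrieval engines would use richer descriptors and indexing to stay efficient at database scale, which is exactly what item 4 targets.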
All dates in months (M) after start of project:
- User Requirements and Task Analysis @6M
- Workshop on Calligraphic Interfaces @6M
- API Specification @6M
- First Prototype: Sketch Editor for Modelling @12M
- Field and User Tests, Tech Report @20M
- 3D Sketching in Immersive Environments @24M
- Constraint-based Scene Modeller @24M
- Sketch-based Retrieval of MCAD Drawings @24M
- Integration and Final Prototype @27M
- API Available @27M
- User Evaluation Tests @29M
- Wrap-up Workshop @30M
Funding Scheme: CSC - Cost-sharing contracts