In recent breakthrough work at the boundary of robotics and computer vision, I demonstrated that the core 'Simultaneous Localisation and Mapping' (SLAM) approach of probabilistic map-building for mobile robots can be applied to real-time 3D motion estimation from the image stream of a single agile camera. I now propose to take this line of research to its logical conclusion by investigating a paradigm of 'Instant SLAM'. Can monocular visual SLAM be pushed far enough to estimate in real time the motion of cameras that may be moving and accelerating very rapidly, turned on at arbitrary times and pointed at arbitrary scenes? Is it possible to develop a real-time vision algorithm, running on the standard processors of today or the near future, which can accurately track the motion of a camera attached to a flying or bouncing robot, carried by a running person, or even simply thrown across a room? The ability to track camera motion with so few restrictions on dynamics and prior knowledge would turn simple webcam-type or embedded camera modules into truly flexible, low-cost, go-anywhere position sensors, with any number of applications in robotics, wearable computing and beyond. Achieving this goal requires both theoretical and practical research on extracting, with the greatest possible efficiency, the motion information available in an image sequence, with emphasis on high frame-rate capture, information-theoretic analysis, optimised probabilistic filtering and the relaxation of prior assumptions.
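The probabilistic filtering mentioned above can be illustrated with a deliberately simplified sketch. The snippet below is not the project's implementation: it reduces the camera state to position and velocity along a single axis and runs a standard Kalman filter with a constant-velocity motion model (the motion prior commonly used in EKF-based monocular SLAM), fusing one position measurement per frame at a 30 Hz frame rate. All dimensions, noise values and the 30 Hz rate are illustrative assumptions.

```python
import numpy as np

# Illustrative constant-velocity Kalman filter for camera motion estimation.
# State x = [position, velocity] along one axis; a real visual-SLAM filter
# tracks a full 3D pose plus map features.
dt = 1.0 / 30.0                      # assumed frame interval (30 Hz capture)
F = np.array([[1.0, dt],
              [0.0, 1.0]])           # constant-velocity state transition
Q = np.diag([1e-4, 1e-2])            # process noise: unmodelled acceleration
H = np.array([[1.0, 0.0]])           # we observe position only
R = np.array([[1e-3]])               # measurement noise covariance

def predict(x, P):
    """Propagate the state estimate and its covariance one frame ahead."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Fuse one position measurement z into the estimate."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

x = np.array([0.0, 0.0])             # start at rest, position zero
P = np.eye(2)                        # broad initial uncertainty
for k in range(1, 31):               # one second of frames; true velocity = 1.0
    x, P = predict(x, P)
    x, P = update(x, P, np.array([k * dt]))
```

After a second of frames the filter has inferred the unobserved velocity from the sequence of position measurements alone, which is the essential mechanism by which sequential probabilistic filtering recovers motion from an image stream.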
Field of science
- /engineering and technology/electrical engineering, electronic engineering, information engineering/electronic engineering/sensors/optical sensors
- /engineering and technology/electrical engineering, electronic engineering, information engineering/electronic engineering/robotics
- /natural sciences/computer and information sciences/artificial intelligence/computer vision