Community Research and Development Information Service - CORDIS


PELOTE Report Summary

Project ID: IST-2001-38873
Funded under: FP5-IST
Country: Finland

Personal navigation system for indoor environments - a testbed

In a rescue situation it is essential to know the position of all human entities. Above all this is a safety issue, but the modelling and visualization of human perception data also depend on accurate position information. Knowing one's position allows efficient coordination of the teams and makes sharing of spatial information possible. Sharing spatial information is seen as a key factor in establishing a common presence model between humans and robots.

Unlike robots, humans only rarely know their exact position. Human beings rely mostly on vision for localization, which is based on recognizing an object and estimating the distance to it. If visibility is reduced by darkness, smoke, etc., localization accuracy suffers. Localization is also always relative in nature: only when a person can identify a known landmark, such as a corridor crossing or a stairway, can he or she know an accurate position relative to that object. The transfer of position information to another person or to a computer system can therefore be difficult. A Personal Navigation System was developed to solve these problems. In this system, localization is based on dead reckoning and laser scan matching. These methods are widely used in mobile robotics, but less common in human localization.

Short Description of PeNa System:
The Personal Navigation system (PeNa) is a system for localizing a human indoors. In a rescue situation, no localization infrastructure can be assumed, so the system is designed to be a stand-alone localization system. Localization is based on dead reckoning and map-based localization. Dead reckoning combines step measurements (pedometer), magnetic sensors (compass) and inertial measurements (gyro and accelerometers). A laser range finder is used for mapping and position refinement, which is done by laser odometry and map matching.
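The core of the dead-reckoning step described above can be sketched as a simple 2-D pose update: each detected step advances the position by the measured stride length along the compass heading. This is a minimal illustrative sketch, not the PeNa implementation; all names and the coordinate convention are assumptions.

```python
import math

def dead_reckon(pose, step_length, heading_rad):
    """Advance a 2-D pose (x, y) by one detected step of the given
    length along the measured heading (radians, 0 = east, CCW positive)."""
    x, y = pose
    return (x + step_length * math.cos(heading_rad),
            y + step_length * math.sin(heading_rad))

# Walk four 0.7 m steps heading north (pi/2 rad) from the origin.
pose = (0.0, 0.0)
for _ in range(4):
    pose = dead_reckon(pose, 0.7, math.pi / 2)
# pose is now approximately (0.0, 2.8)
```

In a real system the heading would be fused from compass and gyro readings, and the accumulated drift would be corrected by the laser-based map matching mentioned above.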

The PeNa hardware includes batteries, power converters, a Stride Length Measurement Unit (SiLMU), a fibre-optic gyro, a compass, a SICK laser scanner and a laptop. All the hardware is mounted in a backpack. The laptop is mounted at the front so that it can also serve as a display for the user interface.

The existing PeNa hardware is a demonstration prototype that is able to localize a human indoors with bounded error. The fact that PeNa does this without any predefined infrastructure makes it unique in relation to previous work. The system is used to incorporate a human into a telematic system of hybrid entities; in this sense, too, PeNa is unique. It allows an operator to control the human as if he or she were a teleoperated robot.

Future Work and application potential:
The personal navigation system has application potential as it is. Localizing humans allows building applications such as PeLoTe. It also enables location-based services, currently a hot topic. Furthermore, the ever-growing intelligence of living environments has a clear need for a new type of interfacing. As seen in PeLoTe, interfacing through location and conceptualisation is efficient. Sharing of information, robot programming and information visualization become easy, and the conceptualisation allows the use of natural language in commanding entities.

The environment conceptualisation is done in two parts:
- a part in which PeNa is used for automatic (or semi-automatic) mapping of the environment, and
- a part in which the located objects in the environment are conceptualised using human cognition.

The (semi-)automatic mapping is done through sensor fusion. PeNa will be upgraded with PMD or similar 3D perception sensors to extract objects in the environment. A human then conceptualises these objects, giving each a name, properties and functions (which can be anything).
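A conceptualised object as described above, an extracted object annotated by a human with a name, properties and functions, could be represented by a record like the following. This is an illustrative sketch only; the field names and example values are assumptions, not the PeLoTe data model.

```python
from dataclasses import dataclass, field

@dataclass
class ConceptualisedObject:
    """An environment object extracted by mapping and annotated by a
    human operator (illustrative schema, not the PeLoTe format)."""
    name: str                                    # human-given label
    position: tuple                              # (x, y) in map coordinates
    properties: dict = field(default_factory=dict)
    functions: list = field(default_factory=list)

# A hypothetical annotated object.
door = ConceptualisedObject(
    name="fire door",
    position=(12.4, 3.1),
    properties={"material": "steel", "state": "closed"},
    functions=["open", "close"],
)
```

Such records, accumulated over a mapped environment, form the virtual description of the real environment discussed below.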

The outcome is a virtual description of the real environment containing a large number of objects. This virtual environment can be used as an interface to service robots, home automation systems, surveillance equipment control, etc. In practice, this kind of world allows the control of any device, static or dynamic, that is part of the system, and the variety of applications is unlimited.
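Using such a virtual world as a device interface can be sketched as a registry that maps object names to their commandable functions, so that an entity can be addressed by its human-given name. This is a minimal sketch under assumed names; it is not part of the described system.

```python
# Minimal sketch of a virtual world used as a device interface
# (all names here are illustrative assumptions).
class VirtualWorld:
    def __init__(self):
        self._objects = {}                 # name -> {action: callable}

    def register(self, name, commands):
        """Add a named device with its table of supported actions."""
        self._objects[name] = commands

    def command(self, name, action):
        """Dispatch a command to a device addressed by its given name."""
        return self._objects[name][action]()

world = VirtualWorld()
world.register("hall lamp", {"on": lambda: "hall lamp: on",
                             "off": lambda: "hall lamp: off"})
result = world.command("hall lamp", "on")   # -> "hall lamp: on"
```

Addressing devices by conceptualised name is what makes the natural-language commanding mentioned above straightforward: a parsed utterance reduces to a (name, action) lookup.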

Related information


Aarne HALME, (Head of Automation Technology Laboratory)
Tel.: +358-9-4513300
Fax: +358-9-4513308
E-mail address