
Intelligent Sensor Enabled Eyewear


Smart eyewear brings both style and substance

Intelligent eyewear could benefit many people, whether their eyesight is poor or not. The I-SEE smart eyewear detects posture, screen-time use and even UV exposure – while keeping style at the forefront.

Digital Economy

An EU project has developed cutting-edge eyewear for the ‘internet of things’ era that could improve both the health and well-being of wearers, whether they have poor eyesight or not. The glasses track viewing and living experiences and can connect to other smart devices such as smartphones, in-car telematics and those used by opticians. The Horizon 2020-funded I-SEE project leveraged the expertise, knowledge and market access of Luxottica, a world-renowned eyewear specialist and leader in the industry, to create glasses that build style as well as functionality into the product. The value proposition of smart eyewear already on the market is significantly unbalanced in favour of function and technology, with a negative impact on aesthetics. The approach used in the I-SEE project was completely different, shifting the focus from function to form. First, the design of the eyewear was driven by maintaining the aesthetic standards of the fashion industry, and only then by the provision of well-being functionalities that exploit the ‘privileged’ position of the eyewear on the wearer’s head. “The market is still under development, and we believe I-SEE has the potential to contribute to shaping it,” says Fabio Borsoi, quality manager in the I-SEE project and Global R&D Director, Technology at Luxottica.

Life and work through new glasses

There were several product releases, to accommodate different stakeholder requirements. A ‘data science’ version of the glasses streams raw sensor data to a connected app. Developers and researchers can leverage data from several sources – light, UV, temperature and pressure sensors, an accelerometer and a gyroscope – all integrated within the eyewear, to perform scientific analysis or to propose potential use cases for further research and development. Another, more consumer-focused version runs algorithms directly within the glasses. These algorithms detect, analyse and monitor the wearer’s behaviour, including neck posture, screen-time exposure and other factors relating to overall well-being. These glasses are semi-autonomous and only need the connected smartphone app to collect data. Personal targets can be set to monitor and improve physiological activities such as posture.

Innovative cases

“A great part of the project was dedicated to developing ‘use cases’ involving activity monitoring, aimed at improving the well-being of the user,” says Borsoi. These guided the development of the electronics and software, as well as influencing the design of the user experience and the research behind the algorithms. For example, the team implemented and tested an algorithm for monitoring neck posture, with a notification system delivered via the eyewear LED and the connected app. The aim here is to reduce and prevent the muscular stress that accumulates in the neck as a result of poor posture. Another use of the glasses is to monitor time spent in front of a laptop, by combining ‘tailored’ light sensor technology with machine learning techniques. Users can set their own goals, and the eyewear will help them reach those targets by notifying them when the limit is approaching. In the same light-related vein, the team integrated a UV sensor into the eyewear, which enabled them to develop an algorithm that estimates the user’s skin exposure to sunlight. The glasses connect to a dedicated app, allowing the wearer to take advantage of these features.
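To make the posture use case more concrete, below is a minimal sketch of how head-down detection from an accelerometer might work. It is purely illustrative: the I-SEE algorithms are not public, and the sensor axis convention, the thresholds and the notify() hook are hypothetical assumptions rather than the project’s actual implementation.

```python
# Illustrative sketch only: axis convention, thresholds and notify() are
# hypothetical placeholders, not the I-SEE project's published algorithm.
import math

PITCH_LIMIT_DEG = 30.0   # assumed angle beyond which the head counts as "flexed"
HOLD_SECONDS = 120       # assumed time in poor posture before nudging the wearer

def pitch_from_accel(ax: float, ay: float, az: float) -> float:
    """Estimate head pitch (degrees) from one accelerometer sample,
    assuming the sensor's x-axis points forward along the temple."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def notify(message: str) -> None:
    # Placeholder for blinking the eyewear LED or pushing a message to the app.
    print(message)

def monitor(accel_samples):
    """accel_samples: iterable of (timestamp_seconds, (ax, ay, az)) tuples."""
    bad_since = None
    for timestamp, (ax, ay, az) in accel_samples:
        if abs(pitch_from_accel(ax, ay, az)) > PITCH_LIMIT_DEG:
            if bad_since is None:
                bad_since = timestamp
            elif timestamp - bad_since >= HOLD_SECONDS:
                notify("Neck has been flexed for a while - time to straighten up.")
                bad_since = None   # reset after the reminder
        else:
            bad_since = None       # posture recovered
```

In the consumer-focused version described above, this kind of detection and timing logic would run on the eyewear itself, with the smartphone app used mainly to collect the data and set personal targets.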
“The interaction between the eyewear and the smartphone allows the wearer to have completely new experiences and enables the product to be integrated into the smart ecosystem,” explains Borsoi.

Keywords

I-SEE, glasses, smart eyewear, machine learning, UV sensor, screen-time exposure, data science, smartphone
