
Seamless mixing of virtual & real-world objects in VR & AR

Periodic Reporting for period 1 - LIGHTFIELD (Seamless mixing of virtual & real-world objects in VR & AR)

Reporting period: 2020-04-01 to 2021-03-31

Virtual and augmented reality (VR/AR) is expected to be a major technological leap, opening doors to new applications and technologies that help people in everyday life with everything from cooking to neurosurgery.

However, today's VR/AR headsets have not yet reached mass adoption. One of the major reasons is the visual discomfort and eye strain they cause. To correctly perceive depth, the human eye needs to change focus between objects at different distances. This is impossible in today's devices, which are based on flat-screen technology. The mismatch between binocular depth and focus (the vergence-accommodation conflict) causes eye strain, pain, and nausea. It is a serious problem in VR, and a true showstopper in AR, where our eyes see real and virtual objects at the same time.
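
As a rough numerical illustration of this conflict (the distances below are illustrative assumptions, not measurements from any particular headset): the optics of a flat-screen headset fix the eye's focus at one distance, while stereo disparity drives eye vergence to the virtual object's distance, and the gap between the two, expressed in diopters, is what strains the visual system.

```python
# Illustrative only: vergence-accommodation mismatch for a fixed-focus headset.
# Example distances are hypothetical.

def diopters(distance_m: float) -> float:
    """Optical demand in diopters (1 / distance in metres)."""
    return 1.0 / distance_m

fixed_focus_m = 2.0      # fixed focal plane of a flat-screen headset (assumed value)
virtual_object_m = 0.4   # virtual object rendered at arm's length (assumed value)

accommodation_demand = diopters(fixed_focus_m)   # 0.5 D, set by the optics
vergence_demand = diopters(virtual_object_m)     # 2.5 D, set by stereo disparity

mismatch = vergence_demand - accommodation_demand
print(f"Vergence-accommodation mismatch: {mismatch:.2f} D")  # 2.00 D

# A light-field display provides correct focus cues, so accommodation matches
# vergence and this mismatch disappears.
```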

CREAL’s patented light-field technology generates truly 3D, hologram-like images: it fully replicates human perception of 3D scenes with optical depth. This solves the eye strain and pain caused by current headsets, while increasing user immersion and allowing virtual and real worlds to be mixed properly.
CREAL develops a unique 3D light-field display for VR/AR headsets. However, the display-control requirements for this new kind of display are very specific and differ from those of existing displays. Therefore, to achieve optimal performance and form factor, we need to develop a custom dedicated display driver on an application-specific integrated circuit (ASIC), which is the goal of the LIGHTFIELD project supported by the EIC Accelerator Pilot grant.
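
Purely as a hypothetical sketch of the kind of per-frame metadata such a communication protocol from the source computer might carry (the field layout, names, and magic value below are invented for illustration and are not CREAL's actual protocol):

```python
import struct

# Assumed 16-byte layout: magic, frame counter, number of depth sub-frames,
# sub-frame width/height in pixels, bit depth, flags (all hypothetical).
FRAME_HEADER = struct.Struct("<IIHHHBB")

def pack_frame_header(frame_id: int, subframes: int,
                      width: int, height: int,
                      bit_depth: int = 1, flags: int = 0) -> bytes:
    """Pack an illustrative frame header (little-endian, 16 bytes)."""
    return FRAME_HEADER.pack(0x4C464C44,  # "LFLD" magic value (made up)
                             frame_id, subframes, width, height,
                             bit_depth, flags)

header = pack_frame_header(frame_id=42, subframes=24, width=1024, height=1024)
print(len(header), header.hex())
```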

In the first 12 months, we completed the following technical tasks:
- We investigated image processing algorithms that allow lighter computation, smaller data bandwidth, and better light-field image quality, and that are also suitable for ASIC implementation (see the rough bandwidth sketch after this list). Additionally, we specified the overall system architecture, the communication protocol from the source computer, the control interfaces, and the other low-level subsystems of the device.
- We implemented the said image processing algorithms on a high-end field programmable gate array (FPGA) using a development board.
- We also implemented a dedicated low-level spatial light modulator (SLM) driver on the FPGA.
- We made video processing and data transfer more stable and better optimized (more robust, lower latency, etc.).
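
As a back-of-the-envelope illustration of why lighter computation and smaller data bandwidth matter for a time-multiplexed light-field display (all parameters below are assumed example values, not our display's specifications):

```python
# Rough bandwidth estimate with assumed example parameters (illustrative only).

width, height = 1024, 1024   # SLM resolution (assumed)
subframes = 50               # time-multiplexed sub-frames per displayed frame (assumed)
frame_rate = 60              # displayed frames per second (assumed)
bits_per_pixel = 1           # binary SLM sub-frames (assumed)

raw_bits_per_second = width * height * subframes * frame_rate * bits_per_pixel
print(f"Raw SLM data rate: {raw_bits_per_second / 1e9:.2f} Gbit/s")  # ~3.15 Gbit/s

# Generating or compressing sub-frames on the display driver itself (FPGA today,
# ASIC later) is what keeps the link from the source computer manageable.
```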

On the business development and communication side:
- we continued existing collaborations and established new ones with various R&D groups in industry and academia, although travel restrictions made demo meetings difficult,
- we continued refining our go-to-market strategy with our EIC coach,
- we continuously built awareness of the problem we solve, the vergence-accommodation conflict ("VAC") and visual conflicts in VR/AR, as well as the advantages of our solution, through materials and lectures.

We closed a new financing round of CHF 6.5 M in October 2020, which will allow us to bring our light-field display technology from the current hardware-development-kit stage to a complete technology package for next-generation smart glasses.
CREAL’s mission is to resolve the key problem blocking widespread adoption of virtual and augmented reality by developing near-eye light-field technology. The ultimate goal is to provide the key technology for the next generation of wearable VR/AR products and smart glasses, enabling a range of new everyday-life and professional applications.
AR light-field headset prototype
VR light-field headset prototype
Future AR glasses
VR light-field headset prototype using FPGA development board