Holographic eXtended Reality (HXR) for true Augmented Reality (AR) and lifelike immersive displays

Periodic Reporting for period 1 - HXR (Holographic eXtended Reality (HXR) for true Augmented Reality (AR) and lifelike immersive displays)

Reporting period: 2024-03-01 to 2025-03-31

The past decades have been dominated by innovations in mobile computing, but the sector is now shifting rapidly towards spatial computing, where users interact more intuitively with virtual content by displaying it directly on top of their view of the real world, allowing physical elements to be augmented with a wide range of information. With spatial computing, users no longer need to take their mobile phones out of their pockets and are no longer limited to a small screen; instead, the entire physical environment becomes a canvas on which digital information can be displayed.

The road to spatial computing is not straightforward and has a long history. Several major technology companies have attempted to bring a spatial computing device to market in the form of a VR headset or AR glasses: Google introduced Google Glass in 2013, Microsoft built the HoloLens, Apple launched the Apple Vision Pro, and Meta unveiled a lab concept of AR glasses codenamed Orion in September 2024, with an estimated production cost of between $10,000 and $20,000 per unit. None of these devices has become a mainstream commercial success. The reason is that these technology companies were limited to standard 2D display technologies, such as OLED and microLED displays, which suffer from shortcomings for VR headsets and AR glasses such as the vergence-accommodation conflict and a fixed focal depth. Workarounds to minimize the vergence-accommodation conflict and/or fixed focal depth result in bulky, heavy, and very expensive systems, in some cases completely isolating the user from the outside world.

Swave is developing a revolutionary spatial light modulator (SLM) display technology that overcomes these limitations of standard 2D displays and will enable fashionable, lightweight AR glasses at a highly competitive price. Although spatial light modulator displays are not new – LCoS-based and micro-mirror-based SLMs have been developed before – Swave's SLM is the only SLM display that can scale the pixel pitch to well below the wavelength of visible light, which ranges from about 380 nm to 750 nm, allowing digital holography with an extremely wide field of view (FoV) for the first time. Other SLM display technologies have a pixel pitch that cannot be scaled below 1 µm, making the FoV too narrow to be practical for AR glasses. Swave will bring its SLM display technology to TRL 8 under the EIC project.
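To illustrate why pixel pitch governs the achievable field of view, the short Python sketch below applies the standard grating-equation estimate sin(θ_max) = λ / (2·p); the pixel-pitch values are illustrative only and are not Swave's actual chip specifications.

    import numpy as np

    def diffraction_fov_deg(pitch_nm: float, wavelength_nm: float = 532.0) -> float:
        """Full diffraction-limited field of view (degrees) of a phase SLM,
        estimated from the grating equation sin(theta_max) = wavelength / (2 * pitch)."""
        ratio = wavelength_nm / (2.0 * pitch_nm)
        if ratio >= 1.0:
            return 180.0  # subwavelength sampling: the full hemisphere is addressable
        return 2.0 * float(np.degrees(np.arcsin(ratio)))

    # Illustrative pitches: a conventional LCoS-class pitch, a 1 um pitch, and a
    # hypothetical subwavelength pitch (none of these are Swave specifications).
    for pitch_nm in (3000.0, 1000.0, 300.0):
        print(f"pitch {pitch_nm / 1000:.1f} um -> FoV ~ {diffraction_fov_deg(pitch_nm):.0f} deg")

With a 3 µm pitch this estimate gives a FoV of roughly 10°, whereas a 300 nm pitch gives well over 100°, which is why a subwavelength pitch is the enabling step for practical holographic AR glasses.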
The main activity performed by Swave under the EIC project is to bring Swave's SLM chip to TRL 8. The activity starts with a complete electrical and optical characterization of Swave's fully programmable prototype SLM chip. The learnings gained during this characterization will be used in the design of the follow-on SLM chip being developed under the EIC project.

Swave wants to offer its customers a complete solution. Therefore, Swave also works on real-time computation of hologram patterns for rendering on the SLM and on optical subsystems tailored to AR glasses.

The activity focused on real-time computation of hologram patterns started with identifying a Computer-Generated Holography (CGH) method with the potential to achieve real-time computation of hologram patterns. For a first generation of AR glasses, Swave opts to display a single 2D image in space, with the focus depth of the image programmable frame by frame. The identification of the CGH method was followed by the development and optimization of the real-time CGH algorithm, and subsequently by the implementation of the CGH algorithm on FPGA. Swave achieved a first complete FPGA implementation of its real-time CGH algorithm in January 2025. The next step under the EIC project is to further optimize the FPGA implementation to increase the frame rate.
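The report does not disclose which CGH method Swave selected or how its FPGA pipeline is structured. Purely as an illustration of the kind of computation involved in rendering a single 2D image at a programmable depth, the sketch below builds a phase-only hologram by assigning a random phase to the target image and numerically backpropagating it to the SLM plane with the angular spectrum method; all dimensions and parameter values are hypothetical.

    import numpy as np

    def angular_spectrum_propagate(field, wavelength, pitch, distance):
        """Propagate a sampled complex field over 'distance' (metres) using the
        angular spectrum method; evanescent components are discarded."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=pitch)
        fy = np.fft.fftfreq(ny, d=pitch)
        FX, FY = np.meshgrid(fx, fy)
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        propagating = arg > 0.0
        kz = (2.0 * np.pi / wavelength) * np.sqrt(np.where(propagating, arg, 0.0))
        transfer = np.where(propagating, np.exp(1j * kz * distance), 0.0)
        return np.fft.ifft2(np.fft.fft2(field) * transfer)

    def phase_hologram(target_image, wavelength, pitch, depth):
        """Phase-only hologram of a single 2D image floating 'depth' metres in front
        of the SLM: give the target a random phase, backpropagate it to the SLM
        plane, and keep only the phase (sampling/aliasing issues are ignored here)."""
        rng = np.random.default_rng(0)
        target = np.sqrt(target_image) * np.exp(2j * np.pi * rng.random(target_image.shape))
        slm_field = angular_spectrum_propagate(target, wavelength, pitch, -depth)
        return np.angle(slm_field)

    # Toy usage: a 512 x 512 test square rendered 5 mm in front of a hypothetical
    # 300 nm pitch SLM, illuminated at 532 nm.
    img = np.zeros((512, 512))
    img[200:312, 200:312] = 1.0
    phi = phase_hologram(img, wavelength=532e-9, pitch=300e-9, depth=5e-3)

In practice, real-time CGH pipelines replace this single backpropagation with heavily optimized, often iterative or learned, algorithms mapped onto hardware; the sketch only conveys the basic structure of the problem.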

The activity focused on optical subsystems tailored to AR glasses includes the development of a miniaturized light engine and an optical combiner. The miniaturized light engine contains the light source that illuminates the SLM and is designed for integration into the frame of the AR glasses. The optical combiner is integrated into the lenses of the AR glasses and remains transparent to the real world while presenting the virtual content to the user.
Swave is pushing the boundaries of current technology in all the technology areas it is working on as part of the EIC project.

Swave is the first to develop an SLM technology that can scale the pixel pitch to well below the wavelength of visible light. Existing SLM technologies, such as LCoS-based and micro-mirror-based technologies, inherently cannot achieve a pixel pitch of 1 µm or less. The advantage of a subwavelength pixel pitch is a large FoV, which is a key requirement for using the SLM as a holographic display.

Also with regard to CGH algorithms, Swave goes beyond the current state of the art within the framework of the EIC project, as it is the first to develop a full FPGA implementation of a CGH algorithm capable of calculating hologram patterns for a subwavelength SLM pixel pitch in real time.
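To give a feel for why real-time CGH at a subwavelength pitch calls for dedicated hardware, the back-of-the-envelope estimate below counts pixels and output data rate for a hypothetical panel size, pitch, and frame rate (none of these figures are Swave specifications).

    # Hypothetical figures only: a 5 mm x 5 mm SLM with a 300 nm pixel pitch
    # holds (5e-3 / 300e-9)^2, i.e. roughly 278 million pixels. At 60 frames per
    # second and just 1 byte per pixel, the hologram data rate alone is ~16.7 GB/s,
    # which is one reason a hardware (FPGA) implementation of the CGH pipeline is attractive.
    panel_m, pitch_m, fps, bytes_per_px = 5e-3, 300e-9, 60, 1
    pixels = (panel_m / pitch_m) ** 2
    print(f"{pixels / 1e6:.0f} Mpixels, {pixels * fps * bytes_per_px / 1e9:.1f} GB/s")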

Last but not least, Swave also far exceeds the state of the art in terms of optical subsystems. No other company has ever been challenged to develop a light engine or optical combiner for an extremely wide FoV, because there was simply no SLM capable of generating such a wide FoV. A wide FoV can quickly lead to large optical components in the light engine, making it difficult to miniaturize the light engine for integration into AR glasses. Smart innovations must therefore be developed to capture a sufficient FoV while keeping the light engine as compact as possible. As for the optical combiner, all current AR glasses use waveguides as optical combiners. However, this is an extremely cumbersome solution: it reduces optical efficiency by a factor of 1000, is incompatible with prescription glasses, and is extremely expensive. Swave goes beyond the current state of the art with a much more elegant optical combiner that eliminates the need for waveguides, is compatible with prescription glasses, and is low-cost.

Swave has an active patent filing strategy to protect those of its inventions that could easily be reverse-engineered, and it has a roadmap with risk mitigation measures to reach TRL 9 for all of its building blocks.
Swave Team