VividQ and iView Partnership to Realize Real Holographic HUD and More

Darran Milne (Source: VividQ)

Holographic software provider VividQ and optical module manufacturer iView have expanded their relationship into a new partnership that goes beyond their current efforts in automotive Head-up Displays (HUDs) (see the full press release here). The new partnership will look to bring this holographic optical engine, coupled with new waveguide technology, to commercial, gaming and medical AR applications.

According to VividQ CEO Darran Milne, the company has been working with iView since 2018 on both automotive HUD designs and wearable AR designs. The HUD application is the most mature: the two are working with a major Chinese automotive Tier One supplier that is looking to replace its current DLP-based HUD with one based on holographic imaging. This should result in a full product demonstration by the end of 2021.

“We are very excited about this first-generation engine as is the Chinese Tier One supplier,” commented Milne. “If all goes well, the first production run could be for 200K to 300K units.”

This first-generation HUD design will also offer a rather novel approach to the presentation of the virtual images. Milne says that the left and right sides of the image will contain conventional instrumentation like speed and informational icons. These will be set at about a 2m virtual depth. In the middle will be a “tilted plane” whose depth can vary from around 7m to infinity.

This middle section will show arrows, guides and more to assist the driver in getting to their destination and provide additional safety features. This is one of the strengths of the Tier One HUD provider, which hopes to add real-world-locked AR elements in a future design. A second-generation design will also increase the number of planes in this middle section.

The Field of View is 10×4 degrees with a 150 x 50mm eyebox – fairly typical specifications for HUDs today. But the novel informational layout and rather limited FOV raise a concern about packing too much information into this FOV. “We have suggested that only about 20% of the pixels be used at any time to avoid the concern about information overload,” said Milne. “Our system provides an angular resolution of about 80 pixels per degree, which we think is just fine for the type of information that will be presented.”

VividQ Prototype Headset (Source: VividQ)

Overall, the HUD design is more compact at less than 10 liters, compared with conventional HUDs at 15-20 liters. The second-generation design could be less than 3 liters, a vast improvement.

The optical engine being developed by VividQ and iView features an LCOS imager from Himax with 1920×1080 resolution. But it is not driven in a conventional raster method. Instead, it is driven in a phase mode, with VividQ writing a computer-generated diffractive pattern, i.e. a hologram, to the imager.

When illuminated by red, green and blue lasers, the light interacts with this hologram and is diffracted only into the parts of the FOV where it is needed. That means that if only a single symbol were displayed, all the light would go to that symbol, allowing it to be very bright. In a raster-based solution, the light is always spread over the full FOV, so the brightness of that symbol would be much lower than in a diffractive system.
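For readers unfamiliar with how a phase-only hologram steers light, the widely used Gerchberg-Saxton algorithm gives a feel for the principle. The NumPy sketch below is a generic illustration, not VividQ's proprietary pipeline: it computes a phase pattern whose far-field reconstruction concentrates essentially all of the incident light onto one small target symbol, mirroring the brightness argument above.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=30, seed=0):
    """Compute a phase-only hologram whose far-field (Fourier-plane)
    reconstruction approximates `target_amplitude`."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        # Propagate the current hologram to the image plane.
        field = np.fft.fft2(np.exp(1j * phase))
        # Keep the reconstructed phase, impose the target amplitude.
        image_field = target_amplitude * np.exp(1j * np.angle(field))
        # Back-propagate and keep only the phase (phase-only SLM constraint).
        phase = np.angle(np.fft.ifft2(image_field))
    return phase

# Sparse target: a single bright "symbol" covering well under 1% of the frame.
H, W = 256, 256
target = np.zeros((H, W))
target[120:136, 120:136] = 1.0

hologram = gerchberg_saxton(target)

# Reconstruct and measure how much light lands on the symbol region.
recon = np.abs(np.fft.fft2(np.exp(1j * hologram))) ** 2
in_symbol = recon[120:136, 120:136].sum() / recon.sum()
print(f"fraction of light landing on the symbol: {in_symbol:.2f}")
```

In a raster display the same symbol would receive under 1% of the light budget (its share of the frame area); here the diffractive pattern routes most of the energy into it, which is why a sparse holographic HUD image can be so bright.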

Milne says their first-generation HUD will use RGB lasers, but a second-generation design should be able to move to LED sources. LEDs would also eliminate the need for speckle reduction of the laser sources.

One of the ways to reduce speckle is to add an optical element that moves the light or imager slightly to destroy the phase information. That’s a problem for a diffractive system, which relies on phase information to create the image. “With some clever software, we were able to use a moving element for speckle reduction and retain a clean image,” said Milne.
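The trade-off Milne describes rests on basic speckle statistics: a fully developed speckle pattern has a contrast (standard deviation over mean intensity) of about 1, and averaging N statistically independent patterns within the eye's integration time reduces that contrast roughly as 1/sqrt(N). The simulation below is a generic illustration of those statistics, not the companies' actual method:

```python
import numpy as np

def speckle_pattern(shape, rng):
    """Fully developed speckle: far field of a coherent wave
    with uniformly random phase at every point."""
    field = np.fft.fft2(np.exp(1j * rng.uniform(0, 2 * np.pi, shape)))
    return np.abs(field) ** 2

def speckle_contrast(intensity):
    """Speckle contrast C = sigma_I / <I>; C is ~1 for a single pattern."""
    return intensity.std() / intensity.mean()

rng = np.random.default_rng(1)
shape = (256, 256)

single = speckle_pattern(shape, rng)

# Averaging N independent patterns (e.g. via a moving diffuser or element)
# reduces contrast roughly as 1/sqrt(N).
N = 16
averaged = np.mean([speckle_pattern(shape, rng) for _ in range(N)], axis=0)

print(f"contrast, single frame:   {speckle_contrast(single):.2f}")    # ~1.0
print(f"contrast, {N}-frame average: {speckle_contrast(averaged):.2f}")  # ~0.25
```

The catch for holography is that each of those N patterns must still carry the intended image; scrambling phase indiscriminately would destroy the hologram itself, which is presumably what VividQ's "clever software" works around.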

Another concern with diffractive systems is the compute power needed to generate the holograms. “We have solved this problem too,” said Milne. “Hologram generation can now run on GPUs typically found in phones or in current automotive HUDs.”

As for non-HUD-based developments, Milne says they have developed a novel waveguide solution that is optimized for their diffractive optical engine. Under this new partnership, they will be working with iView to develop initial prototypes in the coming months.

To learn more about the applications of Computer-Generated Holography, download VividQ’s latest whitepaper, ‘Holography: The Future of Augmented Reality Wearables’ here.
