Realfiction to Show Innovative Directional Pixel Technology at CES 2024

Denmark-based display developer Realfiction will be showcasing four demonstrations at CES 2024. After an hour-long discussion with the company, I came away convinced these demos are quite innovative. Brief descriptions are provided below. If you want to learn more, click HERE to schedule a meeting with them at the ARIA.

Demo #1 – This is essentially the same demo as was shown at Display Week earlier this year. It is a proof-of-concept that showcases a new, very fast-switching ferroelectric liquid crystal (F-LCD) modulator that creates a multi-zone display, with each zone able to show a different image. It features a low-resolution microLED display with the F-LCD modulator in front, used to create a moving parallax-barrier pattern. The F-LCD modulator uses new materials to offer robust mechanical stability (a prior issue), along with clever drive schemes that avoid the need to blank the display during the reverse-voltage cycle required to maintain DC balancing. They call this a Hybrid Scan Display. In addition, this technology forms the basis for replacing the optically-based steerable 2D/3D displays used in the next two demos.

Demo #2 – This demo and the third are both based on a new way to create a directional backlight, and both are 17” multi-stereoscopic displays. The backlight uses conventional LEDs and drivers, but the arrangement and driving algorithm are different. On top sit a horizontal-only parallax lens array and an IPS LCD panel. The idea is to drive the LED backlight time-sequentially to steer light through the lens array and LCD. Twenty-eight backlight scanning columns are created and scanned in synchrony with the IPS panel using a specially designed overdrive algorithm. To maintain the full resolution of the panel and achieve a 60 Hz refresh rate, they employ eye tracking; using this data, they project left- and right-eye images into the zone where a viewer is sitting. While eye trackers that support up to 10 users are in development, the current demo uses an Intel RealSense eye tracker to deliver images to two different users.
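To make the scheme concrete, here is a minimal sketch of how a controller might map tracked eye positions to the 28 backlight scanning columns and schedule left/right images time-sequentially within one 60 Hz frame. All names, the viewing-zone width, and the mapping math are assumptions for illustration, not Realfiction's actual algorithm.

```python
# Hypothetical sketch of the time-sequential directional backlight described
# above: 28 scanning columns are lit in sync with the LCD scan, and eye-
# tracking data picks which columns carry the left- vs right-eye images.

NUM_COLUMNS = 28          # backlight scanning columns (from the demo description)
VIEWING_WIDTH_MM = 600.0  # assumed width of the viewing zone at nominal distance

def column_for_eye(eye_x_mm: float) -> int:
    """Map a tracked horizontal eye position to the nearest backlight column."""
    frac = min(max(eye_x_mm / VIEWING_WIDTH_MM, 0.0), 1.0)
    return min(int(frac * NUM_COLUMNS), NUM_COLUMNS - 1)

def frame_schedule(left_eye_x: float, right_eye_x: float):
    """Return (column, image) pairs to drive time-sequentially in one frame."""
    return [(column_for_eye(left_eye_x), "left"),
            (column_for_eye(right_eye_x), "right")]

print(frame_schedule(200.0, 265.0))
```

With two tracked users, the same schedule would simply contain four entries per frame instead of two, which is why the panel's fast scan-out matters.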

Demo #3 – This demo features the same hardware with a software modification to showcase an automotive application: a dual-view display. The demo drops the eye tracking, since the relative positions of the driver and passenger are well known; the number of zones is reduced to two, and the viewing cone is widened for easier access to the image. Two separate 2D images can then be shown to the driver and the passenger. Since the directional backlights in demos #2 and #3 are software-driven, they can also be used as conventional 2D displays.

Demo #4 – The fourth demo is based on a novel OLED backplane design for eye-tracked light field displays, supporting a very large number of subpixels per pixel. Perhaps the best way to explain this is by analogy. Consider a conventional video wall. Each module contains a medium-resolution display with wiring to every subpixel in the module, but the module-to-module connection has a much simpler interface, and the entire video wall can be driven over a single cable from a controller. This concept has been extended to an OLED backplane made from LTPS, LTPO or Oxide materials. As with the video wall, the backplane is divided into a series of zones, with connections to each subpixel within a zone. These zones are then interconnected with a separate network, which also forms the interface to the external display drivers. This zone-connection network is an active matrix that can be driven with standard TCON driver chips. An FHD display must add 33% more column-driver contacts on the edge of the display, but the zones can have a much higher density. In the demo, each pixel in a zone now has 600 subpixels instead of 3. Subpixel addressing is embedded in the video signal passed through the TCON. The tradeoff is that you cannot update all subpixels individually; instead, you must select which subpixels to update when you scan out an image. For example, one can scan out six different time-multiplexed images directed to six different tracked eyeballs at 360 fps, and each eyeball will experience a separate perspective image at a frame rate of 60 fps. The 12 x 6 mm proof-of-concept is built on silicon rather than glass. It contains two such segments with a total of 128 pixels, using 76.8 µm x 7 µm subpixels and only 40 data lines and 16 scan lines.


