The role of Advanced Driver Assistance Systems (ADAS) in automobiles is to prevent deaths and injuries by reducing the number of car accidents and lessening the severity of those that cannot be avoided. Advanced ADAS will also be needed to enable semi- and fully-automated automobiles. Today, ADAS functionality is visible in small ways, but in the future it may become more prominent and viewable on a variety of display platforms. Such platforms could include smartphones, AR glasses, head-up displays, console or entertainment displays, and even transparent window displays.
Many new cars offer ADAS functionality such as blind spot monitoring, automatic emergency braking, pedestrian detection, adaptive cruise control, traffic sign recognition, lane departure warnings, and lane keeping assistance. These features are enabled by a variety of sensors placed in the car, which can include ultrasound, LIDAR, RADAR, and multiple cameras. All of this sensor data must be processed in real time to generate warnings, images, and actions.
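To make the real-time processing idea concrete, here is a minimal sketch of how fused sensor readings might be mapped to driver warnings. The `SensorFrame` fields, thresholds, and warning names are all hypothetical simplifications, not any production ADAS interface:

```python
from dataclasses import dataclass

# Hypothetical, simplified sensor frame; real ADAS stacks fuse far richer data.
@dataclass
class SensorFrame:
    lane_offset_m: float      # lateral distance from lane center (camera)
    lead_distance_m: float    # distance to the vehicle ahead (RADAR)
    closing_speed_mps: float  # closing rate toward the lead vehicle

def generate_warnings(frame: SensorFrame) -> list[str]:
    """Map one frame of fused sensor data to driver warnings."""
    warnings = []
    if abs(frame.lane_offset_m) > 0.9:      # drifting out of the lane
        warnings.append("lane_departure")
    if frame.closing_speed_mps > 0:         # approaching the lead vehicle
        time_to_collision = frame.lead_distance_m / frame.closing_speed_mps
        if time_to_collision < 2.0:         # under an assumed 2-second threshold
            warnings.append("forward_collision")
    return warnings

# A frame that is both drifting and closing fast triggers both warnings.
print(generate_warnings(SensorFrame(1.1, 12.0, 8.0)))
# → ['lane_departure', 'forward_collision']
```

Each warning would then be routed to the appropriate output, such as a steering wheel vibration, a mirror icon, or HUD symbology.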
For example, drifting out of your lane may result in a steering wheel vibration or a visible warning in the driver instrument cluster or head-up display (HUD). Blind spot monitoring usually manifests as a visible warning in the driver’s side mirror.
As these ADAS systems become more sophisticated, they will be able to better monitor pedestrians, bicyclists, and other cars while providing improved navigation instructions, along with foul-weather and nighttime driving assistance. Many envision such information being presented in augmented reality HUDs that will offer wider fields of view and virtual symbology placed at greater distances than current HUD systems. This is currently a very active area of development.
But AR-HUDs are not the only platform that drivers and passengers may use to access advanced ADAS information. For example, some have suggested that AR glasses could also serve this role. For the driver, AR glasses would help solve the eyebox delivery challenge of built-in HUDs and would work in any car that could interface ADAS information with the glasses. They might also provide a wider field of view and cost less than a built-in HUD. But they will also raise safety concerns from regulators such as the NHTSA.
As more advanced ADAS systems arrive and automated driving becomes more common, visualizing what the ADAS system is detecting will become more important too. For example, knowing that the ADAS system has detected a person or dog in the road will be important information for the driver and passengers. As a result, drivers and passengers may want such information visualized on smartphones, center console displays, or rear-seat entertainment displays in addition to the driver’s HUD.
If such scenarios are to be realized, Tier 1 suppliers and automakers may need to rethink interfaces to the ADAS visualization symbology. That is, the symbology may need to be available on the HUD and other car displays in addition to portable devices like smartphones and AR glasses. Are developers thinking about providing such data in wired and wireless formats? I don’t know, so drop me a line if you have thoughts about this. Chris@insightmedia.info
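One way to make detection data display-agnostic is to publish it in a device-neutral wire format that any screen can render from. The sketch below uses JSON over a hypothetical `adas-symbology/0.1` schema; the schema name, field names, and message shape are illustrative assumptions, not an existing automotive standard:

```python
import json

# Hypothetical message format; real Tier 1 interfaces would be standardized.
def symbology_message(detections):
    """Package ADAS detections so any display (HUD, center console,
    smartphone, AR glasses) can render its own symbology from the same data."""
    return json.dumps({
        "schema": "adas-symbology/0.1",  # assumed version tag
        "detections": [
            {"kind": kind, "distance_m": dist, "bearing_deg": bearing}
            for kind, dist, bearing in detections
        ],
    })

# Example: a pedestrian ahead-left and a dog ahead-right.
msg = symbology_message([("pedestrian", 18.5, -4.0), ("dog", 7.2, 12.5)])
print(msg)
```

The same message could travel over a wired in-vehicle bus to built-in displays or over a wireless link to phones and glasses, with each device deciding how to draw the symbols.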