HMI and hot wheels

The most crucial technology for enabling autonomous driving is the human-machine interface (HMI), says Bryce Johnstone

While there is still a human driver behind the wheel of a car, they will be presented with increasing amounts of information. This must be timely, unambiguous and clear, so that the driver knows exactly what the car is about to do on their behalf, and what the car expects of the driver when control is handed back, such as when the vehicle is about to leave a geofenced area for a motorway.

Is it a car? Is it an office?

The car of the future will look very different by the time it becomes a fully autonomous, publicly available vehicle. It could morph into an office, a living room, a place to rest and a general entertainment centre for occupants on their journeys.
While the transformation from today’s layout to a more radical design with no steering wheel and with seats facing the interior of the vehicle will take time, we can already see the trends.

It is likely that cars will have multiple screens, greatly increasing the need for powerful and power-efficient graphics processing units (GPUs) that can drive a large number of pixels as well as supporting new features such as augmented reality (AR), gesture control and an advanced HMI.
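
To put those numbers in perspective, here is a back-of-envelope sketch in C++ of the fill demand a multi-screen cockpit places on a GPU. The display resolutions and the 60fps refresh rate are illustrative assumptions, not figures from any particular vehicle.

```cpp
// Back-of-envelope pixel throughput for a multi-screen cockpit.
// Display sizes and refresh rate are illustrative assumptions only.
#include <cstdio>

int main() {
    struct Display { const char* name; long w, h; };
    const Display displays[] = {
        {"digital cluster", 1920, 720},
        {"centre infotainment", 2880, 1080},
        {"passenger screen", 1920, 1080},
    };
    const long fps = 60;  // assumed refresh rate

    long total = 0;
    for (const auto& d : displays) {
        long rate = d.w * d.h * fps;  // pixels per second for this screen
        std::printf("%-22s %ld px/frame -> %ld Mpx/s\n",
                    d.name, d.w * d.h, rate / 1000000);
        total += rate;
    }
    std::printf("combined fill demand: ~%ld Mpx/s\n", total / 1000000);
    return 0;
}
```

Even these modest assumptions add up to roughly 400 megapixels per second before any AR or 3D rendering load is counted.
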
Among all the available interfaces, head-up displays (HUDs), which are standard today in aircraft, are increasingly found in cars to ensure that the driver is focused on the road.

In a HUD, information is projected on to either the windscreen or a dedicated screen as a distant virtual image, so that the driver sees the information in their view without having to refocus their eyes as they must when glancing down at a traditional dashboard.

HUDs will become much richer and more complex, with split-screen windscreens divided into driver and passenger portions, each visible only to the person in the appropriate seat. Technologies such as gaze tracking will be used to place relevant information at the centre of the driver’s line of vision.

At the core of this is GPU technology, which will not only push pixels to the dashboard screens but also render the imagery that the HUD unit projects from the dashboard.
Gaze direction can also be used to determine whether the driver is paying attention to the road; algorithms running on the GPU can warn the driver if they are not fully concentrating.
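
As a rough illustration of the principle, the sketch below implements a minimal attention check: if the gaze stays outside an assumed ‘eyes-on-road’ cone for longer than a grace period, a warning fires. The 20-degree cone, two-second grace period and sample rate are all invented for the example, not production values.

```cpp
// Minimal sketch of a driver-attention check, assuming an eye-tracking
// pipeline that delivers gaze angles (degrees off the road centre) at a
// fixed rate. All thresholds are illustrative assumptions.
#include <cmath>
#include <cstdio>

class AttentionMonitor {
public:
    // Call once per gaze sample; returns true if a warning should fire.
    bool update(double yawDeg, double pitchDeg, double dtSeconds) {
        const double offRoadAngle = 20.0;  // assumed "eyes off road" cone
        const double graceSeconds = 2.0;   // assumed tolerated glance time
        bool offRoad = std::hypot(yawDeg, pitchDeg) > offRoadAngle;
        offRoadTime_ = offRoad ? offRoadTime_ + dtSeconds : 0.0;
        return offRoadTime_ > graceSeconds;
    }
private:
    double offRoadTime_ = 0.0;  // continuous time spent looking away
};

int main() {
    AttentionMonitor monitor;
    // Simulate 3 seconds of the driver looking 30 degrees to the side.
    for (int i = 0; i < 90; ++i) {
        if (monitor.update(30.0, 0.0, 1.0 / 30.0)) {
            std::printf("warning at sample %d: eyes off road too long\n", i);
            break;
        }
    }
    return 0;
}
```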

HMI in autonomous vehicles

Today, we are between Levels 1 and 2 of self-driving capability, where Level 2 is a ‘driver hands-on’ level. Cars at these levels offer advanced driver assistance systems (ADAS) that provide information and some actuation functions, in which the car takes control to avert a potential accident, such as automatic emergency braking.

Audi recently introduced its A8 with Level 3 driverless capabilities, and other car manufacturers, such as Volvo and Ford, have committed to reaching that milestone by 2020. At Level 3, drivers will still be needed to take over if an issue arises, but can hand safety-critical functions to the vehicle under certain traffic or environmental conditions.
At Levels 2 and 3, driving control must at times be passed back to the driver, and the driver needs to be notified of actions the car may take on their behalf. Drivers are not yet used to being in a car that does things they have not instigated, so this change needs to be managed carefully with clear and timely updates.
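
One way to picture such a managed handover is as a small state machine: the car requests a takeover, escalates its alerts, and falls back to a safe stop if the driver never responds. The sketch below is illustrative only; its states, timings and alert wording are assumptions, not any manufacturer’s actual logic.

```cpp
// Hypothetical sketch of a Level 3 takeover-request flow. States, response
// windows and HMI messages are illustrative assumptions.
#include <cstdio>

enum class DriveState { Automated, TakeoverRequested, DriverInControl, SafeStop };

struct HandoverController {
    DriveState state = DriveState::Automated;
    double requestTime = 0.0;

    void requestTakeover(double now) {
        if (state == DriveState::Automated) {
            state = DriveState::TakeoverRequested;
            requestTime = now;
            std::printf("HMI: visual + audio prompt, please take the wheel\n");
        }
    }

    void tick(double now, bool handsOnWheel) {
        if (state != DriveState::TakeoverRequested) return;
        if (handsOnWheel) {
            state = DriveState::DriverInControl;
            std::printf("HMI: control handed back to driver\n");
        } else if (now - requestTime > 10.0) {  // assumed response window
            state = DriveState::SafeStop;
            std::printf("HMI: no response, initiating safe stop\n");
        } else if (now - requestTime > 5.0) {
            std::printf("HMI: escalating alert (vibration + chime)\n");
        }
    }
};

int main() {
    HandoverController hc;
    hc.requestTakeover(0.0);
    for (double t = 1.0; t <= 12.0; t += 1.0)
        hc.tick(t, /*handsOnWheel=*/t >= 7.0);  // driver responds after 7 s
    return 0;
}
```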

Technology transition

Our current, largely visual mode of interaction with the dashboard will increasingly be enhanced by voice interaction, as well as audio responses, vibration alerts and visual warnings projected on to a HUD. HMIs will play an important role in helping users adjust to completely autonomous vehicles. During the transition, people will have to learn to trust driverless cars.

A passenger should be aware at any time of what is happening in the car: why it has chosen a particular lane, which cars are close by, how busy the road is, how the route is being calculated and much more.

The HMI will be the technology that communicates this information, so a well-designed HMI, incorporating graphics and audio elements, will be fundamental to the acceptance of autonomous vehicles.

There is a question mark about what the HMI will evolve into when the industry reaches Level 5 and vehicles become driverless. Will it become a turbo‑charged infotainment system, since no driver‑related information will need to be shared once the car can manage itself?

What can AR bring?

Some companies have proposed using what is currently the front windscreen for AR HUD projections. Smaller, unobtrusive versions already exist, but future implementations have the potential to be all-encompassing.

The AR view would be split across the windscreen, with a simplified driver view that reduces distraction by delivering only vital information to the driver. On the passenger side, much richer content could be displayed, such as the location of restaurants or shops, the nearest parking spaces or points of interest.

Another advantage of an AR HUD is that the driver’s gaze direction can be inferred from in-car cameras, so that key driving information can be projected into the centre of the gaze, given a limited angle of head movement.
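
In practice this amounts to mapping a gaze angle to a position within the HUD’s projectable area. The sketch below shows one such mapping; the HUD resolution and the ±10°/±4° field of view are assumed values for illustration only.

```cpp
// Sketch of placing a HUD element at the centre of the driver's gaze,
// clamped to the projectable area of the windscreen. Field-of-view and
// resolution figures are illustrative assumptions. Requires C++17.
#include <algorithm>
#include <cstdio>

struct Pixel { int x, y; };

// Map gaze yaw/pitch (degrees) to HUD pixel coordinates, assuming the HUD
// covers +/-10 degrees horizontally and +/-4 degrees vertically.
Pixel gazeToHud(double yawDeg, double pitchDeg) {
    const int hudW = 1280, hudH = 480;  // assumed HUD resolution
    const double halfFovX = 10.0, halfFovY = 4.0;
    // Clamping models the limited head/eye angle the system can follow.
    double nx = std::clamp(yawDeg / halfFovX, -1.0, 1.0);
    double ny = std::clamp(pitchDeg / halfFovY, -1.0, 1.0);
    return { static_cast<int>((nx + 1.0) * 0.5 * (hudW - 1)),
             static_cast<int>((1.0 - (ny + 1.0) * 0.5) * (hudH - 1)) };
}

int main() {
    Pixel p = gazeToHud(3.0, -1.0);  // driver glancing slightly right and down
    std::printf("draw speed/warning widget at (%d, %d)\n", p.x, p.y);
    return 0;
}
```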

Underlying all these HMI opportunities is a combination of graphics rendering and GPU compute. Current dashboards and infotainment systems are largely built from multiple silicon devices, one per function. Increasingly there is a move to centralise these functions on a single GPU that can drive the combined infotainment and navigation systems, along with the digital cluster and AR HUDs.

Imagination’s GPUs can fully support these requirements, leveraging the company’s hardware-backed virtualisation to ensure full separation between each of the processes running on a hypervisor. Using such GPUs, a system developer can be confident that the cluster, HUD, infotainment system and navigation system are each running in a separate container. In this way, if the infotainment system fails, for example, it can be rebooted without any effect on the other elements.
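
The sketch below illustrates the separation principle only; it is emphatically not Imagination’s API, just a hypothetical description of per-function GPU partitions in which a failed infotainment partition is reset without touching the cluster, HUD or navigation.

```cpp
// Hypothetical sketch of how a hypervisor might describe isolated GPU
// partitions. NOT a real vendor API: it only illustrates the principle
// that one workload can be reset without affecting the others.
#include <cstdio>
#include <string>
#include <vector>

struct GpuPartition {
    std::string name;
    int priority;        // cluster and HUD take precedence over infotainment
    bool healthy = true;
};

int main() {
    std::vector<GpuPartition> partitions = {
        {"instrument cluster", 0},
        {"AR HUD", 0},
        {"navigation", 1},
        {"infotainment", 2},
    };

    // Simulate an infotainment crash: only that partition is marked bad.
    for (auto& p : partitions)
        if (p.name == "infotainment") p.healthy = false;

    for (auto& p : partitions) {
        if (!p.healthy) {
            std::printf("resetting '%s' (priority %d); others unaffected\n",
                        p.name.c_str(), p.priority);
            p.healthy = true;  // reboot just this container
        }
    }
    return 0;
}
```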

Cars are going through a fundamental change in their capabilities, with the tasks of driving and sensing transferring from the driver to the car. During this period, the HMI will be fundamental to maintaining safe driving in an environment of mixed non-autonomous and fully autonomous cars. When we finally arrive at a fully autonomous environment, the HMI will play a different, but no less important, role: providing a fully interactive, information-rich and entertaining experience for passengers.

Bryce Johnstone is automotive segment director at Imagination Technologies

