Tractica’s upcoming Automotive Human-Machine Interfaces report examines the market ecosystem and conditions for artificial intelligence (AI)-based automotive human-machine interface (HMI), including global market trends, drivers, and barriers. It also explores the use cases and AI technologies related to AI-based automotive HMI and provides profiles of key industry players and market forecasts through 2025.
In a previous post, we explored compelling market drivers that point to a time when voice-enabled smart assistants for car controls and infotainment will be used by many, if not most, drivers and passengers. In this blog post, we will look at the key use cases for AI-based automotive HMI and the major market barriers.
Key Use Cases
AI-based HMI technologies have the potential to play a significant role within vehicles. Specific use cases include the following:
- Driver, occupant controls: Solutions are in place or being developed to enable drivers and occupants to use their voices or gestures to operate various control features in a vehicle. These range from locking and unlocking doors and windows to seat and climate adjustments.
- Driver, occupant monitoring: Solutions are being developed to provide driver monitoring for safer driving. These solutions use gesture recognition algorithms fed by in-cabin cameras and sensors to track eye movements, as well as body and limb movement and posture, to determine alert or distracted states. Emerging solutions using emotion analysis are being explored to determine the emotional state of drivers and then proactively change environmental factors, such as music and temperature, should the driver become agitated. In another case, an occupant monitoring solution is being developed for a ridesharing provider to help identify lost or forgotten personal items.
- Infotainment: Solutions are being developed with voice, gesture, or emotion technologies to manage infotainment systems. Most systems leverage voice-enabled intelligent assistants as the primary interface between drivers or passengers and the infotainment system. However, gesture recognition and emotion analysis are also being used, typically alongside voice technology. Emotion analysis is being used to predict mood, while gesture is used to some extent for infotainment controls, but more often to help with navigation and location-based information. In the case of location-based information, a car occupant could, in theory, point to a location to trigger information about the point of interest. 3D augmented reality displays on windscreens will help occupants with navigation and location-based information.
What Happens in Vehicles When We Are All Passengers?
The biggest market barrier for AI-based automotive HMI is the smartphone. Overwhelmingly across the globe, consumers prefer to use their smartphones as smart assistants in their cars. As recently as August 2018, J.D. Power confirmed the preference for smartphones in the U.S. market. Kristin Kolodge, executive director of Driver Interaction & HMI Research, explains: “Most consumers consider phone systems better for navigation and voice recognition – and they’re free. ‘Better and free’ are hard to compete with, so automakers will inevitably have to cede this territory and will be much better served by focusing on areas where they are the exclusive provider – like driver assistance and collision avoidance – and continue to hone those systems.”
Vehicle Replacement Cycle
According to a Wolf Street estimate, less than one-third of the vehicles on the road in the U.S. today are 5 years old or newer. Thus, the current annual replacement rate for vehicles is far lower, most likely between 5% and 10%. For Europe, the average vehicle age ranges from 7 to 13 years, depending on the country. That means embedded AI-based automotive HMI technology will trickle, not flood, into the U.S. and European markets.
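The replacement-rate reasoning above can be sketched as a quick back-of-the-envelope calculation. This is an illustrative estimate only, using the article's rough figures rather than official fleet statistics:

```python
# Back-of-the-envelope sketch of the replacement-rate reasoning above.
# Inputs are the article's rough estimates, not precise fleet data.

fleet_share_5yr_or_newer = 1 / 3   # "less than one-third" of U.S. vehicles
years_in_window = 5                # the age window that share covers

# If roughly a third of the fleet entered service over the last five years,
# the implied annual replacement rate is that share spread over those years.
annual_replacement_rate = fleet_share_5yr_or_newer / years_in_window

print(f"Implied annual replacement rate: {annual_replacement_rate:.1%}")
```

Because "less than one-third" is an upper bound, the result (roughly 6.7% per year) is also an upper bound, which is consistent with the 5% to 10% range cited above.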
For China, the numbers look significantly different. According to the U.S. Department of Commerce, the average age of vehicles on the road in China in 2017 was 4.5 years.
Carmakers and their vendors are faced with delivering a hardware-software product that is currently outliving any piece of technology-infused hardware. Wolf Street estimates that the average age of cars and trucks in operation in the U.S. in 2017 was nearly 12 years. No tech hardware, not even mainframe computers or telecom switches, is designed to last that long. How does one design hardware capable of adapting to software that has not been productized – but might become relevant 8, 9, or 10 years from now? While carmakers and their vendors are reinventing themselves to fuse hardware and software, none of the players in the automotive ecosystem have great software engineering acumen.
For AI-based automotive HMI to succeed, auto OEMs and their ecosystem partners must focus on future-proof, flexible solutions. These solutions must deliver the end results drivers and passengers want while becoming ever more tightly integrated into the vehicle's functionality.