In-Car Assistants Bring Voice AI to the Masses


Despite the significant progress being made with autonomous vehicles, it will be at least a decade before driverless passenger cars are produced in large volumes, and likely decades before traditional cars disappear from the roads. That is why automotive original equipment manufacturers (OEMs), suppliers, and technology vendors are focused on the development and deployment of in-car AI technology, which provides drivers with a safe and intuitive way to interact with the vehicle’s operational and infotainment systems.

Improving Voice Activation Can Improve Safety

In-home voice-activated personal assistants have taken off in popularity among consumers, largely because they make it possible to control devices and get information simply by speaking commands in a natural, conversational way. This type of functionality is arguably even better suited to a vehicle, where a driver should keep his or her eyes on the road and hands on the wheel.

However, traditional voice-controlled assistants in cars have not always functioned well: the large amount of ambient noise in the cabin (road noise, wind noise, and noise from entertainment systems or other passengers talking) makes it difficult for systems to accurately understand commands. In recent years, improvements in speech recognition processing, combined with more accurate natural language processing, have led to the increasing use of voice-driven assistants in cars. Furthermore, many assistants also allow the OEM or supplier to customize the user experience and add supported features, even after the launch of the vehicle.
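As a rough illustration of why noise handling matters, the snippet below uses the open-source Python SpeechRecognition package (chosen purely for illustration; it is not what automotive systems ship) to sample ambient noise before listening for a command. The recognizer settings, timeouts, and the Google speech backend are all assumptions made for this sketch.

```python
# Minimal sketch of noise-aware voice capture using the open-source
# SpeechRecognition package (requires PyAudio for microphone access).
# Automotive systems use far more sophisticated multi-microphone echo and
# noise cancellation, but the basic capture-then-recognize flow is similar.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    # Sample the cabin for ~1 second to estimate ambient noise
    # (road, wind, HVAC) and raise the energy threshold accordingly.
    recognizer.adjust_for_ambient_noise(source, duration=1.0)
    print("Listening for a command...")
    audio = recognizer.listen(source, timeout=5, phrase_time_limit=5)

try:
    # Hand the captured audio to a cloud speech-to-text backend.
    command = recognizer.recognize_google(audio)
    print(f"Heard: {command}")
except sr.UnknownValueError:
    # Heavy ambient noise often lands here: audio captured, nothing understood.
    print("Could not understand the command.")
except sr.RequestError as err:
    print(f"Speech service unavailable: {err}")
```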

Automakers Are Recognizing and Implementing Voice Assistant Technology

The biggest catalyst has been the widespread success of in-home, voice-activated assistants, such as Amazon's Alexa, Apple's Siri, and Google Assistant, which have introduced natural language-based voice control to consumers. Indeed, automakers, seeing the popularity of these home-based assistants take off, are now actively incorporating Alexa, Google Assistant, or other voice-controlled assistants into vehicles, with a number of automakers featuring the technology in 2018 model year vehicles. For example, Toyota is integrating Amazon Alexa into both Toyota and Lexus 2018/2019 model vehicles, allowing drivers or passengers to ask for directions, check the news, control connected smart home devices remotely, and control in-car entertainment options, all via the Toyota Entune 3.0 App Suite and Lexus Enform App Suite 2.0, respectively.

Models that use Toyota Entune 3.0 include the 2018 Toyota Camry, the 2018 Toyota Sienna minivan, the 2019 Toyota Corolla Hatchback, the 2019 Toyota Avalon, and the 2019 Toyota C-HR. Lexus models using the Enform App Suite 2.0 include the 2018 Lexus NX, LC, RC, RC F, and LS.
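Toyota's actual Entune/Alexa integration is proprietary, but a common pattern for exposing a vehicle function to Alexa is a custom skill backed by an AWS Lambda function that answers a specific intent. The minimal sketch below assumes a hypothetical CheckFuelLevelIntent and a placeholder get_fuel_level() backend call; only the response envelope follows Alexa's documented JSON format.

```python
# Hypothetical AWS Lambda handler sketch for a custom Alexa skill.
# The intent name and get_fuel_level() helper are illustrative only;
# the automaker integrations described above are proprietary.

def get_fuel_level(vehicle_id):
    # Placeholder standing in for a call to a connected-car backend.
    return 62  # percent

def lambda_handler(event, context):
    request = event["request"]

    if request["type"] == "IntentRequest" and \
            request["intent"]["name"] == "CheckFuelLevelIntent":
        level = get_fuel_level(vehicle_id="demo-vehicle")
        speech = f"Your fuel tank is at {level} percent."
    else:
        speech = "Sorry, I can't help with that yet."

    # Standard Alexa Skills Kit response envelope.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

In a production integration, the fuel lookup would authenticate against the automaker's connected-car backend rather than return a constant.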

Toyota is far from the only automotive OEM to disclose plans to incorporate voice-driven assistants into its vehicles. Ford offers Amazon Alexa in a number of its 2018 vehicles, such as the Ford Focus Electric, Fusion Energi, and C-Max Energi, allowing users to access Alexa voice control services via its SYNC interface.

BMW is working with the Alexa team at Amazon to integrate the voice-based service into cars produced after March 2018; the integration will also display Alexa's visual responses on the screen above the car's center console.

It is not just Alexa that is being deployed in the car. BMW also uses technology from Nuance Communications to provide an AI-based automotive assistant known as Dragon Drive, which is already available in 2018 BMW models and makes highly personalized, voice-activated services available through the car's infotainment console.

BMW also announced a research and development (R&D) deal with IBM, through which IBM's Watson sensors and computing will be integrated into BMW cars to collect vehicle data and support new systems aimed at running vehicles more efficiently, as well as providing more personalization features for drivers and passengers. BMW and IBM noted that such services and technology will be used to deploy intelligent diagnostics and services that allow vehicle-to-vehicle (V2V) communications.

Honda, Hyundai, and all GM brands announced at CES in January 2018 that Android Auto will be able to offer Google Assistant functionality, permitting users to control a smart home remotely from the car, as well as remotely check fuel levels, lock the vehicle, and even order a Starbucks coffee. Feature sets and functionality are ultimately set by each OEM and will likely vary by model line. Honda also unveiled its New Electric Urban Vehicle (NeuV), which is loaded with a personal AI assistant called the Honda Automated Network Assistant (HANA). HANA is able to read the driver's emotions and adjust the music, temperature, and more in response, and it can remind the vehicle owner of important dates and help make reservations and appointments.

General Motors has also partnered with IBM to add Big Blue's AI technology to its cars. IBM's Watson will be used to augment GM's OnStar service, with Watson crunching data on users' habits to deliver personalized services, largely related to adjusting entertainment preferences, directing drivers to services like restaurants or gas stations, or making in-car payments. The system is also capable of serving up virtual assistants that learn users' daily habits, such as what time a driver leaves for work, or tie into a smartphone account to remind the user to pick up groceries on the way home, so that a separate trip is not necessary.
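The habit-learning piece can be pictured, in greatly simplified form, as averaging observed departure times and triggering a reminder shortly before the usual departure. The toy sketch below is only an illustration of that idea, not GM's or IBM's actual model.

```python
# Illustrative sketch of "learning" a driver's weekday departure time from
# observed history. Production systems would use far richer behavioral
# models; this is only a toy average used to time a reminder.
from datetime import datetime, time, timedelta
from statistics import mean

# Observed departure times for past workdays, in minutes after midnight.
observed_departures_min = [7 * 60 + 32, 7 * 60 + 41, 7 * 60 + 28, 7 * 60 + 35]

def typical_departure() -> time:
    avg_minutes = int(mean(observed_departures_min))
    return time(hour=avg_minutes // 60, minute=avg_minutes % 60)

def reminder_due(now: datetime, lead_minutes: int = 10) -> bool:
    """True if we are within lead_minutes of the learned departure time."""
    target = datetime.combine(now.date(), typical_departure())
    return timedelta(0) <= target - now <= timedelta(minutes=lead_minutes)

if reminder_due(datetime.now()):
    print("You usually leave soon -- remember to pick up groceries on the way home.")
```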

Mercedes-Benz is partnering with NVIDIA to bring its AI solution to the car. The Mercedes-Benz User Experience (MBUX) is designed to automatically save and suggest favorite music or destinations, and can understand colloquial expressions across 23 different languages. The system will also use AI to track eye and head movement and arm gestures, allowing the vehicle to recognize, for example, when the driver is not looking at a pedestrian in the street.

Meanwhile, Kia announced at CES that it has partnered with Google to make select 2018 Kia cars controllable through Google Assistant, allowing drivers to operate the engine, electric charging, headlights, horn, and locks, all from miles away from the vehicle.

All of this activity will result in strong revenue growth for personalized information services in vehicles, which include entertainment selections based on the driver's emotional state and specialized commercial services tied to behavioral algorithms. Tractica's recent report, Artificial Intelligence for Automotive Applications, forecasts that total revenue for these services will reach $624.9 million by 2025, up from $11.2 million in 2017. Additional market segmentation details are available in the report.
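For context, those two figures imply a compound annual growth rate of roughly 65% between 2017 and 2025, as the quick calculation below shows (applying the standard CAGR formula to the numbers quoted above).

```python
# Implied compound annual growth rate from the forecast figures above.
start_revenue = 11.2   # $ million, 2017
end_revenue = 624.9    # $ million, 2025
years = 2025 - 2017    # 8 years

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 65% per year
```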

Vehicle OEMs Should Differentiate Offerings to Remain Competitive

Ultimately, some vehicle OEMs are not content simply to integrate third-party assistant technology into the vehicle, as doing so does little to differentiate their offerings from competitors'. For example, Ford is working with Qualcomm on plans to leverage a cloud network of connected vehicles that would provide data on traffic, weather, parking, and other parts of the travel ecosystem, which would then be sent back to users' smartphones, allowing them to select the fastest routes, find available parking spots, and access other real-time information.
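Neither Ford nor Qualcomm has published the details of such a platform here, but the general vehicle-to-cloud-to-phone pattern can be sketched as a vehicle posting telemetry to a shared service and a phone app querying the aggregated results. The endpoint URL, paths, and payload fields below are hypothetical placeholders, not an actual API.

```python
# Illustrative sketch of the vehicle-to-cloud-to-phone pattern described above.
# The endpoint URL and payload fields are hypothetical placeholders.
import requests

CLOUD_API = "https://example-connected-car.cloud/api"  # placeholder URL

def report_telemetry(vehicle_id: str, lat: float, lon: float, speed_kph: float):
    """Vehicle side: push current position and speed to the cloud."""
    payload = {"vehicle_id": vehicle_id, "lat": lat, "lon": lon, "speed_kph": speed_kph}
    requests.post(f"{CLOUD_API}/telemetry", json=payload, timeout=5)

def nearby_parking(lat: float, lon: float, radius_m: int = 500):
    """Phone side: query aggregated, crowd-sourced parking availability."""
    resp = requests.get(
        f"{CLOUD_API}/parking",
        params={"lat": lat, "lon": lon, "radius_m": radius_m},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()

# Example usage (would require a real backend):
# report_telemetry("demo-vin", 42.33, -83.05, 38.0)
# print(nearby_parking(42.33, -83.05))
```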

Furthermore, integration and user interface issues often arise when OEMs are trying to incorporate existing assistant technology, particularly if they are seeking to fully integrate and brand the voice assistant experience (saying “Alexa” does not reinforce an automotive OEM brand).

Perhaps foretelling the future of auto-based assistants, OEM supplier Bosch's voice assistant, Casey, can recognize commands in 30 different languages and is trained to understand natural speech patterns. All processing takes place onboard the car in a head unit, and if an internet connection is available, additional location, traffic, weather, and other data can be pulled in to keep information up to date. The system uses AI to observe the driver and learn his or her habits, which can then be applied to future scenarios. Most interestingly, the assistant can be renamed, finally allowing drivers to name their car and have it respond to their commands.
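Bosch has not published Casey's internals, but the renameable wake word idea reduces, at its simplest, to checking transcribed speech against whatever name the driver has configured. The toy sketch below illustrates only that idea; the transcription itself is assumed to be handled by the onboard speech engine described above.

```python
# Toy illustration of a renameable wake word: the assistant answers to
# whatever name the driver configures. Transcription is assumed to be done
# by the onboard speech engine; Bosch's actual implementation is not public.

class CarAssistant:
    def __init__(self, wake_word: str = "casey"):
        self.wake_word = wake_word.lower()

    def rename(self, new_name: str):
        """Let the driver 'name the car'."""
        self.wake_word = new_name.lower()

    def handle_utterance(self, text: str):
        """Return a response if the utterance starts with the wake word."""
        words = text.lower().split()
        if words and words[0] == self.wake_word:
            command = " ".join(words[1:])
            return f"Executing command: {command}"
        return None  # not addressed to the assistant

assistant = CarAssistant()
assistant.rename("Herbie")
print(assistant.handle_utterance("Herbie set the temperature to 21 degrees"))
```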
