Eyeris Brings Emotion Recognition to Automobiles


At NVIDIA’s recent GPU Technology Conference, we sat down with Modar (JR) Alaoui, CEO of Eyeris Technologies, to discuss giving automobiles the ability to interpret human emotions.  Eyeris is bringing to market a capability absent from the automobile’s 130-year history: a car that can monitor and register the emotional state of its driver.  The product reads emotions from facial expressions and also performs age identification, gender identification, eye tracking, and gaze estimation for drivers and passengers.  “Our goal is to put our software in the back of every camera in the world,” said Alaoui.

Why – To Improve Safety

Monitoring a driver’s emotional state may save them from the pain and expense of being in a vehicle accident.

How – Computer Vision and Deep Learning

A tiny camera mounted in a dashboard and connected to the company’s EmoVu software continually scans the driver’s face, monitoring facial expressions, head position, and how open the driver’s eyes are. It can also detect signs of exhaustion, such as eyes rolling downward or backward, according to Alaoui. The more advanced cars will be able to analyze signs of emotional distraction and respond by taking control away from the driver.
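EmoVu itself is proprietary, but a minimal sketch of this kind of frame-by-frame monitoring loop might look like the following. It uses OpenCV’s stock face and eye detectors as stand-ins, plus an illustrative “eyes not visible for several consecutive frames” rule for drowsiness; none of these details reflect Eyeris’s actual method.

```python
import cv2

# Stock OpenCV detectors as stand-ins for a production face/eye tracker.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

DROWSY_FRAMES = 15  # illustrative threshold: frames with no visible eyes

def monitor(camera_index=0):
    cap = cv2.VideoCapture(camera_index)  # dashboard camera feed
    eyes_missing = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, 1.3, 5)
        for (x, y, w, h) in faces:
            roi = gray[y:y + h, x:x + w]
            eyes = eye_detector.detectMultiScale(roi)
            # Eyes not detected for several frames -> possible drowsiness.
            eyes_missing = 0 if len(eyes) else eyes_missing + 1
            if eyes_missing > DROWSY_FRAMES:
                print("ALERT: driver may be drowsy")
        cv2.imshow("driver", frame)
        if cv2.waitKey(1) == 27:  # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()
```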

What – EmoVu

The company’s flagship EmoVu software was built by using deep learning to train a convolutional neural network (CNN) that measures joy, surprise, sadness, disgust, fear, and anger.  According to The Wall Street Journal, Eyeris Technologies’ system is being tested by major automakers.
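For readers unfamiliar with the approach, the sketch below shows what a small emotion-classifying CNN can look like in PyTorch. The input size, layer widths, and training objective are assumptions for illustration only; the EmoVu architecture is not public.

```python
import torch
import torch.nn as nn

EMOTIONS = ["joy", "surprise", "sadness", "disgust", "fear", "anger"]

# A small CNN over 48x48 grayscale face crops (sizes are illustrative
# assumptions, not details of the EmoVu model).
model = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
    nn.Linear(128, len(EMOTIONS)),  # one score per emotion class
)
loss_fn = nn.CrossEntropyLoss()  # standard multi-class training objective
```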

(Image source: Eyeris Technologies)

The EmoVu algorithms are designed to run on any device’s embedded system, at a frame rate high enough for real-time operation without dedicated acceleration hardware. The software processes data frame by frame and requires only small amounts of RAM, ROM, and storage, which makes EmoVu portable; the algorithms are optimized for integration into most embedded platforms, including microcontrollers (MCUs), systems on a chip (SoCs), and digital signal processors (DSPs).  Multiple operating systems are also supported. In addition, processing can be accelerated with a graphics processing unit (GPU) rather than the slower (for floating-point calculations, at least) central processing unit (CPU).
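As an illustration of that GPU-versus-CPU choice and of frame-by-frame processing, the sketch below runs inference on whichever device is available and scores one face crop at a time. The placeholder model and 48x48 input are assumptions for the example, not EmoVu specifics.

```python
import torch
import torch.nn as nn

# Prefer the GPU for per-frame inference when one is present; the same code
# falls back to the (slower) CPU otherwise.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder model standing in for a trained emotion classifier.
model = nn.Sequential(nn.Flatten(), nn.Linear(48 * 48, 6)).to(device).eval()

def classify_frame(face_crop: torch.Tensor) -> int:
    """Score one preprocessed 48x48 face crop, frame by frame, no batching."""
    with torch.no_grad():
        logits = model(face_crop.unsqueeze(0).to(device))
    return int(logits.argmax(dim=1))

# Example: a random tensor stands in for a captured camera frame.
print(classify_frame(torch.rand(1, 48, 48)))
```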

When – Next Year

Alaoui recently told The Huffington Post that he expects the technology “to reach consumers starting in 2017.”

Market Outlook

According to Tractica’s research, companies like Eyeris Technologies are helping make the automotive industry one of the sectors best positioned to leverage artificial intelligence (AI) and deep learning. In our recently published Deep Learning for Enterprise Applications report, we forecast that spending on AI software in the automotive industry will grow from $2.2 million in 2015 to $208 million by 2024.

(Chart: automotive AI software forecast, Deep Learning for Enterprise Applications report)
