Affectiva Launches A.I. Tech to Help Cars Sense Your Emotions

Efforts to develop self-driving vehicles have largely focused on tracking what’s going on outside the cars—think laser-based sensors to track other vehicles and digital mapping technologies to help navigate.

Now, the industry is turning some of its attention to technologies that sense what’s going on inside the vehicle. An initial goal is to better monitor driver alertness to help reduce the number of car accidents. But if fully autonomous vehicles one day become the norm, having technology that can understand the mood and preferences of passengers might enable the vehicle to automatically make adjustments that improve the riding experience. The jury is still out on whether vehicle occupants will prefer that to controlling changes themselves, but companies are trying to develop the technological capabilities anyway.

“All of our technologies and our devices are becoming conversational and perceptual—even our car,” says Rana el Kaliouby (pictured above), co-founder and CEO of Affectiva.

Her Boston-based startup is one of the companies trying to bring that vision to life. Since 2009, the MIT Media Lab spinout—backed by more than $26 million from investors such as Kleiner Perkins Caufield & Byers and Horizons Ventures—has been developing technologies and products aimed at sensing people’s emotions. It initially took a vision-based approach, using webcams and optical sensors to analyze people’s facial expressions and non-verbal cues, for applications in areas such as advertising, marketing, and video games. Last fall, the company expanded into voice analysis with the release of a cloud-based software product that it says can measure certain basic emotions in speech.

Over the past year or so, automakers, vehicle suppliers, and automotive tech startups have shown growing interest in Affectiva’s products, el Kaliouby says. In their push for vehicle automation, those companies have mainly focused on outdoor sensing technologies, she says. But now, there’s an “increased understanding that you also have these occupants, drivers, and co-pilots in the vehicle that we need to understand as well,” she adds.

Today, el Kaliouby’s company publicly announced the launch of “Affectiva Automotive AI,” a software product that uses artificial intelligence technologies to measure the “emotional and cognitive states” of vehicle occupants in real time. The software crunches data fed to it by cameras and microphones installed in cars. (Those devices aren’t made by Affectiva.) The company says its software can measure facial expressions that indicate joy, anger, and surprise; vocal expressions of anger, laughter, and excitement; and signs of drowsiness, such as yawning, eye closure, and blink rates.
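Affectiva hasn’t published the programming interface behind the new product, but the behavior it describes—per-frame facial metrics and per-utterance vocal metrics rolled up into state estimates—can be sketched in a few lines of Python. Everything below (class names, fields, weights, thresholds) is invented for illustration:

```python
# Hypothetical sketch of in-cabin sensing outputs; not Affectiva's actual API.
from dataclasses import dataclass

@dataclass
class FaceMetrics:
    joy: float          # 0.0-1.0 confidence that the expression is present
    anger: float
    surprise: float
    eye_closure: float  # fraction of recent frames with eyes closed
    yawn: float

@dataclass
class VoiceMetrics:
    anger: float
    laughter: float
    excitement: float

def drowsiness_score(face: FaceMetrics, blink_rate_hz: float) -> float:
    """Fold eye closure, yawning, and blink rate into one 0-1 score.
    The weights and thresholds here are made up for illustration."""
    score = 0.5 * face.eye_closure + 0.3 * face.yawn
    if blink_rate_hz < 0.1:  # unusually infrequent blinking can accompany fatigue
        score += 0.2
    return min(score, 1.0)
```

In a production system, weights like these would presumably be learned from labeled driving data rather than hand-tuned.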

Affectiva’s new product moves the company into the autonomous vehicles sector, an industry attracting heavy interest and investment. The move could help the startup differentiate its business from competitors in emotion-sensing technology, which include EMOSpeech and Vokaturi. But Affectiva isn’t the only emotion-sensing company going after the automotive market—Palo Alto, CA-based Eyeris is also developing “vision A.I.” products for monitoring drivers and improving the riding experience. Eyeris doesn’t appear to be tracking vocal expressions, at least according to its website.

El Kaliouby will give a talk about Affectiva’s automotive ambitions on April 12 at Xconomy’s Robo Madness event in Bedford, MA, but she shared some initial details this week in an interview.

The starting point for the new product was what Affectiva learned from the 6 million faces its software has analyzed over the past few years, mostly while people watched online videos, el Kaliouby says. To adapt its technology for in-vehicle sensing, the company has so far paid about 200 people to install cameras and microphones in their cars and collect data while they drive. The roughly 2,100 hours of driving data amassed to date have helped Affectiva train its machine learning algorithms to recognize things like yawns and to track blinking patterns—capabilities its software previously lacked, she says.
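El Kaliouby didn’t detail how the software spots blinks. One standard technique from the computer-vision literature—offered here only as an assumption about how such tracking can work—is the “eye aspect ratio,” a ratio of distances between facial landmarks around the eye that drops sharply when the lid closes. A minimal sketch:

```python
# Illustrative blink detection via eye aspect ratio (EAR); a common
# academic technique, not necessarily the one Affectiva uses.
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmark points around one eye, ordered as in the
    common 68-point face-landmark scheme."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)  # drops toward 0 as the eye closes

def count_blinks(ear_series, threshold=0.2):
    """Count each contiguous dip below the threshold as one blink."""
    blinks, closed = 0, False
    for value in ear_series:
        if value < threshold and not closed:
            blinks, closed = blinks + 1, True
        elif value >= threshold:
            closed = False
    return blinks
```

Blink counts per minute, fed through something like the drowsiness score sketched earlier, would then feed the alertness monitoring described below.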

The first application of the technology is to watch for distracted or drowsy drivers—two significant causes of vehicle accidents. And, as more semi-autonomous vehicles hit the road, el Kaliouby argues that an “in-cabin sensing solution” for driver alertness will become as crucial as seat belts.

“The [semi-autonomous] car is expecting to be driving for the majority of the time, but in some cases, it might encounter scenarios where it does not know what to do and needs to hand control back to the human driver,” she says. “It really needs to know if the driver is awake or not, if the driver is paying attention or not. You might be in the middle of watching a movie or reading a book, but all of a sudden you need to have full context and take control again. … That trust back and forth between the driver and a car is really, really important.”

If Affectiva’s software determines the driver isn’t paying attention, it might trigger some kind of alert, el Kaliouby says. In more extreme cases, perhaps the car could be programmed to park itself and call for help, she says.
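The escalation she describes—alert first, then park and summon help in extreme cases—maps naturally onto a simple policy. The thresholds and vehicle-control hooks below are invented for illustration, not drawn from any real automotive API:

```python
# Hypothetical escalation policy matching the behavior described above.
def respond_to_inattention(drowsiness: float, seconds_inattentive: float,
                           vehicle) -> None:
    if drowsiness > 0.9 or seconds_inattentive > 30:
        vehicle.pull_over_and_park()  # extreme case: park the car...
        vehicle.call_for_help()       # ...and summon assistance
    elif drowsiness > 0.6 or seconds_inattentive > 5:
        vehicle.sound_alert()         # milder case: chime, seat vibration, etc.
```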

Some car companies have already begun introducing cameras to monitor driver alertness, el Kaliouby says. But she claims they’re mostly using “rudimentary eye-tracking” technologies designed only to determine if the eyes are on the road or not.

Affectiva has already signed up unnamed car companies as paying customers, el Kaliouby says. It also has partnerships with Autoliv, a Swedish supplier of automotive safety products, and Renovo, a California-based developer of software for autonomous vehicle fleets.

Affectiva’s technology could also be used to improve the safety of rides hailed through apps like Uber and Lyft, el Kaliouby says. If a driver gets angry with a rider, or vice versa, the software could be configured to flag those situations. And in the future, she suggests, the software could become capable of detecting whether passengers are intoxicated, which could put them “more at risk of being in danger,” whether from the driver or other riders, she says. (One would assume the tech could detect a drunk driver, too.)
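How that flagging would be configured isn’t described; as a rough sketch, with names and thresholds invented, such a rule might look like this:

```python
# Hypothetical ride-hailing safety hook: flag a ride when sustained anger
# is detected from either party. All values here are illustrative.
def maybe_flag_ride(driver_anger: float, rider_anger: float,
                    seconds_elevated: float, report) -> None:
    if max(driver_anger, rider_anger) > 0.8 and seconds_elevated > 10.0:
        report("sustained anger detected in cabin")
```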

When asked what the software might do if it detects a dangerous situation in the vehicle, el Kaliouby says that

Author: Jeff Bauter Engel

Jeff, a former Xconomy editor, joined Xconomy from The Milwaukee Business Journal, where he covered manufacturing and technology and wrote about companies including Johnson Controls, Harley-Davidson, and MillerCoors. He previously worked as the business and healthcare reporter for the Marshfield News-Herald in central Wisconsin. He graduated from Marquette University with a bachelor’s degree in journalism and Spanish. At Marquette he was an award-winning reporter and editor with The Marquette Tribune, the student newspaper. During college he also was a reporting intern for the Muskegon Chronicle and Grand Rapids Press in west Michigan.