SoftBank Taps Affectiva to Boost Pepper Robot’s Emotional IQ

Teaching robots to recognize human emotions and react appropriately is one of the grandest ambitions in robotics. Companies have made progress on this front in recent years, thanks in part to advances in machine learning, computer vision, speech recognition, and related technologies. But there is a long way to go.

Now, one of the big players in the field, SoftBank Robotics, is turning to a Boston startup to boost the “emotional intelligence” of its humanoid robot, Pepper. SoftBank Robotics said Tuesday it has teamed up with Affectiva to integrate the startup’s “emotion A.I.” software into Pepper to improve the robot’s ability to interpret people’s emotions and adapt its behavior in real time.

Financial terms of the partnership weren’t disclosed. The deal highlights the ambitions of SoftBank Robotics’ parent company, the Japanese tech giant SoftBank Group, to make advances in nearly every aspect of robotics, from computing to A.I. to the machines themselves. SoftBank’s other relevant deals in recent years include the acquisitions of robotics firms Boston Dynamics and Schaft from Google’s parent company Alphabet, the purchase of chipmaker Arm, and investments in companies such as Fetch Robotics.

Pepper was launched in 2015 by SoftBank and Aldebaran Robotics, a France-based humanoid robot developer that has been majority-owned by SoftBank since 2012. (Aldebaran was later renamed SoftBank Robotics.) More than 12,000 Pepper robots have been sold, Forbes reported in May. The machines are serving as companions to nursing home residents, as well as greeters in retail stores, banks, and hotels. Pepper has even been tapped to chant Buddhist sutras at funerals, in place of a human priest.

Since it was first announced in 2014, Pepper has been billed as having the ability to read human emotions. According to SoftBank Robotics’ website, Pepper’s cameras, microphones, and software enable it to perceive smiles, frowns, tone of voice, and body language, such as the tilt of a person’s head. The robot then reacts accordingly, perhaps by acting delighted if it thinks a person is happy, or by trying to comfort someone who seems sad.

In a press release, SoftBank Robotics said Affectiva’s software will expand Pepper’s emotion-detection capabilities. Affectiva says its technology can identify basic emotions such as joy, anger, and surprise, as well as “more complex cognitive states and expressions such as distraction, drowsiness, or differentiating between a smile and a smirk.”

“There’s a significant opportunity for robots like Pepper to improve the way we work and live, as we’ve seen through the many roles Pepper has already taken on as a companion and a concierge,” said Marine Chamoux, an affective computing roboticist with SoftBank Robotics, in a prepared statement. “But this is only the beginning—especially as Pepper continues to evolve and learns to relate to people in increasingly meaningful ways.”

The partnership fits into a broader shift in the way humans and machines interact, as we move beyond typing commands on keyboards. Internet-connected speakers from Amazon, Apple, Google, and others respond to voice commands. Pepper, Jibo, and other robots have voice and face recognition capabilities. These capabilities are far from perfect. But as robots become more sophisticated and commonplace, companies such as Affectiva and Eyeris are attempting to make machines’ interactions with people more natural and human-like.

Affectiva has been at this since 2009, when it spun out of the MIT Media Lab. The company initially focused on a vision-based approach, using webcams and optical sensors to analyze people’s facial expressions and non-verbal cues, for applications in areas such as advertising, marketing, and video games. Recently, Affectiva expanded into voice analysis and started marketing its technology to the automotive industry. The company has raised more than $26 million from investors such as Kleiner Perkins Caufield &amp; Byers and Horizons Ventures.

“As robots take on increasingly interactive roles with humans in many corners of society—spanning healthcare, retail, and even entering our homes—there’s a critical need for us to foster a deeper understanding and mutual trust between people and robots,” said Rana el Kaliouby, Affectiva’s co-founder and CEO, in a prepared statement. “Just as people interact with one another based on social and emotional cues, robots need to have that same social awareness in order to truly be effective as coworkers or companions.”

[Top images of Pepper robots combined by Xconomy. Images copyright Philippe Dureuil/Toma, downloaded from SoftBank’s press gallery on its website.]

Author: Jeff Bauter Engel

Jeff, a former Xconomy editor, joined Xconomy from The Milwaukee Business Journal, where he covered manufacturing and technology and wrote about companies including Johnson Controls, Harley-Davidson, and MillerCoors. He previously worked as the business and healthcare reporter for the Marshfield News-Herald in central Wisconsin. He graduated from Marquette University with a bachelor’s degree in journalism and Spanish. At Marquette he was an award-winning reporter and editor with The Marquette Tribune, the student newspaper. During college he was also a reporting intern for the Muskegon Chronicle and Grand Rapids Press in west Michigan.