The Emotional Face of Machine Learning

May 30, 2017



By Paula Klein

In the push to humanize technology, Affectiva is carving out a niche.

Its software development kit (SDK) and cloud-based API allow developers to enrich digital experiences by adding “emotion awareness” to apps, from games to medical devices. That means machines can collect data and respond to users’ emotions in real time, based largely on facial recognition techniques. It’s what the company calls Emotion AI.

As noted in a recent Forbes article: “Affectiva’s technology has proven transformative for industries like automotive, market research, robotics, education, and gaming, but also for use cases like teaching autistic children emotion recognition and nonverbal social cues.”

Demand is flourishing: Fortune 500 brands such as Kellogg’s want a more data-driven way to test advertising, consumer behavior, and marketing campaigns beyond the limitations of focus groups and surveys. Affectiva’s technology can collect, store, and analyze viewers’ reactions and measure their facial responses: Did they frown, smile, yawn, or look away? Autonomous car companies are interested in gauging driver moods and attentiveness, both for conversational dashboards and to improve safety.

To address these requirements, Affectiva has been partnering with robotics, AI, and marketing companies to augment its 30-person staff.

Burgeoning Demand

But it wasn’t like this when Rana el Kaliouby (pictured above) began her research more than a decade ago, first as a doctoral student at the University of Cambridge and then as a post-doc at the MIT Media Lab. “I was interested in the application of emotion as it related to autism,” el Kaliouby said last week. Emotion AI was an untapped market that received little attention, she said. “It was so new, then. Now, people just say: I need it!” Joining the burgeoning “empathy economy” is a way for AI companies to distinguish themselves and their products, she said.

El Kaliouby, co-founder and CEO of Boston-based Affectiva, returned to MIT to give the keynote address at the IDE annual conference on May 25. She described the company’s progress and the rising demand for facial and voice recognition technologies. By 2009, she said, so many requests for commercial apps were flooding the MIT Media Lab that she was encouraged to spin out a company and seek venture funding. “It’s challenging to work in academia and scale a project,” she said, and she was determined to explore how human-machine interaction research could be applied commercially.

Affectiva was launched that year, and so far has raised $25 million in funding from leading investors including Kleiner Perkins Caufield Byers, Horizon Ventures, and WPP. The company has been ranked one of the country’s fastest-growing startups, and el Kaliouby has won many accolades including the “Women in Engineering” Hall of Fame and Technology Review’s “Top 35 Innovators Under 35” award.

El Kaliouby is still intrigued by the possibilities of AI and machine learning to identify human emotions. “Emotions influence so much in our personal and social lives, but it’s largely missing in the digital world.” According to a March KBV Research report, the global emotion detection and recognition market is forecast to grow at a CAGR of 27.4% and to reach nearly $30 billion by 2022. Competitor Emotient was acquired by Apple last year.


Read the full blog on Medium.