With over 4.5 million faces analyzed in 75 different countries, Affectiva has risen out of the MIT Media Lab to gather the largest emotion database in the world.
The Waltham-based company has built software that can analyze and understand human emotions through facial expressions, which are recorded via device cameras and standard webcams.
We first connected with Affectiva’s Chief Marketing Officer Gabi Zijderveld at a Boston Neuromarketing Meetup – where, earlier this year, our very own Nancy Harhut volunteered to participate in a demo of the technology.
Given our own interest in how emotion affects consumers’ decision-making process, it seemed like an obvious opportunity to reach out to Gabi, schedule a phone call and deepen our understanding of this subject matter. Below, you’ll find a transcript of our conversation.
Tell us about the overall mission of Affectiva.
Our larger mission is to bring emotional intelligence to the digital world. We live in a hyper-connected world full of smart devices, intelligent technology and advanced AI systems. A lot of human communications take place online, and as we like to say, there’s lots of IQ in that, but very little EQ.
The technology that surrounds us is not aware of our emotions and does not adapt to them, even though those emotions can now be sensed. We’re determined to change that – to better how we as humans interact with our technology, to really humanize technology, but also to improve how we as humans communicate with each other in a digital world.
What are some key ways that marketers can use this technology?
It can be used to get emotion insights and analytics. In the context of advertising, our technology can measure how consumers respond to an ad or online video, and it can give marketers, market researchers and advertisers valuable insight on how these consumers engage with their digital content.
It helps them optimize their ads, determine how to allocate their media spends and predict KPIs in advertising such as brand recall, purchase intent, sales lift, virality and likelihood to share over social media. So our emotion data can help in that area of insights and analytics. That’s one area.
Additionally, we can drive real-time interaction based on emotion. We’ve packaged our software in a variety of SDKs, software development kits, which means we can license our technology to developers. Anyone who’s building a solution – no matter what kind of vertical or market they’re in – can license our technology and integrate it into their own work and emotion-enable their apps, their digital experiences, their devices, their applications.
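To make the idea of “emotion-enabling” an app concrete, here is a minimal sketch of what a callback-style integration might look like. The `EmotionDetector` class, its score format and the joy threshold are all illustrative assumptions for this sketch, not the actual Affdex SDK API.

```python
# Hypothetical sketch of a callback-style emotion-SDK integration.
# EmotionDetector and its score format are illustrative assumptions,
# not the real Affdex SDK API.

class EmotionDetector:
    """Toy detector: the app registers a callback, then feeds frames
    that already carry per-emotion scores (stand-ins for what a real
    SDK would compute from camera pixels)."""

    def __init__(self):
        self._callback = None

    def on_emotions(self, callback):
        self._callback = callback

    def process_frame(self, frame_scores):
        # A real SDK would run computer-vision models here; we just
        # forward the precomputed scores to the app's callback.
        if self._callback:
            self._callback(frame_scores)


def make_app():
    """An 'emotion-enabled' app that reacts when joy crosses a threshold."""
    events = []
    detector = EmotionDetector()

    def handle(scores):
        if scores.get("joy", 0.0) > 0.8:
            events.append("show_upbeat_content")

    detector.on_emotions(handle)
    return detector, events
```

The point of the pattern is that the app never touches the vision models directly; it only reacts to the emotion scores the SDK surfaces per frame.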
You’ve touched upon this in your previous answer, but why is it important for marketers to be aware of emotion?
Research has shown that the emotional centers of the human brain kick in first. When we’re viewing or interacting with content, those emotional centers respond before the cognitive ones do, and they can greatly influence the subsequent cognitive reaction.
If someone is looking at an ad or a brand message, and they perceive it to be boring and uninteresting, this can also carry over to the overall cognitive reaction to the brand.
Emotions influence all aspects of our lives – how we live, work and play – but especially the decisions we make, big and small. Whether it’s what you’ll eat for breakfast or what kind of house you’re going to buy, emotions play a role in those choices. Engaging consumers on an emotional level is key.
Are there any particular emotional triggers that have consistently engaged viewers and driven further action?
Based on what we’ve seen – and I don’t think I’m sharing anything earth-shattering here – it’s humor, surprise, joy and the deeply emotional, gut-wrenching content that tugs at the heart, whether in a sad way or an inspirational, uplifting manner.
The one thing we’ve noticed is, there’s no cookie-cutter answer to this question. There’s no consistent method that works. If there were, advertising would be super easy, because then there would be a formula that people could apply.
So at the end of the day, it’s really in the power of the creative, and that’s the nature of creative, right? There’s no formula, set approach or methodology to it, and very often, it’s a combination of things.
Could emotional awareness help chatbots become better at listening and responding to customer feedback?
Oh, yes, absolutely. Just earlier this month, we wrote a blog post about this topic. Any technology that interacts with humans, delivers insights to them or provides them a service should really understand their emotions and adapt accordingly.
Today’s bots, for the most part, are not really doing that. We believe that our technology could add tremendous value to bots by emotion-enabling them. Any advanced AI system that is designed to interact with humans needs to also factor in human emotions. That’s not happening today, and we’re on a mission to change that.
How did Affectiva arrive at the 7 emotions and 15 facial expressions that guide your approach?
They’re the 7 basic emotions, as identified in academic research. The 15 facial expressions that we focus on – and there are many more out there, of course – are the most common ones that we see and that our clients most commonly request.
Our science team is constantly adding more facial expressions to the mix, and then there are different combinations of facial expressions that make up certain emotions.
As a company, you’ve collected massive amounts of data…
Over the past several years, we’ve analyzed approximately 22,000 pieces of video content and more than 4.5 million faces in 75 different countries. More than 1,400 brands have used our technology. We’ve gathered the largest emotion database in the world, and every single day, we’re adding more to the mix.
What’s important to know is that those 4.5 million faces have been captured, if you will, in the wild. These are people looking at stuff and being analyzed wherever they sit at that point in time, not people who are brought into a lab where they’re in optimal lighting conditions, sitting perfectly in front of a camera. This in-the-wild data is key if you’re designing human-to-technology interactions, because the audience you’re trying to reach will be in the wild, as well.
If people know their facial expressions are being monitored, how are we able to trust that they won’t alter them to fit (or defy) an “expected” behavior?
Our research shows that this rarely happens. At the end of the day, there’s no incentive for people to alter their emotional expression, and because we have massive amounts of data, we can actually pick up on the nuances or microexpressions really well.
For example? Our smile detection algorithm can distinguish a fake or a polite smile from a genuine smile.
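One way to picture this distinction: a genuine (Duchenne) smile typically pairs the lip-corner pull (FACS action unit 12) with the cheek raise (AU6), while a polite, social smile shows AU12 largely on its own. The toy rule below illustrates that idea; the thresholds and input format are assumptions for this sketch, not Affectiva’s actual classifier.

```python
# Toy Duchenne-smile heuristic: a genuine smile typically pairs the
# lip-corner pull (FACS action unit 12) with the cheek raise (AU6),
# while a polite/social smile shows AU12 largely on its own.
# Thresholds and input format are illustrative assumptions, not
# Affectiva's actual algorithm.

def classify_smile(au_intensities, au12_min=0.5, au6_min=0.4):
    """au_intensities: dict of FACS action-unit intensities in [0, 1]."""
    au12 = au_intensities.get("AU12", 0.0)  # lip corner puller
    au6 = au_intensities.get("AU6", 0.0)    # cheek raiser
    if au12 < au12_min:
        return "no smile"
    return "genuine smile" if au6 >= au6_min else "polite smile"
```

A production system would learn these boundaries from labeled data rather than hand-set them, which is where the large in-the-wild dataset matters.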
Are there any gender or age differences when reading our emotions and expressions?
Very much so! Research has shown that human emotions are universal around the world, but their expression really does differ by gender, age and culture.
Our data has shown that women are about 30 percent more expressive than men. Women smile more, and they smile longer, as well. However, these gender differences also vary by culture.
In the US, women smile about 40 percent more than men. In France, it’s 25 percent, but based on our data in the UK, it’s the same. Very odd. We don’t know why, since we’re not doing academic research here. This is just what we observed.
We’ve also seen that older people are much more emotive than younger people. Personally, when I saw that stat, I was a little bit surprised. I thought it would be the inverse based on my own assumptions, but people over the age of 50 are about 25 percent more emotive than people under the age of 50.
What other ways does culture play into this?
When we test our technology in group settings – let’s say a focus group – we see that in collectivist cultures, people tend to dampen their emotions. Maybe this is because it’s more of a group culture, and they are not inclined to stand out from the pack.
Whereas, in the US, people emote much stronger in group settings, and maybe this is because in individualistic cultures, people want to stand out from the pack.
But then, the results are reversed when people do tests in the privacy of their homes. In collectivist cultures, our technology sees them emote very strongly. In individualistic cultures, people are much less expressive at home than they are in group settings.
Describe, if you can, a few specific ways that marketers have implemented Affectiva’s technology
For media and advertising work, we have norms that allow people to benchmark the performance of their ad or creative against peers and competitors. With norms, you can compare content by geography, product category or media length. There are a lot of interesting insights there, and these norms are quite unique in the industry.
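The mechanics of benchmarking against a norm can be sketched simply: score the ad on some emotion metric, then express it as a percentile against the norm distribution for its geography or category. The metric name and norm values below are made-up illustrative data, not Affectiva’s actual norms.

```python
# Sketch of emotion-norm benchmarking: express an ad's metric (e.g.
# peak smile intensity) as a percentile against a norm distribution
# for its geography/product category. Norm values are made-up data.
from bisect import bisect_left

def percentile_rank(ad_score, norm_scores):
    """Percent of ads in the norm set that score strictly below this ad."""
    ranked = sorted(norm_scores)
    return 100.0 * bisect_left(ranked, ad_score) / len(ranked)
```

So an ad at the 75th percentile for its category out-engaged three quarters of the comparable ads in the norm set.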
Now, more and more creative agencies are beginning to use our technology to build really unique experiences for their clients. There’s an interesting prototype out there for The Hershey Company called the Hershey Smile Sampler. People can step up to this kiosk, and if they smile, our technology, crunching away in the background, dispenses a Hershey chocolate. So that’s a unique retail experience designed to build positive brand sentiment for Hershey, but also to drive more traffic down the confection aisle, which retailers of course love.
Then there was the TrueCar L.E.D., a 3D installation built from 70,000 LED lights for a concert. Concertgoers could submit selfies through an app, and, combined with crowd-density software, the LED display would light up based on the emotions of the crowd.
Also, there’s the Bentley Inspirator, an interactive advertising and marketing campaign. They call it a “luxury commissioning experience”. You watch this highbrow lifestyle video, and based on your emotional responses, the narrative adapts in real-time. In the end, it comes out with a personalized recommendation of the car specification – what kind of wheels, what kind of color, what kind of interior.
I think the way that brands choose to engage with clients and consumers will really change, and we’re going to see a lot more of these interactive and truly engaging experiences. Our technology, I think, will be at the core of that.
Are there any other industries where you see big opportunities for emotion insights and analytics?
Two that really jump out at me are healthcare and online education.
With healthcare, it’s a number of scenarios. In telemedicine, healthcare providers could get real-time feedback on the emotional state of the patient, but also, patients themselves could gather continuous, longitudinal emotion data and choose to send that to their healthcare provider – especially people dealing with mental illness or people on medication that could affect mental state.
You can extend it to mental health monitoring, where patients could establish their own personal baseline, and if they deviate from their own norm, it could – with their consent, of course – flag their doctor and help monitor that. It’s also mindfulness and mood tracking. We have fitness devices that track our physical fitness, but very little that checks our mental wellness.
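The personal-baseline idea described here can be sketched as a simple deviation check: keep a history of a person’s daily emotion score and flag (with consent) when today’s value falls far outside their own norm. The scoring scale and the two-standard-deviation threshold are illustrative assumptions, not a clinical tool.

```python
# Sketch of the personal-baseline idea from mental-health monitoring:
# keep a history of a person's daily emotion score and flag (with
# consent) when today's value deviates far from their own norm.
# The score scale and z-threshold are illustrative assumptions.
from statistics import mean, stdev

def deviates_from_baseline(history, today, z_threshold=2.0):
    """True when today's score lies more than z_threshold standard
    deviations from this person's own historical mean."""
    if len(history) < 2:
        return False  # not enough data for a personal norm yet
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold
```

The key design point matches the interview: the comparison is to the individual’s own norm, not to a population average.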
While pain assessment is typically done with self-reported rating scales, one could also measure pain off the face. There are a number of use cases in healthcare that we think could be really beneficial, but to the best of our knowledge, there are no real commercial solutions out there yet.
And as for online education?
It would be tremendously beneficial if these online learning systems could detect the emotional engagement of the student. For example, if someone is bored, confused or disengaged, you could find ways to adapt the content to the individual in real-time.
These days, in online education, outcomes and results are measured by tests after the fact. If something goes wrong, you’re a little too late already… But what if you could intervene? You could get trending data and establish norms for individuals and really personalize education. We think, in the future, there’s tremendous potential for that.
Could this be applied in a B2B setting, perhaps for a webinar or online tutorial?
Yes, in both settings, we have presenters speaking to dozens, hundreds or thousands of people. Wouldn’t it be cool if they could get a real-time mood meter of the audience? We have created some really interesting demos in-house, even with Facebook Live, where we built a little mood meter for live video streams.
Think about Twitch or online gaming – eSports is a huge market. If you were big into online gaming, you could see in real-time which game was generating the most interesting emotion from the audience, then that might be the one that you want to watch.
Or even with YouTube, right? You could have a personalized recommendation based on emotion, so if you liked and responded to a certain type of video, then the system would recommend these other videos.
It would be pretty mind-blowing to skew away from vanity metrics like “most viewed” in favor of, instead, highlighting the “most engaging”.
Or the funniest video based on smiles, and not just because people say it’s funny. We’re certainly pitching these concepts, so some of that could be truly game-changing.
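The “most engaging” ranking floated in this exchange can be sketched in a few lines: instead of sorting videos by view count, sort by an emotion signal such as average smile intensity across viewers. The data format and signal name here are illustrative assumptions, not a real platform API.

```python
# Sketch of emotion-based ranking: instead of sorting videos by view
# count ("most viewed"), sort by an engagement signal such as mean
# smile intensity across viewers ("funniest"). Data is illustrative.

def rank_by_emotion(videos, signal="smile"):
    """videos: list of dicts with per-viewer emotion-signal samples.
    Returns titles sorted by mean signal, most engaging first."""
    def mean_signal(video):
        samples = video["signals"].get(signal, [])
        return sum(samples) / len(samples) if samples else 0.0
    return [v["title"] for v in sorted(videos, key=mean_signal, reverse=True)]
```

Swapping in a different signal (surprise, sadness) would yield a different leaderboard from the same footage, which is the appeal over a single vanity metric.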
What do you envision for the future of this technology?
We really do have a vision that, one day – and that future is not even that far away, within 3 to 5 years or so – we will live in a world where technology is enriched with emotion. It’s what we call emotion AI, that all these advanced systems that surround us are emotion-aware and that perhaps even our devices will have an emotion chip.
So today, all of our devices have a GPS locator, and one day we believe there will be an emotion chip. Anywhere that there’s technology that interacts with humans, this technology will be emotion-aware, from social robotics to healthcare, education to gaming, mobile to automotive – you know, the cars that we drive in or the autonomous cars that we sit in – advertising, marketing, retail. We believe that all will be emotion-aware.
In your experience, has anyone ever expressed apprehensions or privacy-related concerns toward this future?
That’s perfectly legit, right? I’m sure some people will not want it, and in that case, the option needs to be available for them to opt out. When you go to an app, it asks you for your location or to send notifications. Sometimes you say “yes”, and sometimes you say “no”. It’s a very individual choice.
That’s what I, as a consumer, would want to see. I wouldn’t want the camera on my device turning on to start analyzing my face without me allowing my device or app to do that. I think, we as a provider of this technology, need to continually look for the value it adds to people’s lives, and if people perceive that value, they will choose to opt in.
Want to gain firsthand experience in how Affectiva’s emotion recognition technology works? Download their demo app AffdexMe, now available for free in the App Store and Google Play.
Developers interested in adding Affectiva’s technology to their apps or digital experiences can gain access to the Affdex SDK here.
For additional case studies, research and insights related to emotion, we encourage you to visit Affectiva’s website.
The post An Interview With Gabi Zijderveld: Envisioning A Future Of Emotion-Aware Technology appeared first on Wilde Agency.
