Scanning the faces that scan the mobile screens

Emotions expressed by facial expressions can diagnose user reactions to mobile ads.

There they stare, heads bowed, but not in prayer.

This is the profile of your typical smartphone user, surfing the net, looking for the next thing. As they flip from page to page and scroll up and down, they may experience one of six basic emotions: fear, anger, joy, sadness, disgust and surprise.

If the page view sparks the right emotion, then that viewer could be turned into a lead. But which emotion can do that? Can this be done in a loud, distracting environment (like in real life)? And can you score the interaction for ad effectiveness and use it to optimize a campaign?

First, some background

The hypothesis that all humans feel one of six basic emotions was proposed by psychologist Paul Ekman. His work also inspired others working at the intersection of psychology and marketing, looking for ways to measure emotional response so they can sharpen their approach to consumers.

Various businesses have applied machine learning and AI modeling to reading emotions in human facial expressions, each taking a different approach. Some of those approaches were limited by technology: the subject had to sit in front of a desktop PC, either in a lab or at home, so that a digital camera could scan their face and calibrate the images with the software, Max Kalehoff, VP of growth and marketing at Realeyes, told us.

With people using smartphones, staying still long enough to be calibrated was not going to work.

Dig deeper: You smiled, so we think you like this product

Cue the face

Realeyes built its facial recognition app for mobile on previous work. Its AI had been trained on close to one billion frames. Those images were then annotated by psychologists in different countries to account for cultural nuances. The algorithm in turn was trained on these annotations, Kalehoff explained, yielding over 90% accuracy.
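
The article does not detail the annotation pipeline, but the consensus step is easy to picture. Here is a minimal sketch, assuming hypothetical per-frame labels from several annotators, of how multinational annotations might be reduced to a single training label; the function, names and majority rule are illustrative, not Realeyes’ own.

```python
from collections import Counter

# The six basic emotions from Ekman's model, as referenced above.
EMOTIONS = {"fear", "anger", "joy", "sadness", "disgust", "surprise"}

def consensus_label(annotations: list[str]) -> str | None:
    """Reduce several psychologists' annotations for one frame to a
    single training label by majority vote; return None (drop the
    frame) when annotators disagree too much to trust a label."""
    votes = Counter(a for a in annotations if a in EMOTIONS)
    if not votes:
        return None
    label, count = votes.most_common(1)[0]
    # Hypothetical quality bar: require a strict majority of annotators.
    return label if count > len(annotations) / 2 else None

# Example: three annotators in different countries label one frame.
print(consensus_label(["joy", "joy", "surprise"]))    # joy
print(consensus_label(["joy", "anger", "surprise"]))  # None
```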

The potential for Realeyes to work on the mobile platform intersects with the explosion of social media, and in this realm the app is agnostic. It does not matter what the user is looking at — TikTok, YouTube, Facebook, Instagram. The Realeyes app is gauging their reaction.

“To (the best) of our knowledge, this is the first time it’s been done,” Kalehoff said. “We are answering a demand to provide detection of attention to creatives in a mobile environment.”

To put Realeyes on the smartphone, users have to opt in, and are then directed to an environment where they can look at some ads. They are told to scroll through some screens, “doing what they normally do,” Kalehoff said. A small app resides on the phone, helping measure visual attention data and clickstream interaction data. “Our definition (of attention) focuses on a stimulus while ignoring all other stimuli,” he said. “The experience for the participants is under three minutes.”
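
To make that measurement concrete, here is a hedged sketch of how attention time might be tallied under Kalehoff’s definition. The GazeSample fields, the 10 Hz sampling rate and the exclusivity rule are all assumptions for illustration, not Realeyes’ actual schema.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    t_ms: int            # timestamp within the session, in milliseconds
    on_ad: bool          # camera-derived: is visual focus on the ad?
    other_stimuli: bool  # is a competing stimulus also drawing focus?

def attention_seconds(samples: list[GazeSample], hz: int = 10) -> float:
    """Per the definition quoted above (focusing on a stimulus while
    ignoring all others), count only samples where the ad held focus
    exclusively, and convert the sample count to seconds."""
    attending = sum(1 for s in samples if s.on_ad and not s.other_stimuli)
    return attending / hz

# A sub-three-minute session at 10 Hz yields at most ~1,800 samples.
```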

Looking for data in the right places

What Realeyes looks for depends on the media the consumer is viewing. One outcome sought is what they call a “breakthrough.” “Real people try to avoid ads,” Kalehoff noted, so breakthrough occurs when an ad successfully gets someone’s attention despite a naturally distracting environment.

This matters as people “swipe, skip or scroll” past ads to get to content. They will swipe on TikTok, scroll through Facebook or Instagram, or skip on YouTube, Kalehoff observed. Did the ad get through?
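
The article does not define how breakthrough is computed, but one plausible operationalization is a minimum dwell time on the ad before the user dismisses it. The sketch below assumes a simplified clickstream; the event names and one-second threshold are illustrative, not Realeyes figures.

```python
def broke_through(events: list[tuple[int, str]], min_dwell_ms: int = 1000) -> bool:
    """Given a simplified clickstream of (timestamp_ms, event) pairs,
    with events 'gaze_on', 'gaze_off', 'skip', 'scroll' and 'swipe',
    decide whether the ad held visual attention for a minimum dwell
    time before being dismissed."""
    dwell, gaze_start = 0, None
    for t, event in events:
        if event == "gaze_on" and gaze_start is None:
            gaze_start = t
        elif event in ("gaze_off", "skip", "scroll", "swipe"):
            if gaze_start is not None:
                dwell += t - gaze_start
                gaze_start = None
            if event != "gaze_off":  # the ad was dismissed
                break
    return dwell >= min_dwell_ms

# A viewer looks for 1.5 seconds, then swipes past: breakthrough.
print(broke_through([(0, "gaze_on"), (1500, "swipe")]))  # True
```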

Then there is passive viewing, as on Netflix or Hulu, where the consumer is not interacting with the content. Here Realeyes looks for an “in focus reaction.” Is the viewer paying attention to the ad? What are they seeing, second by second, and is that creating a positive or negative impression?
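
A second-by-second trace like that could be reduced to a net impression score in many ways; here is one toy approach that maps Ekman’s six emotions to a crude valence sign. The mapping and scoring are assumptions, not Realeyes’ model.

```python
# Map the six basic emotions to a crude valence sign. This mapping is
# an illustrative assumption, not Realeyes' scoring model.
VALENCE = {"joy": 1, "surprise": 0,
           "fear": -1, "anger": -1, "sadness": -1, "disgust": -1}

def net_impression(per_second_emotions: list[str]) -> float:
    """Reduce a second-by-second trace of detected emotions to one
    impression score in [-1, 1]; positive means a net good reaction."""
    if not per_second_emotions:
        return 0.0
    total = sum(VALENCE.get(e, 0) for e in per_second_emotions)
    return total / len(per_second_emotions)

# A 15-second ad where joy dominates the middle scores positive.
trace = ["surprise"] * 3 + ["joy"] * 9 + ["surprise"] * 3
print(net_impression(trace))  # 0.6
```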

Then there is online shopping, for example on Amazon. Here the visual data is validated with a four-question follow-up, testing for brand recognition, ad recall, trust in the brand and likability of the ad.
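
As a data structure, that follow-up is small. A sketch, assuming yes/no and 1-to-5 formats; the article names the four constructs but not the question formats:

```python
from dataclasses import dataclass

@dataclass
class FollowUpSurvey:
    """The four-question follow-up described above. The field formats
    are assumptions; only the constructs come from the article."""
    brand_recognition: bool  # did the viewer recognize the brand?
    ad_recall: bool          # do they remember seeing the ad?
    brand_trust: int         # e.g. 1 (none) to 5 (high)
    ad_likability: int       # e.g. 1 (disliked) to 5 (liked)
```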

The simplicity of Realeyes’ approach is that scanning for facial expressions works anywhere, with anything. Since two-thirds of digital media spend goes to three or four major platforms, “you only have to go to a few places to get where the attention resides,” Kalehoff said.

Room for improvement

The foundation of Realeyes is the training database that informs the AI of the meaning of a facial expression. Porting the app to the handheld means being able to spot smiles and frowns, then using that information to correct a bad impression or improve on a good one.

Still, Realeyes is aware there is room for improvement. It has had to adjust its face-reading app to work in low-light conditions while remaining accurate, Kalehoff pointed out. The AI has also received additional training to recognize different skin tones, again delivering accurate readouts.
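
The article does not say how the low-light fix works. A standard generic preprocessing step, shown purely as an illustration and not as Realeyes’ method, is to normalize frame brightness before inference, for example with gamma correction:

```python
import numpy as np

def normalize_brightness(frame: np.ndarray, target_mean: float = 0.45) -> np.ndarray:
    """Gamma-correct a grayscale frame (values in [0, 1]) so its mean
    brightness moves toward target_mean, a common low-light
    preprocessing step."""
    eps = 1e-6
    gamma = np.log(target_mean + eps) / np.log(float(frame.mean()) + eps)
    return np.clip(frame ** gamma, 0.0, 1.0)

# A dim frame (mean around 0.1) is lifted toward mid-brightness.
dim = np.random.rand(64, 64) * 0.2
print(round(float(normalize_brightness(dim).mean()), 2))  # roughly 0.45
```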

There are also some upsides. Realeyes can tell if the same face appears more than once. This can be an issue with paid surveys, where a subject may want to participate more than once to score a little extra cash, Kalehoff noted.
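
Repeat-participant detection is commonly done by comparing face embeddings; the article does not specify Realeyes’ method, so the sketch below is a generic illustration with an assumed similarity threshold.

```python
import numpy as np

def is_repeat_participant(new_face: np.ndarray,
                          known_faces: list[np.ndarray],
                          threshold: float = 0.9) -> bool:
    """Flag a participant whose face embedding is near-identical to one
    already seen, using cosine similarity, a common matching technique."""
    for known in known_faces:
        sim = float(np.dot(new_face, known) /
                    (np.linalg.norm(new_face) * np.linalg.norm(known) + 1e-9))
        if sim >= threshold:
            return True
    return False
```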

As for practical application, Realeyes worked with Mars Inc. on a project to boost sales using increased attention metrics. The effort yielded an 18% sales increase across 19 markets, optimizing the ad spend by about $30 million, Kalehoff said. Even a 5% increase in “creative attention” can lead to a 40% increase in brand awareness.

About the author

William Terdoslavich
Contributor
William Terdoslavich is a freelance writer with a long background covering information technology. Prior to writing for MarTech, he also covered digital marketing for DMN. A seasoned generalist, William covered employment in the IT industry for Insights.Dice.com, big data for Information Week, and software-as-a-service for SaaSintheEnterprise.com. He also worked as a features editor for Mobile Computing and Communication, as well as feature section editor for CRN, where he had to deal with 20 to 30 different tech topics over the course of an editorial year. Ironically, it is the human factor that draws William into writing about technology. No matter how much people try to organize and control information, it never quite works out the way they want to.
