Facial recognition technology: the key to consumer emotion detection
Chris Breslin on facial recognition technology, and how market researchers will be able to use it to detect, analyse and report on customer emotions.
It might sound like the premise of a futuristic sci-fi movie, but the ability to accurately detect, analyse and report on emotions from the tiniest facial movement is no fantasy.
One of the hottest trends in market research technology in 2016 is facial and emotion recognition, and it is increasingly requested by organisations on the lookout for the most cutting-edge research tools.
Understanding emotions is hugely powerful in market research, but notoriously difficult to achieve.
Facial expressions are strongly linked to emotions, and research organisations have traditionally used human observation of recorded videos or simply asked survey respondents how they feel to try to assess emotional response.
Human and self-assessment has many limitations, however, and facial expression recognition technology offers an opportunity to overcome some of these limitations, delivering a much greater level of insight about personal sentiment and reactions.
According to research by Dr Paul Ekman – a pioneer in the study of emotions and facial expressions and professor emeritus of psychology in the Department of Psychiatry at the University of California Medical School – brief flashes of emotion displayed on the respondent’s face, or ‘micro-expressions’, reveal a person’s beliefs and their propensity to act or buy.
The scope for this innovative technology goes beyond pure research.
Customer experience leaders have declared 2016 ‘The Year of Emotion’, continuing the trend for market research and voice of the customer to become increasingly complementary disciplines. Emotions drive spending and loyalty. Organisations managing research programs and customer experience activities can use emotion detection technology to analyse people’s emotional reactions at the point of experience.
This knowledge not only gives researchers a greater understanding of behaviour patterns but also helps predict likely future actions of that consumer.
The result is an unprecedented level of insight into what impacts customer emotions – valuable information that can drive better business decisions to improve product and service offerings and experiences.
Facing new challenges
Market researchers are under increasing pressure to deliver real business value to their customers at a time when survey response rates are declining, limiting the collection of data from specific demographic groups.
The ongoing challenge is to find new ways to complement panels, focus groups and surveys, and emotion detection is providing real opportunities. The primary use case for researchers implementing emotion detection is ad testing.
Traditionally, respondents will answer questions about the advertisement they’ve been shown, rating it on various scales.
While broadly effective, this approach depends on the respondent's ability to recall what they've just been shown, their interpretation of their own emotions, and their ability to put those emotions into words.
With this view-then-report approach, some fleeting emotions may not even be recognised by respondents who are more likely to remember how they felt at more memorable points in the advert and at the end.
Researchers can also observe and record emotions while the video content is being shown, but this needs specific skills and is difficult to perform consistently with different observers interpreting emotions differently.
Technology that monitors facial expressions throughout the viewing stage bypasses these issues by capturing data as the respondent views a video, enabling advertisers to understand how the tiniest elements of their video may impact audience response.
While there are several different facial recognition technologies currently on the market, and more in development, they share a similar methodology: each captures video of a respondent's reaction to a stimulus and analyses their facial movements for correspondence to emotions.
Typically, the human-observed coding system these tools automate is FACS – the Facial Action Coding System.
This facial imaging uses machine-learning algorithms to build a huge reference database of expressions against which to judge the face being viewed.
It’s not dissimilar to how text analytics systems use a large quantity of relevant text in order to ‘learn’ how to categorise particular words, phrases and verbal expressions.
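To make the FACS idea concrete, here is a minimal sketch of how coded facial muscle movements ('action units', or AUs) can be mapped to basic emotions. The AU combinations shown (such as AU6 'cheek raiser' plus AU12 'lip corner puller' for happiness) are commonly cited simplifications; real systems score dozens of AUs per frame with machine-learned classifiers, so treat this lookup table as illustrative rather than authoritative.

```python
# Illustrative sketch only: a tiny lookup from FACS action-unit (AU)
# combinations to basic emotions. The mappings are simplified examples,
# not a complete coding table.
EMOTION_BY_AUS = {
    frozenset({6, 12}): "happiness",      # cheek raiser + lip corner puller
    frozenset({1, 2, 5, 26}): "surprise", # brow raisers + upper lid raiser + jaw drop
    frozenset({1, 4, 15}): "sadness",
    frozenset({4, 5, 7, 23}): "anger",
}

def classify_frame(detected_aus):
    """Return the emotion whose full AU pattern appears in the detected AUs."""
    detected = set(detected_aus)
    best, best_overlap = "neutral", 0
    for aus, emotion in EMOTION_BY_AUS.items():
        overlap = len(aus & detected)
        # Require the whole pattern to be present before labelling a frame
        if overlap > best_overlap and aus <= detected:
            best, best_overlap = emotion, overlap
    return best

print(classify_frame([6, 12]))        # happiness
print(classify_frame([1, 2, 5, 26]))  # surprise
```

In practice the per-frame classifier is statistical rather than a hard lookup, but the principle – coded muscle movements resolved to a small set of named emotions – is the same.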
A camera focused on the respondent – either in a formal panel scenario or via their own mobile devices or computer webcams – then captures their changing expressions as they watch a video.
The facial expressions captured are analysed and compared to a standardised set of emotional responses – such as anger, joy and surprise – with an aggregated result that represents the overriding split of emotions at key points in the study.
Researchers can then compare the aggregate emotional performance of their video clip against a benchmark.
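The aggregation step described above can be sketched in a few lines: per-moment emotion labels from each respondent are pooled into a split of emotions at each point in the clip, then compared against a benchmark. All names, data and the benchmark value here are invented for illustration.

```python
from collections import Counter

# Hypothetical per-second emotion labels for three respondents
# watching a five-second clip (invented data for illustration).
responses = {
    "r1": ["neutral", "surprise", "joy", "joy", "joy"],
    "r2": ["neutral", "neutral", "joy", "joy", "neutral"],
    "r3": ["neutral", "surprise", "surprise", "joy", "joy"],
}

def emotion_split(responses, second):
    """Share of each emotion across respondents at one point in the clip."""
    counts = Counter(labels[second] for labels in responses.values())
    total = sum(counts.values())
    return {emotion: n / total for emotion, n in counts.items()}

# Compare the share of respondents showing 'joy' at each second
# against a made-up category benchmark.
BENCHMARK_JOY = 0.40
for t in range(5):
    joy = emotion_split(responses, t).get("joy", 0.0)
    flag = "above" if joy > BENCHMARK_JOY else "at/below"
    print(f"t={t}s joy={joy:.0%} ({flag} benchmark)")
```

A real platform does this over thousands of frames and respondents, but the output is the same shape: an aggregate emotional split at key points, judged against norms for the category.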
Dr Ekman’s research found that the expressions and micro-expressions relating to specific emotional responses are universal, so using technology to capture those facial movements and analyse them against benchmark data is hugely powerful.
Some tests have reported accuracy rates of around 95%, which, by any measure, is impressive.
A number of leading firms are already using the technology to refine expensive advertising campaigns according to respondents’ reactions to preview adverts.
This has empowered marketing teams to not only create the most effective advertising possible, but to also localise content for global campaigns by determining the precise elements that might resonate or alienate a particular market.
We might all be part of one great, global economy now but there are still many cultural differences that separate us.
People from different nationalities and cultures have different levels of emotional response, and different facial structures, so benchmark data needs to take this into account.
To do this, technology providers work with respondents in key regions so they have experience in capturing the micro-expressions and indicators that determine the emotion.
Hooked on a feeling
Like many ‘next big thing’ innovations, emotion detection software adds to the toolkit available to the experienced market researcher.
It may further reduce the need for focus groups but, beyond that, it’s an addition, not a replacement.
Such videos will, in most cases, be embedded in our old friend, the survey, and additional information will be required to understand more about the respondents themselves.
As with most advances of the last decade – mobile, social analytics, text analytics, beacon technologies, and more – emotion detection will find its place and help forward-thinking researchers to continue to add value to the services they provide to their customers.
Chris Breslin is country manager at Confirmit