Emotion AI: changing the face of UX? (Part 1)

User Experience & Usability

The changing face of emotions: a woman expressing different emotions

There are times when we all want to get inside someone’s head, to know exactly what they’re thinking and feeling. If you work in research in particular, there will always be one participant who you suspect isn’t being completely honest about what they think of a product.

Research is just one area where uncovering true emotion holds inherent value; more and more companies are becoming aware of the power emotion holds over consumer behaviour, and they want to tap into it.

In line with advances in artificial intelligence, the emotion detection market is forecast to grow by 27.5% over the next year. It is already being explored by brands such as Kellogg’s and Disney, as well as automotive manufacturers, to measure reactions and subsequently increase positive feelings towards brands or products. Emotion artificial intelligence (AI) also presents an opportunity for companies to tailor digital products and services by making real-time changes to personalise content.


A basic definition

So, what is emotion? An emotion is a strong, instinctive feeling directed towards a specific trigger and characterised by physiological and behavioural changes in the body. As emotions are internalised, they contribute to conscious feelings and to longer-lasting moods, which are not tied to a single event or object.

Displaying emotion allows other people to understand us. Although opinions differ, it is generally accepted that there are six to eight basic emotions. One of the best-known models is Plutchik’s wheel of emotion, which maps joy, trust, fear, surprise, sadness, disgust, anger and anticipation. All of these except trust and anticipation are also acknowledged in the facial action coding system (FACS), developed by Paul Ekman and Wallace Friesen in 1978 to measure emotion through facial expression. This assumes that the emotions are universal across cultures and can therefore be reliably identified using emotion detection technology.
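To make the FACS idea concrete: FACS itself codes facial muscle movements as numbered action units (AUs), and emotions are then read off as characteristic AU combinations. The minimal Python sketch below pairs basic emotions with frequently cited AU combinations; the exact AU sets are illustrative rather than definitive, and the names (ACTION_UNITS, EMOTION_AUS, describe) are ours, not part of any FACS standard.

```python
# Illustrative mapping of basic emotions to FACS action units (AUs).
# The AU combinations follow commonly cited mappings and are a sketch;
# real FACS coding also records AU intensity and many more AUs.

ACTION_UNITS = {
    1: "inner brow raiser",
    2: "outer brow raiser",
    4: "brow lowerer",
    5: "upper lid raiser",
    6: "cheek raiser",
    7: "lid tightener",
    9: "nose wrinkler",
    12: "lip corner puller",
    15: "lip corner depressor",
    20: "lip stretcher",
    23: "lip tightener",
    26: "jaw drop",
}

EMOTION_AUS = {
    "happiness": [6, 12],
    "sadness": [1, 4, 15],
    "surprise": [1, 2, 5, 26],
    "fear": [1, 2, 4, 5, 7, 20, 26],
    "anger": [4, 5, 7, 23],
    "disgust": [9, 15],
}

def describe(emotion: str) -> str:
    """Spell out the facial movements coded for a given emotion."""
    aus = EMOTION_AUS[emotion]
    return " + ".join(f"AU{n} ({ACTION_UNITS[n]})" for n in aus)

if __name__ == "__main__":
    print(describe("happiness"))  # AU6 (cheek raiser) + AU12 (lip corner puller)
```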

Plutchik’s wheel of emotion

Plutchik suggested that:

  • each primary emotion has a polar opposite, as shown on the wheel (e.g. anger vs. fear)
  • emotions beyond the basic eight can be experienced, but these are derived as combinations of the basic ones (e.g. fear + surprise = awe)
  • emotions can be experienced at varying strengths, with those shown at the centre of the wheel having the highest intensity.
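Plutchik’s model lends itself to a small data structure. Here is a minimal sketch in Python; the names (OPPOSITES, DYADS, INTENSITIES, combine) are ours, and the dyad and intensity lists are illustrative rather than exhaustive.

```python
# A minimal sketch of Plutchik's wheel as a data structure.
# Pairings follow the wheel described above; the dyad and
# intensity lists are illustrative, not exhaustive.

OPPOSITES = {
    "joy": "sadness",
    "trust": "disgust",
    "fear": "anger",
    "surprise": "anticipation",
}
# Opposition is symmetric, so mirror the mapping.
OPPOSITES.update({v: k for k, v in OPPOSITES.items()})

# Dyads: named combinations of two basic emotions.
DYADS = {
    frozenset(["fear", "surprise"]): "awe",
    frozenset(["joy", "trust"]): "love",
    frozenset(["anger", "disgust"]): "contempt",
}

# Each basic emotion at three intensities, mildest first
# (the most intense form sits at the centre of the wheel).
INTENSITIES = {
    "fear": ["apprehension", "fear", "terror"],
    "anger": ["annoyance", "anger", "rage"],
    "joy": ["serenity", "joy", "ecstasy"],
}

def combine(a: str, b: str) -> str | None:
    """Return the named dyad for two basic emotions, if one is defined."""
    return DYADS.get(frozenset([a, b]))

if __name__ == "__main__":
    print(OPPOSITES["anger"])           # fear
    print(combine("fear", "surprise"))  # awe
    print(INTENSITIES["fear"][-1])      # terror
```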

Overview of emotion AI

Emotion AI describes an area of artificial intelligence that can recognise, decode, replicate and respond to human emotion. Current detection systems can classify emotion through facial expressions, skin hue, biometrics and voice patterns. These systems rely on large datasets to train deep learning models that detect reference points for different emotions, and they are continuously improving.
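As a rough illustration of the facial-expression route, here is a minimal inference sketch in Python using PyTorch. It assumes a hypothetical pre-trained network saved as emotion_cnn.pt that emits one logit per label; the seven-label set and 48×48 greyscale preprocessing echo common research datasets such as FER-2013, and a real pipeline would also need face detection and alignment first.

```python
# A minimal sketch of facial-expression classification with a CNN.
# "emotion_cnn.pt" and the label set are assumptions for illustration;
# production systems differ in architecture and preprocessing.

import torch
import torch.nn.functional as F
from torchvision import transforms
from PIL import Image

LABELS = ["anger", "disgust", "fear", "joy", "sadness", "surprise", "neutral"]

# Typical preprocessing: greyscale, fixed size, normalised tensor.
preprocess = transforms.Compose([
    transforms.Grayscale(),
    transforms.Resize((48, 48)),
    transforms.ToTensor(),
])

def classify(image_path: str, model: torch.nn.Module) -> dict[str, float]:
    """Return a probability per emotion label for one face image."""
    x = preprocess(Image.open(image_path)).unsqueeze(0)  # add batch dim
    with torch.no_grad():
        probs = F.softmax(model(x), dim=1).squeeze(0)
    return {label: float(p) for label, p in zip(LABELS, probs)}

# Usage (assumes a full model object was saved, one logit per label):
# model = torch.load("emotion_cnn.pt")
# model.eval()
# print(classify("face.jpg", model))
```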

Over the last decade, several companies have been founded with the aim of applying emotion AI to benefit both businesses and users directly. Whereas some suggest that the technology can simply improve wellbeing by making users aware of when they are in a negative state, others suggest that it can be used to trigger targeted responses. These responses could range from changing the environment to adjusting customer service approaches and automating safety-critical interactions. Let’s take a look at a couple of examples.

  • In the past few years, several automotive manufacturers have announced work on mood detection systems that use cameras and biometrics to monitor the state of vehicle occupants. Companies including Jaguar Land Rover and Kia plan to alter cabin conditions through media, ambient lighting and ventilation systems to relieve feelings such as stress and tiredness.
  • Chatbots powered by emotion AI may soon be available as a treatment option for people struggling with mental health problems. A virtual therapist called ‘Ellie’ has already been developed to respond to verbal and facial cues with sympathetic gestures, and patients have proved willing to give more information to the bot than to a human therapist.

Look out for part 2 of this article, where we will discuss the current benefits of and barriers to emotion AI, and how these might shape the future of the technology…

Get help understanding your users’ emotional needs

Contact our experienced consultants to discuss your challenges
