Margarita is a machine learning and signal processing expert. She joined Deloitte from Toshiba Research Europe, where she continued her research (while maintaining an Honorary Visiting Researcher position at Imperial College London) and managed teams that turned machine learning insights into products. Margarita holds a PhD in artificial intelligence and has extensive experience in developing predictive models, discovering hidden patterns, and analysing customer behaviour.
By nature, human beings are not exclusively thinking beings; they are also emotional. Emotions drive human behaviour. During an interaction, another human can (relatively) easily understand how a person feels and adapt to their needs. Devices can be taught to do the same, and this is what emotion recognition is about: building systems that can understand how humans feel.
Customer behaviour is greatly affected by emotion. Understanding customer emotions has always enabled organisations to deliver empathy, reduce pressure, and provide more memorable experiences at crucial points in the customer decision-making process. This improves customer experience and builds trust and loyalty. Regarding trust, emotionally better-off customers are satisfied customers who will recommend your firm's services. As for loyalty, research from Harvard Business School has demonstrated that increasing customer loyalty by as little as 5% can increase profits by at least 25%.
Every customer matters, but you do not always have each customer in front of you to apply your human intelligence to analyse their behaviour and understand how they feel. Especially now, with the pandemic, an increasing number of interactions take place between customers and their devices. Still, you can use Emotional AI to transform your customer experience. To do that, a device relies on surrogates to recognise emotion, such as facial expressions, body language, gestures, sensor readings, text (with emojis for the newer generation ;)), or voice, to detect your customer's emotional state.
On the positive side, a device can recognise emotions in real time with no overhead for your customer: since the customer has to interact with your system anyway, no extra effort is required of them.
Let's start with an example that uses only the basic interface of a banking customer-account app on the phone. Even in such a simple scenario, you can still use the sensors already embedded in the phone to exploit Emotional AI and improve customer experience.
For example, you can use the haptic interface of the phone. If your customer presses the screen an unusually large number of times per minute and applies too much pressure, this could be an indication that something is not going well, and you should look into the customer journey, the app layout, or the current functionality to improve the experience.
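To make this concrete, here is a minimal sketch of such a heuristic in Python. The event structure, thresholds, and pressure scale are all hypothetical illustrations, not a real mobile SDK:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    timestamp: float  # seconds since session start
    pressure: float   # normalised 0.0-1.0, as many touch APIs report it

def looks_frustrated(events, taps_per_min_limit=80, pressure_limit=0.7):
    """Flag a session window whose tap rate and average pressure both
    exceed hypothetical baseline thresholds."""
    if len(events) < 2:
        return False
    duration_min = (events[-1].timestamp - events[0].timestamp) / 60
    if duration_min <= 0:
        return False
    tap_rate = len(events) / duration_min
    avg_pressure = sum(e.pressure for e in events) / len(events)
    return tap_rate > taps_per_min_limit and avg_pressure > pressure_limit

# Example: 30 hard taps within roughly 15 seconds of a session
window = [TouchEvent(timestamp=i * 0.5, pressure=0.85) for i in range(30)]
print(looks_frustrated(window))  # True -> review this screen's journey
```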
You can also take advantage of other built-in sensors, such as the gyroscope, which is normally used to determine which way the phone is orientated. If the client is shaking their phone while using your banking app, something in the app has probably upset them. Maybe it is time to check whether the app has become unresponsive, or to prompt your customer to raise a ticket capturing the details of this poor experience.
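A shake shows up in the gyroscope stream as a burst of large angular-velocity readings. Here is a rough sketch of that idea, with entirely hypothetical thresholds:

```python
import math

def shake_detected(gyro_samples, spike_rad_s=5.0, min_spikes=6):
    """Count samples whose angular-velocity magnitude (rad/s) spikes
    above a hypothetical shake threshold; a burst of spikes ~ a shake."""
    spikes = sum(
        1 for (x, y, z) in gyro_samples
        if math.sqrt(x * x + y * y + z * z) > spike_rad_s
    )
    return spikes >= min_spikes

# A calm session barely rotates; a shaken phone swings back and forth.
calm = [(0.1, 0.05, 0.0)] * 50
shaken = calm + [(8.0, -7.5, 1.0), (-7.8, 8.2, -0.5)] * 4
print(shake_detected(calm), shake_detected(shaken))  # False True
```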
There is also the accelerometer, which detects acceleration, vibration, and tilt. It is normally used to determine how fast the phone is moving in any linear direction. In an Emotional AI context, it can be used to characterise movement: if there is minimal abrupt movement, the client is probably calm and having no issues with your banking app. This could be a signal that now is a good opportunity to advertise your new product to a content user.
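One simple way to quantify "minimal abrupt movement" is the variance of the acceleration magnitude over a window. The sketch below illustrates the idea; the threshold is a hypothetical placeholder that would need tuning on real data:

```python
import statistics

def seems_calm(accel_samples, variance_limit=0.05):
    """Low variance in acceleration magnitude (m/s^2, gravity included)
    suggests the phone is being held steadily -- a hypothetical proxy
    for a calm, unhurried session."""
    magnitudes = [(x * x + y * y + z * z) ** 0.5 for (x, y, z) in accel_samples]
    return statistics.pvariance(magnitudes) < variance_limit

steady = [(0.0, 0.1, 9.81)] * 40                              # phone resting in hand
jittery = [(0.0, 0.1, 9.81 + (i % 2) * 4.0) for i in range(40)]  # abrupt jolts
print(seems_calm(steady), seems_calm(jittery))  # True False
```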
And all of this uses only the sensors already available in every smartphone. You can additionally exploit eye tracking, since modern smartphones can accurately track the customer's gaze point using the front-facing camera. You can see where the customer's attention is going, and if they start rolling their eyes, well, that is not good!
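As an illustration of what you can do with a gaze stream, the sketch below measures how long the customer dwells on each screen region. The region layout and coordinates are invented for the example:

```python
from collections import Counter

# Hypothetical screen regions in normalised (0-1) coordinates:
# (name, x_min, y_min, x_max, y_max)
REGIONS = [
    ("promo_banner", 0.0, 0.0, 1.0, 0.2),
    ("balance_card", 0.0, 0.2, 1.0, 0.5),
    ("action_buttons", 0.0, 0.5, 1.0, 1.0),
]

def dwell_share(gaze_points):
    """Fraction of gaze samples landing in each region -- a rough
    measure of where the customer's attention is going."""
    hits = Counter()
    for (gx, gy) in gaze_points:
        for name, x0, y0, x1, y1 in REGIONS:
            if x0 <= gx < x1 and y0 <= gy < y1:
                hits[name] += 1
                break
    total = max(len(gaze_points), 1)
    return {name: count / total for name, count in hits.items()}

# Most samples on the balance card, a few glances at the promo banner
samples = [(0.5, 0.35)] * 8 + [(0.5, 0.1)] * 2
print(dwell_share(samples))  # {'balance_card': 0.8, 'promo_banner': 0.2}
```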
You can also utilise Emotional AI to evaluate the impact of a new product or campaign, or even to benchmark against competitors. Social listening can help you monitor the web for references to your firm or your competitors; most people write their opinions online, on Twitter, Instagram, Facebook, blogs, or forums. Additionally, sentiment analysis, an area of natural language processing, can help you understand your customers' feelings. If they keep complaining about your call centre or your credit card issuing procedures, maybe it is a good idea to look into it.
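To give a flavour of sentiment analysis, here is a toy lexicon-based scorer. A production system would use a trained NLP model rather than word counting, and the word lists here are purely illustrative:

```python
# Tiny hypothetical polarity lexicon; real deployments would use a
# trained NLP model instead of keyword counting.
POSITIVE = {"great", "love", "helpful", "fast", "easy"}
NEGATIVE = {"complaint", "slow", "broken", "waiting", "useless"}

def sentiment_score(text):
    """Return a score in [-1, 1]: negative means unhappy mentions."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

mentions = [
    "Still waiting on hold with the call centre, useless!",
    "The new card arrived fast and the app is easy to use.",
]
for m in mentions:
    print(round(sentiment_score(m), 2), m)
# -1.0 for the call-centre complaint, 1.0 for the happy customer
```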
A customer's voice also conveys micro-messages that Emotional AI can use to customise the interaction. For example, Emotional AI can tell the customer's device not to ask the same question twice (like "Which card would you like to pay with?") if the customer sounds angry, or it can offer marketing material (like "Those shoes would be a good match for that shirt you just bought") to customers who sound content. Or you can use Emotional AI to predict that the interaction is failing and hand it over to a human agent before the customer terminates the interaction in frustration.
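The routing decision itself can be simple once an upstream voice-emotion classifier (not shown here, and assumed for this sketch) provides label probabilities:

```python
def route_interaction(emotion_probs, anger_cutoff=0.6, content_cutoff=0.7):
    """Decide the next step from hypothetical voice-emotion probabilities
    produced by an upstream classifier (not shown here)."""
    if emotion_probs.get("angry", 0.0) >= anger_cutoff:
        return "handover_to_human_agent"   # rescue a failing interaction
    if emotion_probs.get("content", 0.0) >= content_cutoff:
        return "offer_marketing_material"  # a receptive moment
    return "continue_automated_flow"

print(route_interaction({"angry": 0.72, "content": 0.10}))  # handover_to_human_agent
print(route_interaction({"angry": 0.05, "content": 0.85}))  # offer_marketing_material
```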
The Association of British Insurers states that detected fraud costs over £1 billion each year. Fraud detection is not easy, but maybe you can use the extra, hidden information that a customer provides when they make a claim online. For example, if they use a phone, you could use the accelerometer and the gyroscope to check for any fidgeting, which could be an indication that the customer feels uncomfortable. Wearable sensors in smartwatches can provide even more information, such as heart rate or skin conductance (how much you sweat).
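One hypothetical way to turn these signals into a "fidget score" is to combine motion variability with heart-rate elevation, as sketched below. A score like this could at most flag a claim for human review; it is not a fraud verdict:

```python
import statistics

def fidget_score(accel_magnitudes, heart_rate_bpm, resting_bpm=65):
    """Combine motion variability with heart-rate elevation into a
    rough discomfort score (hypothetical weighting; at most a prompt
    for human review, never a fraud decision on its own)."""
    motion = statistics.pstdev(accel_magnitudes)
    hr_elevation = max(heart_rate_bpm - resting_bpm, 0) / resting_bpm
    return motion + hr_elevation

calm_claim = fidget_score([9.81] * 30, heart_rate_bpm=66)
nervous_claim = fidget_score([9.0, 11.2, 9.5, 12.0, 8.7] * 6, heart_rate_bpm=95)
print(round(calm_claim, 2), round(nervous_claim, 2))  # 0.02 vs roughly 1.75
```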
Facial emotion recognition in phone apps is already possible: phone cameras provide good-quality images that sophisticated programs can analyse on the device. How can you use this technology in banking, for example? Let's say a customer already holds their cards with you, and you would like to advertise other products, such as your insurance or your mortgages. You can show them an ad and understand from their face, specifically from their micro-expressions, where they stand, even subconsciously. Would they be open to a new mortgage with your bank, or do they find the idea intimidating?
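Assuming an on-device facial-expression model that emits per-frame labels while the ad plays (the model itself is out of scope here), aggregating those labels into a simple verdict might look like this:

```python
from collections import Counter

def ad_reaction(frame_labels):
    """Aggregate hypothetical per-frame expression labels (from an
    on-device facial analysis model, not shown) into a simple verdict."""
    counts = Counter(frame_labels)
    total = len(frame_labels)
    if counts["fear"] + counts["confusion"] > total * 0.3:
        return "intimidated -- simplify the mortgage pitch"
    if counts["happy"] + counts["interested"] > total * 0.5:
        return "receptive -- follow up with an offer"
    return "neutral -- no action"

frames = ["neutral"] * 4 + ["interested"] * 5 + ["happy"] * 3
print(ad_reaction(frames))  # receptive -- follow up with an offer
```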