Amir Liberman, CEO, Nemesysco talks about the role that AI and Machine Learning play in improving Emotion Intelligence to understand non-verbal signals
1. Can you please describe Emotion Intelligence and explain why it is important today?
We understand Emotion Intelligence as the ability to understand the current emotional experience of another person and to dynamically tune our reactions to that person based on this understanding.
As humans, we express ourselves with loads of nonverbal signals as part of our communication with others. Understanding these nonverbal cues is essential for understanding the real intentions and current emotional state of a person.
As an example, think about how a young child might ask his mother, who is sick, how she feels. Naturally, the mother will likely say that she feels alright. However, her body's physical reactions and her facial expressions tell a much different story. A child who is emotionally in tune with his mother will understand that she is really not feeling well despite what she actually said. On the other hand, a child who lacks emotional intelligence will assume that his mother is okay since she verbally expressed that she feels fine.
Since much of our work is now done remotely and our communication channels have been reduced, the nonverbal cues available to us are largely limited to what can be sensed in the voice, such as intonation.
2. How does Emotion Intelligence compare to other existing technological approaches currently in use?
Sentiment Analysis can probably be considered one of the more commonly used technologies, especially in call centers, to try to understand insights about customers.
Sentiment Analysis is a term usually used to describe the evaluation of openly expressed emotions. Take the example of a customer who calls a service center and in an angry voice demands of the service agent, “I want to speak to your manager now!” Sentiment Analysis will only look at the broadcast message and assume that the customer is indeed angry, because this is what he outwardly stated.
However, maybe the customer was just toying with the service agent to entertain himself. At the same time, in response to the customer who pretended to be angry, the call center agent may politely say “thank you for your time” while in reality feeling frustration or anger. Genuine artificial Emotion Intelligence seeks to understand these underlying intentions in order to provide much wider insight and a deeper understanding.
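The contrast above can be sketched in code. This is a minimal, illustrative toy, not Nemesysco's method: the `vocal_arousal` score is a hypothetical input (0.0 = calm voice, 1.0 = highly stressed), standing in for the kind of nonverbal signal a real acoustic analysis would produce.

```python
# Toy contrast between sentiment analysis (words only) and an AI+EI-style
# check that also weighs a nonverbal cue. All word lists, thresholds and
# labels are invented for illustration.

NEGATIVE_WORDS = {"angry", "manager", "unacceptable", "now"}

def sentiment_only(transcript: str) -> str:
    """Classify emotion from the spoken words alone."""
    words = set(transcript.lower().replace("!", "").split())
    return "angry" if words & NEGATIVE_WORDS else "neutral"

def sentiment_plus_ei(transcript: str, vocal_arousal: float) -> str:
    """Cross-check the verbal message against a nonverbal signal."""
    expressed = sentiment_only(transcript)
    if expressed == "angry" and vocal_arousal < 0.3:
        return "possibly joking"      # angry words, relaxed voice
    if expressed == "neutral" and vocal_arousal > 0.7:
        return "masked frustration"   # polite words, stressed voice
    return expressed

print(sentiment_only("I want to speak to your manager now!"))          # angry
print(sentiment_plus_ei("I want to speak to your manager now!", 0.1))  # possibly joking
print(sentiment_plus_ei("thank you for your time", 0.9))               # masked frustration
```

The point of the sketch is the second function's cross-check: when the verbal channel and the nonverbal channel disagree, the combined system reaches a different conclusion than words-only sentiment analysis would.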
3. Artificial Intelligence and Machine Learning are now being used widely in many fields. How can Emotion Intelligence improve the applications of AI?
Just like our example of the child and his sick mother, proper emotional insight can improve decision making and create far better next-best-action models that will improve interactions with a specific person that is experiencing a specific mood, rather than treating everyone with a general approach. Taking this one step further, using EI – Emotion Intelligence – as part of AI will enable unique responses that do not fit any immediate logical criteria.
The idea is that AI can automate many operational processes to save costs and improve output. AI can even be used in certain customer facing situations. However, it must be remembered that human behavior is not always logical and consequently when humans interact with purely logical systems, the results may be difficult to predict.
To fill this need, AI-driven customer service systems will benefit from adding EI: by judging customer sentiment and assessing each customer's basic personality style, they can deliver an automated, cost-effective experience that is also personal, intelligent and client-centric.
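The next-best-action idea described above can be sketched as a simple lookup keyed on inferred mood and personality style. The moods, styles and actions here are hypothetical placeholders, not a real taxonomy; the point is only that the same request yields a different response per profile instead of one general approach.

```python
# Hypothetical next-best-action table: the response to the same customer
# request varies with the inferred emotional state and personality style.
# All labels and actions are invented for illustration.

NEXT_BEST_ACTION = {
    ("frustrated", "direct"):     "escalate to a senior agent immediately",
    ("frustrated", "thoughtful"): "apologize and walk through the issue step by step",
    ("calm", "direct"):           "offer the quickest self-service option",
    ("calm", "thoughtful"):       "explain the available options in detail",
}

def next_best_action(mood: str, style: str) -> str:
    # Fall back to a generic action when the profile is unrecognized.
    return NEXT_BEST_ACTION.get((mood, style), "route to a human agent")

print(next_best_action("frustrated", "direct"))
print(next_best_action("unknown", "unknown"))
```

A production system would of course infer `mood` and `style` from signals rather than receive them as labels, but the table captures the shift from one-size-fits-all logic to profile-aware responses.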
4. Would it be correct to say that Artificial Intelligence and Emotion Intelligence contradict each other?
Actually, the contrary is true, and it would be correct to say that AI and EI complement each other. Intelligent humans combine logic with emotion to reach sound decisions and understandings. This should of course also apply to machines that try to motivate humans and influence our behavior.
5. What are the advantages of using Artificial Intelligence together with Emotion Intelligence?
When you look at different market segments, different benefits stand out. For instance, a sales-support system that understands customer motivation and individual personality and emotional styles will produce much better results than a uniform system that treats everyone the same. For mental health and wellness applications, the ability to factor in the mood and overall emotional state of a patient makes it possible to better predict a successful course of treatment. These are of course just two examples. The point is that the ability to add humanlike understanding that goes beyond the spoken words or expressed tonality opens new possibilities to create a relationship between humans and machines that was never possible before.
6. Can you give some practical examples of use cases for your AI+EI concept?
Let’s look at the robotics space. A robot for assisting senior citizens or monitoring children will perform better if it can sense the true emotions and understand the unspoken needs or frustration of the people it is designed to monitor. Another example could be a smart car that can detect that its driver is having a bad day and is not thinking straight, so it will gently override risky driving decisions. Even a mobile phone can learn your emotional preferences, like which contacts you prefer to avoid and in which situations, and make suggestions for you on the fly. With AI+EI models working together, building a rapport with your brand-new device becomes a reality.
7. What advice would you give enterprises considering adopting this AI+EI concept?
AI-driven processes can certainly reduce operational and labor costs. However, the dynamic needs of the humans involved must always be kept in mind.
My advice is to consider the level of interaction and types of interactions that the AI processes will have with the humans around them along with the level of influence the AI is intended to have on these humans.
Consider carefully the balance needed between methodical, fixed processes and the ways in which different mindsets and emotional states can influence these procedures.
Your employees and your customers are all human. Each has good and bad days, unique likes and preferences, and specific situations that they dislike or that stress them out. Some are more aggressive than others, some more thoughtful, and some more careful. Systems that ignore these differences will be far less productive and pleasant than those that take them into consideration. Systems without EI will have no place in the world of tomorrow.
For more such updates and perspectives around Digital Innovation, IoT, Data Infrastructure, AI & Cybersecurity, go to AI-Techpark.com.
Amir Liberman is a leading researcher in the field of voice and emotion analysis. He is the CEO of Nemesysco, an innovative provider of voice analytics technologies and solutions for genuine emotion detection, and is the driving force behind expanding the applications of the company’s unique Layer Voice Analysis (LVA™) technology.