Rana Gujral, CEO of emotion AI company Behavioral Signals, on opportunities for businesses using AI-mediated conversations aligned with customer profiles
1. Tell us how you came to be the CEO at Behavioral Signals. How much of your typical day involves innovating AI tech for your customers?
I became the CEO at Behavioral Signals in December 2018. Over the years, I’ve been fortunate to be part of some incredible, dynamic, and successful teams and ventures. That said, I’m most passionate about working on revolutionary, innovative technology opportunities that we can build a successful business outcome around. At Behavioral Signals, we have a terrific opportunity to impact the Natural Language Processing (NLP) landscape by bringing to the equation our specialized ability to unravel insights from acoustic signals. I spend a sizable portion of my time working with our teams to find real-life implementation opportunities for the disruptive capabilities we’ve put together in this space. That is the most fun part of my job.
2. From technology for analyzing chemicals at your startup TiZE to AI for analyzing emotions at Behavioral Signals, what are the similarities and differences between your earlier journey as a tech-preneur and your current role as a tech leader?
There’s a saying: “Once an entrepreneur, always an entrepreneur.” I believe this is true, since being an entrepreneur is not a job title, a career choice, or a buzzword; it’s a personality trait. Above all, it’s a mindset. You don’t have to be an entrepreneur to be a startup founder, and a startup founder may not be entrepreneurial. So, for me, there are far more similarities than differences between these two journeys, especially in terms of how we execute. Our focus areas are, of course, very different. At TiZE we were determined to disrupt a massive industry’s over-reliance on antiquated tools. At Behavioral Signals, we’re putting together novel, never-before-seen technologies to solve very complex business challenges.
3. What are some of the industry sectors that Behavioral Signals caters to?
Behavioral Signals works mostly with the financial sector, integrating predictive analytics based on emotion and behavioral AI analysis of speech. While our technology offering is focused on financial services institutions (FSI) and banks, we also work with other industries such as healthcare, collection agencies, contact centers, BPOs, and even robotics. The key value we add is complex emotional and behavioral information that helps our clients have better conversations with their customers. We’re able to analyze huge volumes of conversational data and provide insights that help our clients reach their desired outcomes. We gather insights before, during, and after the conversation, and our advanced use cases often involve financial negotiations for loans, credit checks, payment requests, and so on.
4. Behavioral Signals was recently in the news for helping a European national bank restructure loans. Can you elaborate on the challenges they were facing that your product resolved?
Great question! What we did is help a bank collect more money from customers who had fallen behind on their loans. More specifically, non-performing loans, or NPLs, meaning the customer was unable to keep up with their payments and, as expected, had interest piling up on their debt. These are often described as loans “in the red,” and banks struggle to convince their customers to pay them off.
The usual method is that a random employee from the bank’s contact center calls up the debtor and tries to negotiate a debt restructuring scheme, which the debtor will or won’t accept. This is where we step in. We proved that if you pair the right employee with the right customer (in this case, the debtor), you get a much better result toward the desired outcome. Here, we achieved a 20.1% increase in submitted applications for debt restructuring, just by using our AI technology. We analyzed historical and current data to create behavioral profiles of both the customers (the debtors) and the bank employees, and paired them in such a way that we affected their conversations positively.
In the end, it’s not about finding the best employee to negotiate; it’s about how well we get along with certain people. With those we ‘click’ with, we are usually more relaxed, more friendly, more open to negotiating, and we often reach a common understanding much more easily. So, not only did we improve financial outcomes, we did so with 7.6% fewer calls, surveyed customers said they were more satisfied with the bank, and the bank’s employees felt happier because their conversations were less stressful.
Tough conversations, like negotiating a non-performing loan, will never be easy, but that doesn’t mean we can’t bring a human element into the customer experience while striving for higher performance.
5. Can you tell us how businesses benefit by integrating Oliver API into their existing systems?
The Oliver API is the link to the core of our technology. We produce a wealth of emotional and behavioral information from voice, including signals like happiness, engagement, anger, confidence, and even propensity to pay. We do this by listening to HOW something is being said, not just WHAT is being said. This is incredibly powerful for protecting a user’s privacy: essentially, we don’t work with the text or content of the conversation; instead, we work with the tonality of the voice. A business can use our API to build whatever kind of service or product would make good use of these insights. It could be anything from a robot that helps the elderly keep up with their medicine intake to a smart car that understands when its driver is distressed. The possibilities are endless. Obviously, we can’t be everywhere, but companies are welcome to use our API to produce their own products.
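The actual Oliver API schema is not published in this interview, so as a purely hypothetical sketch, here is how a business might consume per-call tonality scores of the kind described above and turn them into a routing decision. The field names (`happiness`, `anger`, `engagement`) and the escalation rule are illustrative assumptions, not the real API:

```python
# Hypothetical consumer of per-frame tonality scores from an emotion AI API.
# The field names and threshold are illustrative assumptions, not the real
# Oliver API schema.

def summarize_call(frames):
    """Average per-frame emotion scores (each frame is a dict of 0..1 scores)."""
    keys = frames[0].keys()
    return {k: sum(f[k] for f in frames) / len(frames) for k in keys}

def route(summary, anger_threshold=0.6):
    """Escalate to a senior agent when average anger runs high."""
    return "escalate" if summary["anger"] >= anger_threshold else "standard"

# Two seconds of (made-up) scores for one call.
frames = [
    {"happiness": 0.2, "anger": 0.7, "engagement": 0.5},
    {"happiness": 0.1, "anger": 0.8, "engagement": 0.6},
]
summary = summarize_call(frames)
print(route(summary))  # high average anger triggers escalation
```

Note that nothing in this sketch touches the words spoken; it operates only on tonality scores, mirroring the privacy point made above.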
6. Does Oliver API add emotion AI only to external customer-facing applications or can it be customized for internal organizational requirements like HR as well?
The Oliver API takes voice data and extracts intelligent insights; let’s call them predictive analytics. It does not add emotions, but it can help add emotional responses to speech if programmed that way. For example, the above-mentioned health robot would have to be built with empathy in mind to cater to older or sick people. Humans, by nature, sense and understand emotions in voice. We communicate using emotions in our voice, we empathize when needed, and in the future we will expect inanimate objects to be able to communicate in the same way.
7. How do AI-Mediated conversations ensure enhanced privacy? What data sets are processed to ensure customers are paired with the right customer service agents?
AI-Mediated Conversations (previously Behavioral Profile Pairing) uses voice data that is anonymized and cleared of any sensitive personal data to protect user privacy. We don’t analyze names, numbers, or any facts that could reveal who the user is. We don’t even analyze what is being said, only HOW it is said! Think about it: it’s like listening to someone speaking in a foreign language you have no knowledge of. While you may not understand what is being said, your brain is very capable of detecting whether someone is anxious, agitated, or happy. Now scale that up to analyzing thousands of conversations in seconds. That’s what our AI is capable of doing! And we do it without analyzing words, only voice tonality, which makes our approach very well suited to protecting a user’s privacy.
The pairing happens using a behavioral profile, which is created by gauging the speaker, whether agent or debtor, on a number of behavioral attributes as these are exhibited during interactions (historical or current). The profile is cumulative, i.e. the processing results of new interactions are combined with existing data, improving accuracy over time.
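The cumulative profile and pairing described above can be sketched in a few lines. The attribute names, the running-average update rule, and the similarity-based pairing score below are all assumptions for illustration, not Behavioral Signals’ actual production model:

```python
# Minimal sketch of a cumulative behavioral profile and agent/debtor pairing.
# Attribute names, the running-average update, and the similarity score are
# illustrative assumptions, not the production algorithm.

def update_profile(profile, observation):
    """Fold one interaction's scores into a running-average profile.

    profile: {"n": interaction_count, "attrs": {attribute: mean_score}}
    observation: {attribute: score in 0..1} from the latest interaction
    """
    n = profile["n"] + 1
    attrs = {
        k: profile["attrs"].get(k, 0.0) + (v - profile["attrs"].get(k, 0.0)) / n
        for k, v in observation.items()
    }
    return {"n": n, "attrs": attrs}

def compatibility(agent, debtor):
    """Toy pairing score: closer attribute profiles score higher."""
    keys = agent["attrs"].keys() & debtor["attrs"].keys()
    dist = sum(abs(agent["attrs"][k] - debtor["attrs"][k]) for k in keys)
    return 1.0 - dist / max(len(keys), 1)

def best_agent(agents, debtor):
    """Pick the agent whose profile best matches the debtor's."""
    return max(agents, key=lambda name: compatibility(agents[name], debtor))

empty = {"n": 0, "attrs": {}}
alice = update_profile(empty, {"patience": 0.9, "assertiveness": 0.3})
bob = update_profile(empty, {"patience": 0.2, "assertiveness": 0.9})
debtor = update_profile(empty, {"patience": 0.8, "assertiveness": 0.4})

print(best_agent({"alice": alice, "bob": bob}, debtor))  # prints "alice"
```

Because `update_profile` is a running average, each new interaction refines the profile rather than replacing it, matching the cumulative behavior described above.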
8. What are the common challenges that some of your prospects aim to resolve with your tool? What industry sectors will emotion AI most impact?
The most common challenge is convincing banks that they need AI as part of their digital transformation. The financial sector remains largely conservative. But we’re getting there, one bank at a time. Finding the right KPIs and developing solutions is how we help them adopt this new technology. We work with our clients, regardless of industry, in close partnership throughout their AI journey.
9. How can the retail industry use emotion AI to enhance customer satisfaction and ensure loyalty? Does Behavioral Signals have any play in this sector?
Right now, we’re not working with retail, although we’d be interested.
The biggest issue is that retailers don’t actually collect voice data from their customers. While some online retailers may use text-based chat assistants on their e-shops, voice assistants are not yet common. But if they do become commonplace, we can definitely help. For example, we’re able to predict within the first 30 seconds of a conversation whether the customer is ready to buy.
10. How can emotion AI work in the healthcare sector to pre-empt emotional triggers for mental health patients suffering from bipolar disorders, PTSD, anxiety, depression, and other such issues?
I touched upon this earlier. It’s a matter of designing and developing solutions that learn from emotion AI insights and create conversations and actions closer to how humans communicate: a social robot that helps care for a child, or a psychology assistant that understands, with empathy, the vulnerabilities of patients with mental health disorders.
11. A common drawback with emotion detection AI is customers masking their true emotions. How does Behavioral Signals ensure the accuracy of the data sets in such a situation, so customer requests are not “misread”?
While we often meet people who are great at masking their facial or body expressions to avoid emotion detection, it is not easy to do so with voice. Voice is produced using our vocal cords, our lungs, and even our stomach. It’s a very complicated process with a lot of variables, and it is not easy to manipulate. If it were, we would all be Hollywood actors. It takes actors years of training to manipulate their voices to express specific emotions, and they’re not always successful at it.
As for being misread, that can happen. There’s no bulletproof way to affirmatively detect an emotion with 100% accuracy; take, for example, distinguishing disgust from resentment. Our results are an aggregate of data rather than a single point of measure, and each emotion has its own accuracy tracker.
12. Can you give us a sneak peek into some of the upcoming product upgrades that your customers can look forward to in a year or two?
When AI is combined with different modalities such as voice, the possibilities are endless. Leveraging deep learning, a company can process all of its customer interactions, learn, and improve its services while offering new ones, such as banking without a human presence that still maintains an “affective” relationship with clients. These are some of the areas we’re innovating in.
13. The need for being ethical in the use of AI for business is being much spoken about. How can enterprises, in their enthusiasm to enhance CX, ensure they are responsible and transparent when using emotion AI?
It is an important question that we take very seriously. We started by openly publishing a code of ethics on our website, and we strive to listen to every voice. We all have to learn to be responsible. Programming AI without human bias is not easy; it is, after all, created by humans. A company should analyze each of its processes and see how it can improve, and that includes how it deals with customers. You need to protect privacy and have security guidelines in place that protect not just your customers’ interests but also their data. Meanwhile, new concepts are being developed, like Federated Learning, where data is analyzed where it lives, on the customer’s mobile device, as well as better algorithms that improve data anonymization. The good thing is that this debate has opened up, and it will eventually produce new solutions and ideas.
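The Federated Learning idea mentioned above can be sketched very simply: each client computes a model update on data that never leaves its device, and the server only ever sees parameter vectors, never the raw data. The toy example below, with plain Python lists standing in for model weights and hand-picked gradient values, is a drastic simplification of federated averaging, not a real implementation:

```python
# Toy sketch of federated averaging: clients train locally, the server
# averages parameters. Raw data never leaves the client. Lists stand in
# for real model weights; the gradient values are made up for illustration.

def local_update(weights, local_gradient, lr=0.1):
    """One gradient step computed on the client, using only local data."""
    return [w - lr * g for w, g in zip(weights, local_gradient)]

def federated_average(client_weights):
    """Server-side step: average the clients' parameter vectors."""
    n = len(client_weights)
    return [sum(ws) / n for ws in zip(*client_weights)]

global_model = [0.0, 0.0]
# Gradients computed privately on each client's own data (illustrative values).
client_grads = [[1.0, 2.0], [3.0, 0.0]]

updates = [local_update(global_model, g) for g in client_grads]
global_model = federated_average(updates)
print(global_model)
```

The privacy benefit comes from the fact that `federated_average` only ever touches the `updates` vectors, which is exactly the "data is analyzed where it lives" property described above.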
14. Which is the AI tech breakthrough you will be on the lookout for in the upcoming year?
Understanding the methodologies inside the AI black box is crucial for building trust in an AI system. In 2019, several companies released essential services that allow businesses to surface the prime factors leading to the outcomes of their machine learning models. Using these services, for the first time we’re able to get a peek into how the ‘black box’ works. This is still in the early stages, but such capabilities will play a major role in democratizing AI. I’ll be watching this space closely.
15. Please share a recent piece of content (can be a video, podcast, blog, movie, webinar) you liked that set you thinking about emotion AI, NLP, or AI tech in general?
While not directly related to AI, I love this blog by Aditi Syal on the concepts of storytelling: 7 Powerful Writing Tactics Every Writer Should Know. I think enabling a business to do effective storytelling is a major area of opportunity for emotion AI.
16. What is the one leadership motto you live by?
“Everything you want is on the other side of failure.”
CEO at Behavioral Signals
Rana Gujral is an entrepreneur, speaker, investor, and CEO at Behavioral Signals, an emotion AI and behavioral recognition software company. In 2014, Rana founded TiZE, a cloud software for specialty chemicals, and held the role of CEO until his exit when TiZE was acquired by Alchemy. He has been awarded the ‘Entrepreneur of the Month’ by CIO Magazine, the ‘US-China Pioneer’ Award by IEIE, and listed in “Top 10 Entrepreneurs to follow in 2017” by Huffington Post. Due to his notable thought-leadership in the voice-first community, spearheading digital transformation in the behavioral analysis space, he is recognized in Inc. Magazine as an “AI Entrepreneur to Watch”. His writing can be found in publications such as Inc., TechCrunch, and Forbes.