There is no doubt about it: it seems like we're interacting with AI apps all day long in our business and personal lives. This is spurring growing demand for AI that can "understand" how we are feeling and respond to us appropriately. A recent article I wrote for Forbes discusses this industry focus, known as Emotion AI or "affective computing," in which AI programs recognize, interpret, process and simulate human emotions. In the article I address the possibilities Emotion AI can deliver and the limitations that must be overcome to develop these kinds of smart, and emotionally intelligent, programs.
We can see the beginnings of Emotion AI in action with chatbots, which are based on Natural Language Processing (NLP). Some are trained to detect emotion in a customer's voice inflections, determining, for example, whether the caller is angry or frustrated, as well as in the words the customer chooses. But because there are so many ways to express anger and other emotions, very large datasets are needed to train AI programs to do this reliably.
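To make the word-choice side of this concrete, here is a minimal sketch of emotion detection from text. It is a deliberately simple keyword approach, not how production systems work; real Emotion AI models are trained on the large labeled datasets mentioned above, and the word lists here are illustrative assumptions.

```python
# A toy emotion detector based on word choice. The word lists are
# illustrative assumptions; trained models replace them with patterns
# learned from large labeled datasets.
ANGER_WORDS = {"angry", "furious", "outraged", "unacceptable", "ridiculous"}
FRUSTRATION_WORDS = {"frustrated", "again", "still", "waiting", "nobody"}

def detect_emotion(utterance: str) -> str:
    """Classify an utterance as 'anger', 'frustration', or 'neutral'."""
    # Lowercase and strip trailing punctuation so "furious!" matches "furious".
    words = {w.strip(".,!?") for w in utterance.lower().split()}
    anger = len(words & ANGER_WORDS)
    frustration = len(words & FRUSTRATION_WORDS)
    if anger == 0 and frustration == 0:
        return "neutral"
    return "anger" if anger >= frustration else "frustration"

print(detect_emotion("This is unacceptable, I am furious!"))   # anger
print(detect_emotion("I am still waiting, again"))             # frustration
print(detect_emotion("Thanks for the update"))                 # neutral
```

The gap between this sketch and a usable system is exactly the data problem described above: "I'm fine" said through gritted teeth carries none of these keywords, which is why voice inflection and large training sets matter.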
There are many exciting possibilities for Emotion AI on the horizon. Imagine if an AI healthcare app could identify mental or physical illness based on the way a patient looks or sounds, a marketer could judge a person's reactions to an ad, or a contact center AI app could route a call to a supervisor when the customer sounds annoyed, heading off further frustration. Emotion AI could provide a critical business edge and a much better customer experience.
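The contact-center scenario is the most mechanically straightforward of these: once an upstream classifier produces an annoyance score, routing is a threshold decision. A hedged sketch, assuming a hypothetical 0-to-1 score from an emotion model:

```python
# Sketch of emotion-aware call routing. The annoyance score and the
# 0.7 threshold are assumptions for illustration; in practice the score
# would come from a trained speech/text emotion model and the threshold
# would be tuned against escalation outcomes.
def route_call(annoyance_score: float, threshold: float = 0.7) -> str:
    """Return the queue for a call given an annoyance score in [0, 1]."""
    if not 0.0 <= annoyance_score <= 1.0:
        raise ValueError("annoyance_score must be between 0 and 1")
    return "supervisor" if annoyance_score >= threshold else "standard_queue"

print(route_call(0.85))  # supervisor
print(route_call(0.30))  # standard_queue
```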
While there’s still more work to be done, the possibilities are endless. We just might find ourselves someday having a heart-to-heart with a robot.