In the near future, more organizations will use emotional analytics to fine-tune their offerings, whether they’re designing games or building CRM systems. Already, there are platforms and development tools that let developers build emotional analytics into desktop, mobile, and web apps. In a business context, that can translate to mood indicators built into dashboards that show whether the customer on the phone or in a chat discussion is happy, whether the customer service rep is effective, or both — in real time.
Such information could be used to improve the efficiency of escalation procedures or to adapt call scripts in the moment. It could also be used to refine customer service training programs after the fact. In many cases, emotional analytics will be used in real time to determine how a bot, app, IoT device, or human should react.
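To make the real-time use case concrete, here is a minimal sketch of how a dashboard might track per-utterance emotion scores during a call and flag when to escalate. The class name, window size, and threshold are all illustrative assumptions, not part of any actual product described above.

```python
from collections import deque

class EscalationMonitor:
    """Track a rolling window of per-utterance emotion scores
    (-1.0 = very negative, +1.0 = very positive) and flag when
    a conversation should be escalated to a human supervisor.
    Window and threshold values are arbitrary for illustration."""

    def __init__(self, window=5, threshold=-0.4):
        self.scores = deque(maxlen=window)  # keep only the last N scores
        self.threshold = threshold

    def record(self, score: float) -> bool:
        """Add a score; return True if the rolling mean drops below threshold."""
        self.scores.append(score)
        mean = sum(self.scores) / len(self.scores)
        return mean < self.threshold

# Simulated scores for five successive customer utterances:
monitor = EscalationMonitor()
for s in [0.2, -0.3, -0.6, -0.7, -0.8]:
    escalate = monitor.record(s)
print(escalate)  # True: rolling mean is -0.44, below the -0.4 threshold
```

A production system would get the scores from a speech or text emotion model; the point here is only that the escalation decision itself can be a simple rule layered on top of the analytics.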
Although the design approaches to emotional analytics differ, each involves some combination of AI, machine learning, deep learning, neural nets, natural language processing, and specialized algorithms to better understand the temperament and motivations of humans. The real-time analytical capabilities will likely affect the presentation of content, the design of products and services, and how companies interact with their customers. Not surprisingly, emotional analytics requires massive amounts of data to be effective.
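As a toy illustration of the text-analysis piece, the sketch below scores an utterance with a hand-built word lexicon. Real emotional analytics uses trained machine-learning models rather than word lists; the word sets and scoring rule here are invented for the example.

```python
# Hypothetical sentiment lexicon -- real systems use trained models,
# not hard-coded word lists.
POSITIVE = {"great", "thanks", "love", "perfect", "helpful"}
NEGATIVE = {"angry", "terrible", "useless", "cancel", "frustrated"}

def emotion_score(utterance: str) -> float:
    """Return a crude sentiment score in [-1, 1] based on word counts."""
    words = [w.strip(".,!?") for w in utterance.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(emotion_score("This is useless, I want to cancel!"))  # -1.0
```

The gap between this sketch and a deployed system is exactly the "massive amounts of data" point above: distinguishing sarcasm, dialect, and tone requires models trained on far more than a word list can capture.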
Emotion is both an old data point and a new one. In a customer service or sales scenario, a customer’s emotion may have been captured “for training purposes” in a call recording or a rep’s notes. In the modern sense, emotions are detected and analyzed in real time by software that can distinguish the nuances of particular emotions better than humans can. Because the information is digital, it can be used for analytical purposes like any other kind of data, without transformation.
What people say is one thing. How they say it provides context. Voice inflection is important because in the not-too-distant future, more IoT devices, computing devices, and apps will use voice interfaces instead of keyboards, keypads, or gestures designed for mobile devices.
Because humans and their communication styles are so diverse, contextual information is extremely important. Demographics, personas, account histories, geolocation, and what a person is doing in the moment are just a few things that need to be considered. Analyzing all that information, making a decision about it, and acting upon it requires considerable automation for real-time relevance. The automation occurs inside an app, an enterprise application, or a service that acts autonomously, notifies humans, or both.
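A sketch of how such automation might combine an emotion score with contextual signals before acting. The field names, thresholds, and action labels are assumptions made for illustration; a real system would draw these signals from CRM and account data.

```python
def decide_action(emotion: float, account_value: str, prior_complaints: int) -> str:
    """Pick a response by combining a real-time emotion score with context.
    Thresholds and field names are illustrative, not from any real product."""
    very_negative = emotion < -0.5
    high_stakes = account_value == "high" or prior_complaints >= 2

    if very_negative and high_stakes:
        return "route_to_human"   # notify a human rep immediately
    if very_negative:
        return "offer_callback"   # act autonomously with a softer option
    return "continue_bot"         # no intervention needed

print(decide_action(-0.7, "high", 0))  # route_to_human
```

The design point is the one made above: the same emotion score leads to different actions depending on context, so the decision logic must sit alongside the analytics, not inside it.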
Body language adds even more context. Facial expressions, micro expressions, posture, gait, and gestures all provide clues to a person’s state of mind.
Media agency MediaCom is using emotional analytics to more accurately gauge reactions to advertisements or campaigns so the creative can be tested with greater accuracy and adjusted.
Behavioral health is another interesting application. Using emotional analytics, healthcare providers can gain insight into conditions such as depression, anxiety, and schizophrenia.
The potential applications go on, including law enforcement interrogations, retail, and business negotiations, to name a few.
A Tough Problem
Natural language processing, which is necessary for speech and text analysis, is hard enough to get right. Apple’s Siri, Microsoft’s Cortana, and even spellcheckers are proof that there’s a lot of room for improvement. Aside from getting the nuances of individual languages and their dialects right, there are also cultural nuances to understand – not only the words themselves but the way in which they are spoken.
The same thing goes for gestures. Expansive gestures are fine in Italy but inappropriate in Japan, for example. Because the meaning of gestures changes with culture, intelligent systems must account for that context.
As a result, emotional analytics will crawl before it walks or runs, like most technologies.