Marvin

Our devices are smart. They know what we type and touch, what we say and where we are. They know what we look like, but are clueless when it comes to how we feel and what we mean.

This still-missing bond between humans and machines is also a chief theme of Gartner’s 2013 Hype Cycle for Emerging Technologies report, which suggests that “machines are becoming better at understanding humans and the environment – for example, recognizing the emotion in a person’s voice.” We can agree that human emotions are complicated, and arguably the human voice is the most personal and revealing “emotional designator.” Nevertheless, this is the next big revolution waiting to happen: the most important interface that does not yet exist.

Or is it?

We all know that words alone don’t always tell the whole story. To truly understand, we must reach beyond the verbal. In many cases, it’s not what we say, but how we say it. We know this intuitively, and studies in neuropsychology over the last 50 years have demonstrated that body language and vocal intonation have a bigger impact than our actual choice of words.

When you first meet someone, you have formed an opinion about that person less than 10 seconds after he or she starts talking. As Carol Kinsey Goman reported at Forbes, researchers from NYU found that it takes just seven seconds to make a first impression. Watch the versatile Bryan Cranston in AMC’s drama “Breaking Bad”: as his voice shifts between the feeble, stomped-upon schoolteacher Walter White and the meth-kingpin persona known as “Heisenberg,” you instinctively know which one to steer clear of.

“Emotions Analytics” is a new field focused on identifying and analyzing the full spectrum of human emotions, including mood, attitude and emotional personality. Now imagine Emotions Analytics embedded inside mobile applications and devices, opening up a new dimension of human-machine interfaces. Picture machines (and their users) finally understanding who we are, how we feel and what we really mean.
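As a thought experiment, here is a minimal sketch in Python of what such an embedded emotions layer might expose to an app developer. Everything in it is hypothetical: the EmotionReading type, the analyze_voice_clip function and its fields are invented purely to make the idea concrete, not taken from any existing SDK.

```python
from dataclasses import dataclass


# Hypothetical result type; the fields mirror the dimensions named above
# (mood, attitude, confidence). None of this is a real SDK.
@dataclass
class EmotionReading:
    mood: str          # e.g. "upbeat", "flat", "tense"
    attitude: str      # e.g. "engaged", "dismissive"
    confidence: float  # 0.0-1.0: how sure the analyzer is


def analyze_voice_clip(pcm_samples: list[float], sample_rate: int) -> EmotionReading:
    """Placeholder for a vocal-intonation analyzer an app might call.

    A real implementation would extract prosodic features (pitch contour,
    energy, speech rate, pauses) and map them to emotional dimensions.
    Here we return a stub so that only the interface is concrete.
    """
    return EmotionReading(mood="unknown", attitude="unknown", confidence=0.0)


# An app could then branch on the reading instead of on words alone, e.g.:
#   reading = analyze_voice_clip(samples, 16_000)
#   if reading.mood == "tense" and reading.confidence > 0.7:
#       offer_to_simplify_the_flow()   # hypothetical app function
```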

Can you envision a world where people are also more in touch with their own emotional sides? That world is right around the corner.

So you need to get with the program. Here are five steps to emotionally power your app:

1. Rethink your entire app. The emerging field of Emotions Analytics changes both the flow and the value proposition of an app, so it calls for a total redesign. In the mid-1960s, E.A. Johnson at the Royal Radar Establishment in Malvern (UK) created the first touch screen; it has since sparked completely new user experiences through a plethora of innovative applications. Within just five years, the Apple App Store has grown to nearly 900,000 mobile apps.

The gyroscope built into smartphones has further revolutionized mobile gaming, and that same motion-sensing interaction is now being introduced into Google Glass. In each of these examples, the developers who understood the value of the new interface, resisted the urge to bolt a quick fix onto their existing solutions, and got it right made it big.

We think that introducing Emotions Analytics promises an equally game-changing impact. It’s more than adding a few extra features; we see a totally revised, emotionally aware interface. And with a fresh interface, you also need to rethink your value proposition. Moreover, imagine the adoption and market-share potential across industries, likely to drive Gartner’s estimate of 300 billion mobile app downloads a year by 2016 even higher.

2. Words are overrated. Sorry, authors, reporters, speechwriters and other wordsmiths: cognitive language is a poor emotional yardstick. Yet most of the sentiment-analysis industry is focused on words (the toy sketch at the end of this step makes the limitation concrete). Think of emotions as your car’s spark plugs – little and hidden, but responsible for the combustion that ultimately powers the car. Similarly, emotions summon the words in your prefrontal cortex; we dress them up with cultural filters and social norms and run them through our personalized cognition. The result is an almost indistinguishable mix in which emotions are just a small, diluted component.

Speaking of “communications of feelings and attitudes,” the widely quoted formula of nonverbal-communication pioneer Albert Mehrabian in “Silent Messages” suggests that only seven percent of our communicative impact regarding feelings and attitudes comes from the words themselves. The bulk is delivered by body language and vocal modulation. Our intonations are literally tuned by our emotions – happiness or sadness, excitement or depression, anger or anxiety. Free from language, the music of our vocal expression is universal and rings true across races and cultures. And not just for humans – think of the family dog.

Ironically, most sentiment-analysis solutions are focused on figuring out that seven percent, with mixed results. One could, of course, use an MRI brain scan to crack the mystery of human language. Dr. Sophie Scott at University College London has done just that, using MRI to show how the brain takes speech and separates it into words and “melody.” Her studies suggest that words are shunted over to the left temporal lobe for processing, while the melody is channeled to the right side of the brain, a region more stimulated by music. Interesting as that may be, donning a Lady Gaga-like contraption on our heads to identify emotions in everyday conversations would certainly not meet with applause.
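To make that limitation concrete, here is a toy sketch in Python of word-level sentiment scoring; the tiny lexicon and weights are invented for illustration. The transcript “I’m fine” earns exactly the same score whether it was said warmly or through gritted teeth, because the words alone carry none of the intonation.

```python
# Toy word-level sentiment scorer; the lexicon and weights are illustrative only.
LEXICON = {"fine": 1, "great": 2, "terrible": -2, "furious": -2}


def word_sentiment(transcript: str) -> int:
    """Sum lexicon scores over the words in a transcript."""
    return sum(LEXICON.get(word.strip(".,!?").lower(), 0)
               for word in transcript.split())


# Both utterances transcribe to the same words, so they score identically,
# even though one was said warmly and the other through gritted teeth.
print(word_sentiment("I'm fine."))  # said with a smile        -> 1
print(word_sentiment("I'm fine."))  # said through clenched teeth -> 1
```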

3. It’s not what you say, but how you say it. Emotions live mainly in intonation and body language. That’s the way we are wired.

You might have heard the French saying, “C’est le ton qui fait la musique,” which translates to “it is the tone that makes the music.” It’s the combination of our vocal modulations (and body posture – both activated by the same part of the brain that produces emotions) and the memory the listener holds for that melody that makes the person we are talking to either listen up or shut down. This is why we can quickly detect the deteriorating emotion fueling Walter White’s growls in “Breaking Bad,” or reflect on the poor fate of Dirty Harry’s antagonists when Clint pops the question, “Do I feel lucky?”

Vocal communication and body language go hand in hand. Posture, gait and facial expression further broadcast what’s happening inside of us. Sue Shellenbarger, who writes the Wall Street Journal’s “Work & Family” column, reported that “new research shows posture has a bigger impact on body and mind” and that a “powerful and expansive pose actually changes a person’s hormones and behavior.” So striking a pose might also enhance your impact on your listeners. In other words (literally), it looks like both could become great emotional yardsticks – far superior to language and text. In principle, they are. But reality is different.

4. Looking is good. Listening is better. Gaining understanding from the visual clues offered by body language and facial expressions is complicated. Researchers at the Hebrew University of Jerusalem, New York University and Princeton University found that tennis players’ body language provided better cues than their facial expressions when judging whether an observed player had “undergone strong positive or negative experiences.”

Nonetheless, we are visually wired. Facial-recognition apps are everywhere, albeit with a catch: getting your subject to gaze directly into the camera while guaranteeing perfect lighting and adequate shading is tricky. Moreover, the “subject” is aware of the camera and might “turn it on” for it. Wouldn’t you?

We are surrounded by voices. Voice is present in everything we do – even when we pay no attention, talking while letting our minds drift. Listening is effortless and so much easier because we do it naturally; even in the dark we have no trouble listening, while a camera might struggle to capture the scene.

Moreover, we can intuitively discern a speaker’s authenticity – even in the dark. Our ears listen for those “vocal indicators” and perform an auditory double-take when we hear an increase in speech rate or a decrease in pause duration – powerful indicators of mood improvement in patients suffering from depression, for example. We are wired to detect the emotions “resonating” in the voice of the person we are speaking with. And since emotions – positive and negative – are paramount to our decision-making processes, a solution driven by vocal intonation is both easier to build and superior. Translated into your daily life: wouldn’t it be so much more enriching to develop apps and prime machines to do what we do intuitively all of the time?
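To give a rough sense of how those two vocal indicators could be measured, here is a minimal sketch in Python with NumPy. It assumes a mono audio signal supplied as a float array; the frame size and energy threshold are arbitrary values chosen purely for illustration, and the simple energy gate stands in for a real voice-activity detector.

```python
import numpy as np


def speech_vs_pause(samples: np.ndarray, sample_rate: int = 16_000,
                    frame_ms: int = 30, energy_threshold: float = 1e-3):
    """Crude voice-activity estimate over fixed-length frames.

    Returns (speech_ratio, mean_pause_seconds): both are only rough proxies
    for the "speech rate up / pauses down" indicators mentioned above.
    """
    frame_len = int(sample_rate * frame_ms / 1000)
    n_frames = len(samples) // frame_len
    frames = samples[:n_frames * frame_len].reshape(n_frames, frame_len)
    energy = (frames ** 2).mean(axis=1)      # mean energy per frame
    voiced = energy > energy_threshold       # True where someone is speaking

    speech_ratio = float(voiced.mean()) if n_frames else 0.0

    # Lengths of the silent runs (pauses) that fall between voiced frames.
    pause_lengths, run = [], 0
    for is_voiced in voiced:
        if is_voiced:
            if run:
                pause_lengths.append(run)
            run = 0
        else:
            run += 1
    mean_pause = float(np.mean(pause_lengths)) * frame_ms / 1000 if pause_lengths else 0.0
    return speech_ratio, mean_pause


# Synthetic example: 0.5 s of "speech" (noise), 0.4 s of silence, 0.5 s of "speech".
signal = np.concatenate([np.random.randn(8_000) * 0.1,
                         np.zeros(6_400),
                         np.random.randn(8_000) * 0.1])
print(speech_vs_pause(signal))  # roughly (0.7, 0.4): ~70% speech, ~0.4 s average pause
```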

5. Think bold. Emotions analytics allows you to gain deeper context and meaning in everything you do. It can transform business, games, media and marketing, and affect your health and dating life, no matter where you (and your customers) are, and no matter what language they speak.

Imagine this initial layer of emotional understanding embedded in every voice-assisted and voice-enabled device, giving you a better hold on context and meaning in practically every aspect of your life. The potential is boundless, and the possible deployments and applications are countless. Could this really be the “ultimate technology”?

How can you use it? Where would you expect it? Get creative. Don’t try to conform to existing paradigms. Elevate your app. Light up your mobile app’s emotional blind spot with emotions analytics that crack the code of the human emotional dimension. Let’s transform how we interact with machines and with each other.
