What would Machiavelli do? Emotional intelligence and automation
In the Atlantic yesterday, Adam Grant, professor of management and psychology at the University of Pennsylvania's Wharton School of Business, wrote about our growing interest in, and understanding of, emotional intelligence. In it, he traced historical icons like Martin Luther King Jr. and Adolf Hitler, explaining their unparalleled skills in emotional manipulation, although they quite obviously used those skills in vastly different ways.
From there, Professor Grant mapped out the strides being made to teach our children about emotional intelligence. He cites a theory from Daniel Goleman holding that greater facility with emotional intelligence will lead to fewer social problems. "If we can teach our children to manage emotions, the argument goes," Grant writes, "we'll have less bullying and more cooperation."
Unsurprisingly, the tech community has been riding the coattails of this idea. As new gadgets become ever more an extension of our everyday lives, technologies are being built to serve more internal, cognitive use cases.
At a very simple level, there's a British startup called Karisma Kidz. It's an educational program for children that provides games to help improve "children's moods and behaviour." It takes the idea of an emotional intelligence pedagogy and brings it into ed-tech.
Companies, however, are going even further than that. The most striking example is the Israel-based Beyond Verbal. Since 2012 the company has been honing a technology that listens to audio tracks and analyzes the emotions implied by the speaker's voice. It's a simple and somewhat disquieting idea. What's more, it's purported to be quite accurate. You can go online and test it out for yourself.
This leads to one of Grant's most prescient points: those who are adept at reading and acting on others' emotions hold an intrinsic power. That power can be used for better or worse. He writes:
Emotional intelligence is important, but the unbridled enthusiasm has obscured a dark side. New evidence shows that when people hone their emotional skills, they become better at manipulating others. When you're good at controlling your own emotions, you can disguise your true feelings. When you know what others are feeling, you can tug at their heartstrings and motivate them to act against their own best interests.

Indeed, that is precisely the kind of emotional power Hitler possessed. While Beyond's technology is presented as neutral (the company has been inviting developers to incorporate it into their programs for some time), the power is still there.
For example, there's the company InfinityAR, also from Israel. It specializes in augmented reality, preparing for what Infinity's CEO believes is an upcoming Google Glass future. The company offers no new technology of its own; instead, it borrows from other services -- including Beyond -- to provide users of Glass-like gizmos with as much external information about their surroundings as possible. Infinity lets users plug into Wikipedia so that information about surrounding buildings and places is pushed to their eyes, along with data from numerous other publicly available sources.
Take the example used in Infinity's promo video. A man wearing an unnamed wearable device sees a woman at a bar and walks up to her. He looks at her, and Infinity's platform uses facial recognition to pinpoint her features and pull up her public Facebook profile. This pushes information such as her birthday directly in front of the man's eyes, without her knowledge. He then "guesses" her astrological sign. When she responds, her voice is analyzed (using Beyond's technology), and he adjusts his approach accordingly.
This is a perfect example of the way technology and emotional intelligence can come together. For many, this scenario is dark and creepy; Infinity merely thinks it's the future. Emotional intelligence, then, is no longer about learning and understanding how and why someone says something. It's about using your knowledge of others' internal states to manipulate them.
We're reaching a point in tech where building a sleek smartphone isn't enough. Startups are learning how to turn smaller and more intimate aspects of our everyday lives into new, and sometimes dystopic, products.
Professor Grant is right to point out that those with these skills "aren't always using emotional intelligence for nefarious ends." And, it could be argued, neither are programs like Infinity. The difference is that as programs like these become more ubiquitous, the cultivation of personal skills like emotional intelligence will become automated. Teaching others about empathy will no longer be an educational endeavor.
Instead, it will be something else; something that's more instrumental and potentially insidious. And Machiavelli would probably adore it.
Image via Thinkstock