
The machine sees your face and it knows how you’re feeling.

At ad:tech in San Francisco late last month, among the ocean of long-winded digital marketers, sat a small demo display from a San Diego company, Emotient. It looked like a live security camera feed, but when you walked up to the screen, a box zeroed in on your face and ascribed an emotion to your demeanor. I winced. It classified this as disgust. I laughed. That was joy.

I was intrigued. (The machine could not register that.)

So much attention has been given to the superiority of digital advertising. It’s responsive. You know what your customers feel and how they respond to your message. But there’s a wider movement at hand to better study how we interact with the real world: where our gaze lands, what we look at, and how we’re feeling in all the moments we’re not looking at a digital device.

For something as instinctive as emotion, a world of computation has gone into making this work, as Emotient’s president and CEO Ken Denman explains. Denman and his friend Seth Neiman, an investor in Emotient, knew they wanted to build a new company around something like this and began studying new developments in technology for tracking touch, gesture, gaze, and customer psychology.

The two homed in on a doctoral team at the University of California, San Diego that ranked, according to Denman, as the leading national experts in facial recognition technology, judged by the number of article citations. Denman and Neiman weren’t the first opportunists to try to lure the group away from academia. “It took some convincing. From an IP standpoint they needed to not be employed by the university and they made the jump as a team. I had to go back and negotiate a technological transfer at UCSD for their work,” Denman says.

Emotient officially began in 2012. Following some early trials with customers, four new pilots will begin in the next two months. The product can read about 10 different emotions. It started with basic feelings, like contempt, joy, and fear, and has since added more complicated notions, like frustration and confusion.

Behind recognizing each of those faces lie endless legwork and expert analysis. Denman stresses that, more than anything, Emotient is an analytics company. The machine requires a minimum of 10,000 images to start discerning an emotion. But to do this with a high degree of accuracy, and thus to make the product commercially relevant, millions of images can be required.

“It takes a lot of training and a lot of data,” Denman says. “We need to feel comfortable that we’re putting the right positives on things.”
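The article doesn’t disclose how Emotient’s models are actually built, but the shape of the problem is standard supervised learning: feed a classifier thousands of human-labeled face images and measure how well it generalizes. The Python sketch below is purely illustrative, with random arrays and a hypothetical 128-dimensional feature vector standing in for real face data; it shows where those 10,000-plus labeled examples fit, not Emotient’s pipeline.

```python
# Illustrative only: a generic supervised-learning harness for emotion labels.
# Emotient's real features and models are not public; the random arrays below
# stand in for features extracted from labeled face images.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

EMOTIONS = ["joy", "disgust", "contempt", "fear", "frustration", "confusion"]

rng = np.random.default_rng(0)
n_images = 10_000                      # the "minimum of 10,000 images" scale the article mentions
X = rng.normal(size=(n_images, 128))   # hypothetical per-face feature vectors
y = rng.integers(len(EMOTIONS), size=n_images)  # human-coded emotion labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# With random placeholder data this hovers around chance (~1/6); with real,
# carefully labeled images, a held-out split like this is how accuracy
# claims get measured.
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```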

False positives are unavoidable, though. Denman says that teaching a machine to recognize that humans can laugh when frustrated, or act with shock and surprise when happy, is next to impossible. But he says Emotient’s accuracy is well above 90 percent, depending on the emotion, and when you’re looking at an aggregate analysis of a larger crowd, that’s more than enough.
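That crowd-level claim is easy to sanity-check with a toy simulation. The snippet below uses made-up numbers, not Emotient’s: each simulated face is read correctly 90 percent of the time, yet the emotion mix estimated over a few hundred faces still lands close to the true mix.

```python
# Toy simulation of the aggregation argument: noisy per-face reads,
# stable crowd-level proportions. All numbers here are invented.
import numpy as np

rng = np.random.default_rng(1)
emotions = np.array(["joy", "disgust", "neutral", "surprise"])
true_mix = np.array([0.40, 0.10, 0.40, 0.10])  # hypothetical crowd mood
per_face_accuracy = 0.90                       # "well above 90 percent"
crowd_size = 500

truth = rng.choice(emotions, size=crowd_size, p=true_mix)
correct = rng.random(crowd_size) < per_face_accuracy
# A wrong read is modeled as a uniformly random label
# (which may occasionally coincide with the truth by chance).
noise = rng.choice(emotions, size=crowd_size)
observed = np.where(correct, truth, noise)

for e in emotions:
    print(f"{e:9s} true {np.mean(truth == e):.2f}   observed {np.mean(observed == e):.2f}")
```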

That focus on the crowd over the individual keeps Emotient out of thornier issues around privacy. Denman admits that he gets asked the privacy question a lot by journalists. “Some people respond to this and say, ‘you’re evaluating my emotions, it’s kind of creepy,’” he says.

It is an understandable response. But Emotient has set itself up to have a small footprint and has taken itself out of the data-handling part of the equation. Emotient will eventually be offered via a software-as-a-service model. Customers will buy a third-party video camera and install Emotient’s software on their own servers. All they need is a cheap camera and a MacBook Air, Denman says. The emotional analytics from each image are aggregated and stored locally, and the source picture is deleted.
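That deployment, as described, maps onto a very small loop: detect faces in each camera frame, tally an emotion, throw the frame away. The sketch below uses OpenCV’s bundled face detector as a stand-in, and classify_emotion() is a placeholder, since Emotient’s actual software isn’t public.

```python
# A minimal sketch of the on-premise setup described above: detect faces in
# each frame, tally an emotion label, and keep nothing but the aggregate.
# OpenCV's stock Haar cascade stands in for Emotient's proprietary detector.
from collections import Counter
import cv2  # pip install opencv-python

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
tallies = Counter()

def classify_emotion(face_pixels):
    """Placeholder: the real classifier is Emotient's, not shown here."""
    return "unknown"

cap = cv2.VideoCapture(0)  # any cheap off-the-shelf webcam
for _ in range(300):       # a few seconds of frames, for the sketch
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        tallies[classify_emotion(gray[y:y + h, x:x + w])] += 1
    # The frame itself is discarded; only the aggregate counts persist.
cap.release()
print(dict(tallies))
```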

There’s little commercial value for clients in knowing how one specific person is feeling, which makes it easier for Emotient. “We don’t care who you are. We have no interest in the individual. The value is in the group and the subgroup level,” Denman says.

In a response typical of a job interview, Denman says that the biggest risk for Emotient is how broad interest in its product has been. Conversations have taken place with the auto industry, pharmaceutical companies, healthcare providers, restaurant companies, digital signage firms, and big-box retailers. Few companies with a physical presence wouldn’t be tempted by the ability to know the emotional state of the people who come through their doors. Denman wants to keep Emotient focused on retail and healthcare for now, rather than trying to fulfill every customer desire for the product.

“We’re in a good position, but I’ve seen companies succumb to complacency before and stop innovating,” he says. “Our real advantage is that we can operate with accuracy in real time. Very few players can operate in the uncontrolled environment we can.”

Seeing Emotient in action, there’s an eye-catching novelty to it. The company has the benefit of operating in a new space with an impressive technology. But many of its future clients already have experience using social media and other digital channels to gauge customer satisfaction and brand affinity, and extrapolating those measures out into the real world. Beyond Emotient’s real-time appeal, its long-term success will ultimately come down to how much extra, actionable value comes out of its insights.

[GIF by Hallie Bateman for Pando]