With Emotion Recognition Algorithms, Computers Know What You’re Thinking

Back when Google was first getting started, there were plenty of skeptics who didn’t think a list of links could ever turn a profit. That was before advertising came along and gave Google a way to pay its bills, and then some, as it turned out.
Thanks in part to that fortuitous accident, advertising is no longer an also-ran in today's Internet market: marketers are bending new technologies to their needs as startups chase prospective revenue streams.
A handful of companies are developing algorithms that can read the human emotions behind nuanced and fleeting facial expressions, with the goal of optimizing advertising and market-research campaigns. Major corporations including Procter & Gamble, PepsiCo, Unilever, Nokia and eBay have already used the services.
Companies building the emotion-detecting algorithms include California-based Emotient, which released its product Facet this summer; Massachusetts-based Affectiva, which will debut Affdex in early 2014; and U.K.-based Realeyes, which has been moving into emotion detection since launching in 2006 as a provider of eye-movement user-experience research.
They’ve all developed the ability to identify emotions by taking massive data sets, videos of people reacting to content, and putting them through machine learning systems. (The startups have built on a system of coding facial expressions developed in the 1970s for humans to carry out.) Machine learning is one of the most straightforward approaches to artificial intelligence, but because it is largely limited to inductive pattern recognition, generalizing from labeled examples, it isn’t likely to give rise to more nuanced artificial intelligence on its own.
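To make the pipeline concrete, here is a purely illustrative sketch, not any of these companies' actual methods: facial expressions are first coded as action-unit intensities (in the spirit of the 1970s facial coding system mentioned above), and a simple learned model, here a nearest-centroid classifier, maps those patterns to emotion labels. All action-unit values, labels, and function names are invented toy assumptions.

```python
# Toy sketch of emotion recognition from coded facial expressions.
# Features are hypothetical action-unit (AU) intensities; real systems
# use far richer features and far larger video data sets.

def train_centroids(samples):
    """Average the AU vectors for each emotion label (a minimal 'model')."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in vec]
            for label, vec in sums.items()}

def classify(centroids, features):
    """Pick the emotion whose average AU pattern is closest."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(centroids[label], features))

# Invented training data: (smile intensity, brow-lowering intensity).
training = [
    ([0.9, 0.1], "happy"), ([0.8, 0.2], "happy"),
    ([0.1, 0.9], "angry"), ([0.2, 0.8], "angry"),
]
model = train_centroids(training)
print(classify(model, [0.85, 0.15]))  # a strong smile classifies as "happy"
```

The "learning" here is just averaging labeled examples and matching new ones to the nearest average, which is why the approach is inductive: it generalizes from observed patterns rather than reasoning from first principles.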