Alright, Google Glass haters, I’m actually kind of with you on this one. A new app for Google Glass (aka “glassware”) is designed to read people’s emotions and respond accordingly. And if there’s anything people dislike more than being stared at by a Google Glass wearer, it’s wondering whether that person (most likely, in this case, a customer service rep) is using your facial expression to calculate your feelings about a product, store, or experience.
Here’s a wild prediction: Most of the expressions being read will not be happy ones. (Also, there’s a less expensive way to gauge people’s emotions that’s been around for a long time, and it’s called “looking them in the eye.”)
The company behind this particular technology is Emotient, and the glassware is a practical implementation of the company’s facial recognition software. That software tracks 19 facial muscle movements to determine complex emotions, looking for seven primary expressions (joy, surprise, sadness, anger, fear, disgust, and contempt) as well as general feelings of positivity, neutrality, and negativity.
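To make the seven-expressions-plus-valence idea concrete, here is a minimal sketch of how per-expression scores from a classifier like Emotient’s might be reduced to a dominant expression and an overall positive/neutral/negative reading. All names, the score format, and the valence mapping are illustrative assumptions, not Emotient’s actual API.

```python
# Hypothetical sketch: per-expression scores in [0, 1] reduced to a
# dominant expression plus an overall valence. Names and thresholds are
# assumptions for illustration, not Emotient's real interface.

PRIMARY_EXPRESSIONS = ("joy", "surprise", "sadness", "anger",
                       "fear", "disgust", "contempt")

# Assumed valence mapping for the positive/neutral/negative rollup.
POSITIVE = {"joy"}
NEGATIVE = {"sadness", "anger", "fear", "disgust", "contempt"}

def summarize(scores, threshold=0.5):
    """Return (dominant expression, overall valence) from a score dict."""
    top = max(PRIMARY_EXPRESSIONS, key=lambda e: scores.get(e, 0.0))
    if scores.get(top, 0.0) < threshold:
        return top, "neutral"  # no expression strong enough to call
    if top in POSITIVE:
        return top, "positive"
    if top in NEGATIVE:
        return top, "negative"
    return top, "neutral"  # surprise can cut either way

print(summarize({"joy": 0.9, "anger": 0.1}))       # ('joy', 'positive')
print(summarize({"disgust": 0.8, "fear": 0.3}))    # ('disgust', 'negative')
```

A real system would of course derive these scores from the 19 tracked muscle movements; this sketch only shows the final classification step.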
Emotient just secured $6 million in funding, and it hopes that its software will proliferate commercially thanks to the new Emotient API.