Researchers and startups have begun to focus on the final frontier for our silicon-based buddies -- teaching computers to read and respond to human emotions.
They haven't quite designed a real-life Andrew from Bicentennial Man, but they have led computers to grasp the basics. At Cambridge University in England, researchers are trying to teach computers to measure stress, detect how it affects facial expressions and then interact using a persona that mimics an appropriate emotional response. Basically, they are building an emotionally sensitive computer.
Researchers at Cambridge are building Andrew's precursor.
They aren't the only ones. Kay Stanney, owner of Design Interactive, says that people give off measurable indications of their mental and emotional state, from heart rate to eye tracking to electroencephalography (EEG), the measurement of brain waves. EEG is for real -- there are controllers and apps on the market right now that let you control a computer game using an EEG-based headset.
Design Interactive has bigger plans than gaming. It is working with DARPA and other military agencies to build something it calls Next Generation Interactive Systems, or NexIS, which uses biological sensors to determine a soldier's state of mind. The technology could apply to other high-stress occupations, such as air traffic control.
Biological sensors can already detect fairly accurately when a human is drowsy, distracted or in an extreme state of distress, but when it comes to subtler emotions -- frustration, impatience or pride -- a computer algorithm still falls short. Given that people can't reliably read those emotions even in their significant others, if these researchers succeed, computers could become better at human relations than humans are.
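To see why the coarse states are the easy cases, consider a minimal sketch of how sensor readings might be mapped to a state label. Everything here is illustrative: the signal names, thresholds and labels are assumptions for the example, not anything used by Design Interactive or the Cambridge team -- real systems train statistical models over many signals rather than applying fixed cutoffs.

```python
# Illustrative sketch only: hypothetical thresholds for classifying
# coarse physiological states from biometric readings.

def classify_state(heart_rate_bpm, blink_rate_per_min, eeg_alpha_ratio):
    """Return a coarse state label from three hypothetical signals.

    heart_rate_bpm:     beats per minute
    blink_rate_per_min: eye blinks per minute (rises with fatigue)
    eeg_alpha_ratio:    share of alpha-band power in the EEG (0..1);
                        higher values are associated with relaxed or
                        drowsy, eyes-closed states
    """
    if heart_rate_bpm > 120:
        return "distress"      # extreme arousal stands out clearly
    if blink_rate_per_min > 30 and eeg_alpha_ratio > 0.5:
        return "drowsy"
    if blink_rate_per_min > 30:
        return "distracted"
    return "baseline"          # subtle emotions like frustration or
                               # pride are not separable at this level

print(classify_state(130, 10, 0.2))  # distress
print(classify_state(70, 35, 0.6))   # drowsy
```

The point of the sketch is the last branch: anything that doesn't push a signal past a gross threshold collapses into "baseline," which is exactly where frustration, impatience and pride live.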
The entire field of research is known as "affective computing," and within it a broad range of fascinating work is being done. Researchers at MIT have been working on ways to help autistic children better recognize the emotions of the people around them. In 2007, they built the Emotional-Social Intelligence Toolkit, along with a prototype dubbed at the time the world's first wearable affective technology.
Affective computing attempts to make computers more emotional.
Since then, the MIT team has been experimenting with ways to build sensor technologies into earphones, earrings, wristbands, even webcams. The goal is to study human emotions in real-world situations, because there's something about bringing human subjects into the lab that just doesn't bring out the wild thing in most of us.
Should researchers successfully develop both the sensory devices and computers smart enough to use them, it could curb another growing problem: emotional abuse of computers. Some 40 percent of people admit that they swear at and verbally abuse their computers, according to a recent study by San Francisco-based EmSense. Perhaps nicer computers will make for nicer human beings.