Device by MIT Grad Can Almost Read Minds

No, it can't read minds in the literal sense, but it's probably still better than a polygraph machine.

It's often said that only a small fraction of human communication comes from words: roughly 60% is body language and about 30% is tone. The irony is that most of us are probably not very good at picking up on these subliminal messages. That may be why Rana El Kaliouby, an MIT grad student, developed the software behind the Emotional Social Intelligence Prosthetic, a device that helps people interpret body language and facial expressions. When it detects boredom or inattention in a listener, the device vibrates to alert the wearer.
"This software interprets emotions based on body language and facial expressions which are captured via a wearable computer/video camera combination."

"The video data is used to decide whether the listener is agreeing, disagreeing, thinking, concentrating, interested, or unsure."

"To train her software El Kaliouby used video of actors able to very clearly define an emotion on film. Now, the system is able to pick out the right emotion 90% of the time when using actor footage, and 64% of the time with video clips of everyday people."
Accuracy is expected to improve considerably as more footage is added to the database. Such a device could be invaluable to people with autism (a condition characterized by deficits in social interaction and communication), to public speakers, including bad profs, or to anyone who sees the value in being able to sell themselves.