Intel kicked off the third and final day of IDF 2010 with a keynote detailing the future evolution of devices it refers to as "contextually aware." This is far from the first time Intel has put a spotlight
on the future of device design; we've covered several announcements in this vein over the past 12 months. According to Justin Rattner, CTO and Senior Fellow at Intel, context awareness will change the nature of how we interact with devices.
Intel's goal is to create devices that can function more like personal assistants and less like PDAs. These theoretical gadgets would gather and process both hard data (physical location, outside conditions) and soft data (calendar entries, social networking activity, and your preference history), then use that context to advise you. Intel offers several examples of how such a mobile computing platform could be used, a few of which are detailed below:
Personal Vacation Assistant:
The PVA is a mobile Internet device prototype that uses a variety of context sources such as personal travel preferences, previous activities, current location and calendar information to provide real-time travel recommendations to vacationers...The PVA can even generate, at the user's request, a travel blog with annotated photos and
videos of sites visited during the trip.
Sensing Human Gait:
More than a third of adults over 65 in the United States fall each year. Falls are the most common cause of nonfatal injuries and hospital admissions for trauma. Through the Technology Research for Independent Living Centre, Intel is researching how wireless sensors attached to clothing on people's legs can help predict the likelihood of a serious fall. The prediction is inferred from movement-sensor data capturing how people move their feet, the strength of their heel strikes, and how their muscles move. This could help elderly people in their daily lives, e.g. by predicting [a] fall before it happens or calling for help in case it does happen.
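To make the gait idea concrete, here is a minimal sketch of how sensor readings might be reduced to a fall-risk estimate. This is not Intel's actual model; the feature choices (stride-time variability, heel-strike strength) and weights are invented for illustration.

```python
def fall_risk_score(stride_times, heel_strike_g):
    """Toy gait-analysis heuristic (illustrative only, not Intel's model):
    irregular stride timing and weak heel strikes both raise the score.

    stride_times  -- seconds between consecutive steps
    heel_strike_g -- peak acceleration (in g) of each heel strike
    """
    n = len(stride_times)
    mean = sum(stride_times) / n
    variance = sum((t - mean) ** 2 for t in stride_times) / n
    variability = (variance ** 0.5) / mean  # coefficient of variation

    # Fraction of heel strikes below an assumed "weak" threshold of 1.0 g
    weak = sum(1 for g in heel_strike_g if g < 1.0) / len(heel_strike_g)

    # Arbitrary weighting, clamped to [0, 1]
    return min(1.0, 0.7 * variability * 10 + 0.3 * weak)

# Steady gait with firm heel strikes vs. irregular gait with weak strikes
print(fall_risk_score([1.0, 1.01, 0.99], [1.4, 1.5, 1.3]))
print(fall_risk_score([0.8, 1.3, 0.9], [0.6, 0.8, 0.7]))
```

A real system would learn these weights from labeled data rather than hard-coding them, but the pipeline shape (raw sensor stream, extracted gait features, risk score) is the same.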
Sensing Human Thoughts:
Intel is collaborating with Carnegie Mellon University on machine-learning technology aimed at decoding data directly from the human brain. Based on this
promising research, computers could one day have the ability to decode what we're thinking, enabling users to control and direct computers, robots, or other devices with nothing but
their thoughts. (Does anyone else think this last idea is really, really bad? —Ed)
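The CMU work involves training machine-learning classifiers to map patterns of brain activity to concepts. Stripped to its essentials, a decoder of this kind can be as simple as a nearest-centroid classifier over feature vectors; the data and labels below are invented stand-ins for real brain-signal features.

```python
# Toy nearest-centroid "decoder": classify a feature vector (standing in
# for preprocessed brain-signal features) by the closest class average.
# All vectors and labels here are fabricated for illustration.

def centroid(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def decode(sample, training):
    """training maps a label to its example vectors; return nearest label."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    centroids = {label: centroid(vs) for label, vs in training.items()}
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

training = {
    "hammer": [[0.9, 0.1], [1.0, 0.2]],
    "house":  [[0.1, 0.9], [0.2, 1.0]],
}
print(decode([0.85, 0.15], training))  # nearest to the "hammer" centroid
```

The hard part of real brain decoding is not the classifier but the signal acquisition and feature extraction that produce usable vectors in the first place.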
Contextual Awareness means your laptop knows where you are and what you need to do before you even ask.
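The hard-data/soft-data split Intel describes can be sketched in a few lines: sensed facts and inferred preferences feed a single decision function. The field names and rules below are hypothetical, chosen only to show the shape of such a system.

```python
from dataclasses import dataclass

@dataclass
class Context:
    # "Hard" data: directly sensed
    location: str
    weather: str
    # "Soft" data: inferred from calendar and preference history
    next_event: str
    likes_outdoors: bool

def advise(ctx: Context) -> str:
    """Combine hard and soft context into one recommendation (toy rules)."""
    if ctx.weather == "rain":
        return f"Rain near {ctx.location}: leave early for '{ctx.next_event}'."
    if ctx.likes_outdoors:
        return f"Clear skies in {ctx.location}: walk to '{ctx.next_event}'."
    return f"Reminder: '{ctx.next_event}' is coming up."

print(advise(Context("Lisbon", "rain", "museum tour", True)))
```

A production assistant would replace the hand-written rules with learned models, but the input split (sensors on one side, user history on the other) matches Intel's description.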
Intel gives a few additional examples; check the PDF for more details. Intel also took this opportunity to discuss the goals of its Interaction and Experience Research (IXR) lab. According to the company, this new project has four central pillars: social science research, design enabling, technology research, and future-casting. Save for technology research (defined as building on Intel's expertise in facial recognition and data visualization), these pillars more closely resemble smoke plumes. Future-casting, for example, is described as "a different way of thinking about the future," which seems vague for a core pillar of an institute.
Intel then declares: "Helping context awareness means paying attention to everyone's daily lives and the places and ways we make meaning in our lives."
Translation: The IXR is tasked with making simpler, smarter, and better stuff. Intel's examples of how smarter gadgets could improve lives seem a bit hit-and-miss—part of going on vacation means NOT taking gadgets along—but the medical benefits could be enormous. Although we joked about controlling computers by mental command, such capabilities could be used by certain paralyzed or quadriplegic individuals to communicate far more effectively than they can today. Intel is doing some very interesting work in this field, but we don't think the personal assistants of the world need to look over their shoulders just yet.