Skinput Turns Your Arm Into A Remote
PCs continue to get more powerful, and smartphones keep gaining functionality that would have been hard to imagine just a few years ago. But the mainstream pays little attention to how we control all the gizmos we encounter in our lives. The mouse-and-keyboard combo has been the go-to tandem for years now, with the touchpad as the standard fallback for controlling a cursor when space is at a premium. Brain-control interfaces remain the domain of universities and labs, while anything more elaborate than a multi-touch display is seen as too wild for Joe Six Pack.
Hopefully, the tide is turning. Over the past couple of years, we have seen engineers at TED detail a "Sixth Sense" type of device that would intertwine the digital and "real" worlds, providing a heads-up view of additional information related to whatever we happen to be looking at. Now, a team from Carnegie Mellon University and Microsoft is working to make something you already have plenty of more useful when it comes to controlling devices. "Skinput" is the fitting name for a new interaction design aimed at using your skin as an input surface, primarily for mobile devices.

The technology marries two main systems: the "ability to detect the ultralow-frequency sound produced by tapping the skin with a finger, and the microchip-sized 'pico' projectors now found in some cellphones." The solution projects an image of a keyboard or a set of icons onto your arm, and a tap on the skin triggers whatever icon is displayed where your finger lands. An acoustic detector integrated into an armband is also a vital part of making it all work, but it's still very early in the development process. The system is currently a prototype, but we could definitely see this growing into the next great cellphone or iPod remote, or maybe even the television remote you don't have to grab. We would hope the designers could force inputs to be recognized only from your own fingers, though; you wouldn't want your kids jumping all over you in a fight to change the channel, would you?
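To make the idea a little more concrete, here is a minimal, purely illustrative sketch of the kind of signal processing an armband like this might do: record a short acoustic window when a tap is detected, reduce it to coarse band-energy features per sensor, and match those features against per-button profiles learned during a quick calibration. The sensor count, sample rate, feature choice, and nearest-centroid classifier below are all assumptions for demonstration, not the researchers' actual pipeline, and the tap data is synthetic.

```python
import numpy as np

# Hypothetical sketch of on-skin tap classification (not the real Skinput code).
# Assumed setup: a few acoustic sensors in an armband, a handful of projected
# "buttons" on the forearm, and a short recording window captured per tap.

RNG = np.random.default_rng(0)
SAMPLE_RATE = 5500   # Hz; skin-borne tap transients are low-frequency
WINDOW = 550         # ~100 ms analysis window per tap
N_SENSORS = 5        # assumed number of armband acoustic sensors
N_LOCATIONS = 3      # assumed projected buttons, e.g. play / pause / next

def band_energies(window: np.ndarray, n_bands: int = 6) -> np.ndarray:
    """Reduce each sensor's spectrum to a few coarse band energies."""
    spectrum = np.abs(np.fft.rfft(window, axis=-1)) ** 2
    bands = np.array_split(spectrum, n_bands, axis=-1)
    return np.concatenate([b.sum(axis=-1) for b in bands], axis=-1)

def synthetic_tap(location: int) -> np.ndarray:
    """Fake multi-sensor recording whose character depends on tap location."""
    t = np.arange(WINDOW) / SAMPLE_RATE
    ring = np.sin(2 * np.pi * (60.0 + 40.0 * location) * t) * np.exp(-t * 30)
    sensors = np.stack([ring * (0.5 + 0.1 * s * location) for s in range(N_SENSORS)])
    return sensors + 0.05 * RNG.standard_normal(sensors.shape)

# "Calibration": record a few labelled taps per button and keep mean features.
centroids = np.stack([
    np.mean([band_energies(synthetic_tap(loc)) for _ in range(20)], axis=0)
    for loc in range(N_LOCATIONS)
])

def classify(window: np.ndarray) -> int:
    """Nearest-centroid guess at which projected button was tapped."""
    features = band_energies(window)
    return int(np.argmin(np.linalg.norm(centroids - features, axis=1)))

# Quick check on unseen synthetic taps.
hits = sum(classify(synthetic_tap(loc)) == loc
           for loc in range(N_LOCATIONS) for _ in range(10))
print(f"correct: {hits}/30")
```

The point of the sketch is simply that taps at different spots on the arm produce measurably different acoustic signatures, so a short per-user calibration plus a lightweight classifier is enough, in principle, to map a finger press to a projected icon.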