Touchscreens may be all the rage in the mobile gadget universe at the moment, but the next evolution of the touch interface could be your own arm.
No, really.
Microsoft and Carnegie Mellon University have been working on something they call "Skinput", which uses sensors in an armband to track where a user taps on his own skin. Rather than relying on motion tracking, Skinput measures the vibrations produced when a finger taps the arm, palm or fingertips, each of which produces an acoustically distinct signature.
Project team member Chris Harrison, from Carnegie Mellon's Human-Computer Interaction Institute, told TechDailyNews that the team has mapped ten distinct points on the arm that the sensor software can tell apart - more than enough for controlling most mobile devices - with accuracy close to 95%, even while the user is jogging.
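For the curious, here's a rough sense of what "telling tap locations apart" means in software. This is not the actual Skinput code - the feature values, location names and the nearest-centroid approach below are all invented for illustration - but it sketches the general idea: each tap yields a vibration feature vector, and the software picks the known location whose typical signature it most resembles.

```python
import math

# Hypothetical training data: one averaged feature vector per tap location.
# In a real system these might be band energies from the armband's vibration
# sensors; the numbers here are invented purely for illustration.
CENTROIDS = {
    "wrist":   [0.9, 0.2, 0.1],
    "forearm": [0.4, 0.7, 0.2],
    "palm":    [0.1, 0.3, 0.8],
}

def classify_tap(features):
    """Return the tap location whose centroid is nearest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda loc: dist(CENTROIDS[loc], features))

# A tap whose vibration features resemble the "palm" signature:
print(classify_tap([0.15, 0.25, 0.75]))  # -> palm
```

The real system would need far richer features and a trained classifier to hit ten locations at ~95% accuracy, but the principle - match an incoming vibration signature against known per-location signatures - is the same.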
The technology is several years away from commercial viability, but the research team envisions a wristwatch-like device that links with any compatible mobile device, effectively making the arm an extension of a mobile screen's limited real estate - with a pico projector beaming a GUI onto the arm to control the device.
NEXT: Next-gen touchscreens Part 2: Minput