Computer science professor Patrick Baudisch, leading a team of researchers at Germany's Hasso Plattner Institute, has developed a gesture-based interface that turns the user's palm into an input device. A wearable camera tracks the position of your fingers in space, and you tap out commands on your palm as if it were the surface of, say, your iPhone:
It's similar to MIT's wearable "Sixth Sense" interface design from 2009, but, as the MIT Technology Review has pointed out, Baudisch's system differs in that there's no projected feedback: you'd have to look at your actual phone, placed nearby, for visual cues, and the gestures you perform rely on your memory of where the icons sit on the phone's screen.