We've assumed that gesture control will be the wave of the future, if you'll pardon the pun. We also assumed it would be perfected by developers tweaking camera-based information. But now Elliptic Labs, a spinoff of a research outfit at Norway's University of Oslo, has developed technology that reads gestures via sound. Specifically, ultrasound.
In a weird way this is somewhat tied to Norway's oil boom. In addition to the medical applications of ultrasound, Norwegian companies have been using it for seismic applications, like scouring the coastline for oil deposits. Elliptic Labs emerged from the Norwegian "ultrasonics cluster" that popped up to support those industrial needs, and the eggheads at Elliptic subsequently figured out how to use echolocation on a micro scale to read your hand's position in space.
With Elliptic Labs' gesture recognition technology the entire zone above and around a mobile device becomes interactive and responsive to the smallest gesture. The active area is 180 degrees around the device, and up to 50 cm with precise distance measurements made possible by ultrasound... The interaction space can also be customized by device manufacturers or software developers according to user requirements.
Using a small ultrasound speaker, a trio of microphones and clever software, a smartphone (or anything larger) can be programmed to detect your hand's location in 3D space with a higher "resolution" (read: accuracy) than cameras, while using only a minuscule amount of power. And "Most manufacturers only need to install the ultrasound speaker and the software in their smartphones," reckons the company, "since most devices already have at least 3 microphones."
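The basic ranging idea behind this kind of echolocation is simple enough to sketch: emit an ultrasound pulse, time how long the echo off your hand takes to reach a microphone, and convert that round trip into a distance. The snippet below is a hypothetical illustration of that principle only, not Elliptic Labs' actual algorithm (their software presumably fuses echoes across all three microphones to resolve full 3D position).

```python
# Hypothetical sketch of ultrasonic echo ranging -- illustrative only,
# not Elliptic Labs' actual MLI algorithm.

SPEED_OF_SOUND = 343.0  # meters per second in air at roughly 20 degrees C


def echo_distance(round_trip_s):
    """Distance to a reflecting hand, given the round-trip time (seconds)
    of an ultrasound pulse from speaker to hand and back to a microphone.
    The pulse covers the distance twice, hence the division by two."""
    return SPEED_OF_SOUND * round_trip_s / 2.0


# An echo arriving ~2.9 ms after the pulse puts the hand about half a
# meter away, roughly the 50 cm interaction zone described above.
print(round(echo_distance(0.00292), 3))
```

With three microphones at known positions, three such time-of-flight measurements constrain the hand's location in 3D, which is presumably how the precise distance measurements mentioned above are achieved.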
The demo of the technology, which they're calling Multi Layer Interaction, looks pretty darn cool:
The company is making a concerted push to reach out to technology companies that could integrate MLI into their devices, having expanded out of Norway to open offices in both Palo Alto and Shanghai. They've also released an SDK for developers ready to give MLI a go, so with any luck we'll see it popping up on devices in the near future.
Via Talk Android
Comments
However, I would love to be able to interact with my crappy tech without a friggin mouse and keyboard. Carpal tunnel sets in fast for those of us trying to create artwork in a digital format.
It would need to be more or less optional, as well as subtle. But that raises the problem of interface false positives. A delicate situation for mass application.
Gestural, physical movement requiring you to hold appendages in midair is not efficient.
I would rather go down the route of interfaces being used with eyes. Now that's futuristic!