We last saw Elliptic Labs a year ago showing off a way to use gestures with a phone. Now the company has improved its product, making it work with multiple gesture layers. That means you can do different things with a smartphone based on how far away your hand is from the device.
Elliptic Labs CEO Laila Danielsen described it this way: “Without touching the device, as your hand moves towards your phone for example, the screen lights up and information is displayed. As you continue to move your hand closer, different information is revealed. It’s all about improving the user experience by presenting easier ways to interact with mobile devices.”
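To make the layered idea concrete, here’s a minimal sketch of how distance-based gesture layers might map to UI states. The thresholds, layer names and the `layerForDistance` function are illustrative assumptions, not Elliptic Labs’ actual implementation.

```kotlin
// Hypothetical sketch of "gesture layers": mapping the hand's distance
// from the device to progressively richer UI states. The thresholds and
// layer names are illustrative, not Elliptic Labs' actual values.

enum class GestureLayer { IDLE, GLANCE, DETAIL, TOUCH_READY }

fun layerForDistance(distanceCm: Double): GestureLayer = when {
    distanceCm > 30.0 -> GestureLayer.IDLE        // hand far away: screen stays dark
    distanceCm > 15.0 -> GestureLayer.GLANCE      // hand approaching: light up, show the time
    distanceCm > 5.0  -> GestureLayer.DETAIL      // hand closer: reveal notifications
    else              -> GestureLayer.TOUCH_READY // hand near: full UI, ready for touch
}

fun main() {
    listOf(40.0, 20.0, 10.0, 3.0).forEach { d ->
        println("hand at $d cm -> ${layerForDistance(d)}")
    }
}
```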
Here’s a demonstration video that illustrates what Elliptic Labs is capable of now.
[vimeo http://vimeo.com/108103227]
There have been several handsets launched that use either infrared sensors or, as in the case of Elliptic Labs, ultrasound waves. This can be helpful for viewing data without touching the screen or for scrolling through web pages, photos and more. The feature I liked best in the demo video, however, was the playback controls appearing as a hand moved closer to the display during a movie. It’s as if the phone anticipates that you’ll want to hit pause or lower the audio volume.
Elliptic Labs has an SDK so device makers and app developers can integrate with the ultrasound sensors. So it’s up to the company to convince equipment manufacturers to include its technology in phones, tablets and other mobile devices.
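The article doesn’t detail the SDK’s API, but a hypothetical integration, with the sensor pushing distance readings to an app that reveals playback controls as a hand approaches, might look something like this. `UltrasoundSensor`, `ProximityListener` and the 15 cm threshold are all invented for illustration.

```kotlin
// Purely illustrative sketch of how an app might consume proximity events
// from an ultrasound gesture SDK to show playback controls before the user
// touches the screen. UltrasoundSensor and ProximityListener are hypothetical
// names, not the actual Elliptic Labs API.

fun interface ProximityListener {
    fun onHandDistance(distanceCm: Double)
}

class UltrasoundSensor {
    private val listeners = mutableListOf<ProximityListener>()

    fun register(listener: ProximityListener) { listeners += listener }

    // A real SDK would push measurements from the hardware; here we simply
    // simulate a hand moving toward the device.
    fun simulateApproach() = listOf(40.0, 25.0, 12.0, 4.0)
        .forEach { d -> listeners.forEach { it.onHandDistance(d) } }
}

class VideoPlayerUi {
    private var controlsVisible = false

    // Show the pause/volume overlay once the hand is within an assumed 15 cm.
    fun onHandDistance(distanceCm: Double) {
        val shouldShow = distanceCm <= 15.0
        if (shouldShow != controlsVisible) {
            controlsVisible = shouldShow
            println(
                if (shouldShow) "hand at $distanceCm cm -> show playback controls"
                else "hand at $distanceCm cm -> hide playback controls"
            )
        }
    }
}

fun main() {
    val sensor = UltrasoundSensor()
    val player = VideoPlayerUi()
    sensor.register(player::onHandDistance)
    sensor.simulateApproach()
}
```

The listener pattern here is just one plausible design: it keeps the app’s UI code separate from the sensing hardware, which is roughly the kind of separation an SDK aimed at both device makers and app developers would need to offer.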