Touchless gestures powered by ultrasound have become a mark of distinction for Elliptic Labs. The company’s new “Multi Layer Interaction” technology is designed to make device interactions more intuitive.
As a person’s hand moves toward the smartphone, without touching it, the screen lights up and information is displayed; as the hand moves closer, different information is revealed. With users reaching for their devices constantly throughout the day, Elliptic Labs aims to offer an easier, faster way to get information, whether playing games, navigating maps, using social media, or watching videos. A promotional video shows the user interacting above, in front of, and underneath the device, double-tapping anywhere around it, and easily turning it on and off. An SDK is available for application developers.

How it works: ultrasound signals sent through the air from the speakers integrated in smartphones and tablets bounce off the hand and are recorded by the devices’ integrated microphones. From these echoes, the technology recognizes hand gestures and uses them to move objects on the screen, much as bats use echolocation to navigate.
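To make the echolocation idea concrete, here is a minimal sketch (not Elliptic Labs’ actual implementation) of how a round-trip echo delay can be turned into a distance estimate. The sample rate, pulse frequency, and signal shapes are illustrative assumptions:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C
SAMPLE_RATE = 48_000    # Hz; a typical mobile audio rate (assumed)

def estimate_distance(emitted: np.ndarray, recorded: np.ndarray) -> float:
    """Estimate hand distance from the round-trip delay of an echo.

    Cross-correlates the microphone signal with the emitted pulse to
    find the echo's arrival lag, then converts that delay into a
    one-way distance (half the round trip).
    """
    corr = np.correlate(recorded, emitted, mode="valid")
    delay_samples = int(np.argmax(np.abs(corr)))
    delay_seconds = delay_samples / SAMPLE_RATE
    return SPEED_OF_SOUND * delay_seconds / 2.0

# Synthetic demo: a 21 kHz tone burst echoed back after 280 samples,
# i.e. ~5.8 ms round trip, ~2 m of path, so ~1 m to the reflector.
t = np.arange(200) / SAMPLE_RATE
pulse = np.sin(2 * np.pi * 21_000 * t)
recorded = np.zeros(2000)
recorded[280:280 + len(pulse)] += 0.3 * pulse  # attenuated echo
recorded += 0.01 * np.random.default_rng(0).standard_normal(len(recorded))

print(f"estimated distance: {estimate_distance(pulse, recorded):.2f} m")
```

In a real device the same principle would be applied continuously, with gesture recognition layered on top of the distance and motion estimates.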
The company also highlights range-gating capabilities, saying that its touchless gesturing technology can easily separate foreground from background: finger motion from wrist motion, and hand motion from movements or reflections of the body. This prevents unwanted and accidental gestures from being recognized. Overall, the company believes that “Ultrasound offers the best combination of high resolution, 180-degree interaction space, and low power consumption compared to camera or other sensing technologies.” Ultrasound processing runs on an ultra-low-power audio SoC, such as Wolfson audio hubs, and the company has formed partnerships with Murata Manufacturing and Wolfson Microelectronics.
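Range gating can be illustrated with a short sketch: since each sample of a recorded echo corresponds to a round-trip delay, echoes outside a chosen distance window can simply be zeroed out. This is a generic sketch of the technique, not Elliptic Labs’ implementation; the constants are assumptions:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
SAMPLE_RATE = 48_000    # Hz (assumed)

def range_gate(echo: np.ndarray, min_m: float, max_m: float) -> np.ndarray:
    """Zero out echo samples outside the distance window of interest.

    Each sample index implies a round-trip delay; keeping only samples
    whose implied one-way distance lies in [min_m, max_m] discards
    reflections from the body or the background.
    """
    idx = np.arange(len(echo))
    dist = SPEED_OF_SOUND * idx / SAMPLE_RATE / 2.0  # one-way distance
    gate = (dist >= min_m) & (dist <= max_m)
    return np.where(gate, echo, 0.0)

# Demo: a near reflection (hand at ~5 cm) and a far one (body at ~40 cm);
# gating to 0-20 cm keeps the hand echo and suppresses the body echo.
echo = np.zeros(200)
hand_idx = int(2 * 0.05 * SAMPLE_RATE / SPEED_OF_SOUND)  # ~13 samples
body_idx = int(2 * 0.40 * SAMPLE_RATE / SPEED_OF_SOUND)  # ~111 samples
echo[hand_idx] = 1.0
echo[body_idx] = 0.8
gated = range_gate(echo, 0.0, 0.20)
```

The same windowing idea is what lets a gesture system ignore a moving torso behind the hand: the torso’s echoes arrive later and fall outside the gate.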