Gesture controls can make using technology feel more natural than even voice recognition. But today’s gesture tracking needs depth sensors backed by powerful computers to work, making it impractical to add gesture controls to all but a few devices.
Clay is an SDK that can track a hand’s 3D position, rotation and gesture with a regular CMOS camera, like the ones on virtually every smartphone on the market, and a 64-bit processor as capable as the one in an iPhone 5s. That means virtually any device, from smart hubs to car dashboards to retail signage, can get Minority Report-style gesture controls with little to no change to its hardware—including devices already on the market.
This is possible thanks to an adaptive AI that was 10 years in the making. After a quick calibration process, it learns to distinguish a user’s hands from the background and to track their 3D position and pose entirely from 2D information. Most importantly, this process is fast and efficient; on an iPhone 7, it takes roughly 12 milliseconds to process a gesture from sensor input to the app, using just 9% of CPU power.
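Clay’s actual API is not shown here, but camera-based gesture SDKs of this kind typically hand each processed frame’s result to the app through a callback. The sketch below is a toy illustration of that per-frame pipeline, including the latency measurement the 12-millisecond figure refers to; all class, gesture, and field names are hypothetical, not Clay’s.

```python
import time

class GestureEvent:
    """Hypothetical per-frame result: what a 2D-camera tracker might report."""
    def __init__(self, gesture, position, rotation):
        self.gesture = gesture    # e.g. "swipe_left", "pinch"
        self.position = position  # hand position (x, y, z), meters
        self.rotation = rotation  # hand pose (pitch, yaw, roll), degrees

class GestureTracker:
    """Toy stand-in for an SDK that turns 2D frames into 3D hand events."""
    def __init__(self):
        self._handlers = []

    def on_gesture(self, handler):
        # Apps register a callback; the SDK invokes it per recognized gesture.
        self._handlers.append(handler)

    def process_frame(self, frame):
        # A real tracker would run its model on the frame here;
        # this sketch fabricates one detection to show the data flow.
        start = time.perf_counter()
        event = GestureEvent("swipe_left", (0.1, 0.0, 0.4), (0, 15, 0))
        for handler in self._handlers:
            handler(event)
        return (time.perf_counter() - start) * 1000.0  # latency in ms

tracker = GestureTracker()
seen = []
tracker.on_gesture(lambda e: seen.append(e.gesture))
latency_ms = tracker.process_frame(frame=None)
print(seen[0])  # swipe_left
```

The callback style matters for the efficiency claim: the app never polls the camera itself, so the SDK can batch model inference per frame and report only finished gesture events.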
“Gestures are the single most natural way of communicating for humans, more so than even voice,” says Thomas Amilien, CEO and co-founder of Clay, creators of the eponymous SDK. “Hardware constraints were the only thing holding us back from controlling our devices that way. With CMOS sensors costing less than a dollar at scale, those constraints are now gone.”
For consumers, that means gesture recognition could catch up to and even surpass the spread of voice recognition on phones, smart hubs and everywhere else. Clay is already being integrated into cars by a major manufacturer so that drivers can control their dashboard without having to look away from the road to find a button.
For manufacturers, the ubiquity of cameras on devices is as important as their price. Before Clay, prototyping gesture recognition for devices meant creating entirely new board layouts, adding complexity and slowing time-to-market. By using cameras that are already installed on most devices, manufacturers and OEMs can prototype gesture recognition on any such hardware in a matter of weeks with little risk. It also means such features can be deployed to existing devices, or even via apps from third parties.
Clay technology also has implications for virtual reality and augmented reality, where the user needs to interact with virtual objects in a natural way. It provides gesture recognition comparable to a $3,000 Microsoft HoloLens on any modern phone. Clay also works with Apple’s ARKit, which makes integrating these features into iPhone AR apps fast and simple.
Here are a few smart products that respond to hand gestures:
Canadian company Ubiquilux has developed a solution that, for less than $80, lets you operate the lights in your house without ever touching the switch. All you need to do is wave your hand in front of it.
The e-Motion dimmer, available for $79, offers a natural, intuitive and convenient way to control lighting. Because you never touch the switch, it stays free of dirt and smudges and reduces the spread of bacteria. Activated by a gesture from as far as six inches away, it also offers easier access for people with limited mobility or arthritis.
You can wave your hand at the singlecue Gen 2 to change the setting of a Nest thermostat, turn on your TV, and more.
Bixi is a small remote that senses your in-air hand gestures and commands your favorite smartphone apps, LIFX and Hue bulbs, internet speakers, GoPro cameras and many other IoT devices.
The Swipe from Fibaro is a slick-looking touch panel that attaches to the wall, from which you can command lights, thermostats, security devices, and more. But unlike a traditional touch panel that responds to a tap of a finger on its display, the Swipe reacts to six different hand gestures. Wave your hand left to right across the screen and the Swipe issues a command to a group of lights, for example. Move your hand in a circle, and a different series of commands can be dispatched.
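The Swipe’s six-gesture scheme boils down to a lookup from recognized gesture to device command. As a rough sketch of that mapping (gesture names and commands below are illustrative, not Fibaro’s actual API):

```python
# Illustrative dispatch table for a six-gesture controller.
# All gesture and command names are made up for this sketch.
SWIPE_COMMANDS = {
    "swipe_left":  "living_room_lights_off",
    "swipe_right": "living_room_lights_on",
    "swipe_up":    "thermostat_up",
    "swipe_down":  "thermostat_down",
    "circle_cw":   "movie_scene",
    "circle_ccw":  "all_off",
}

def dispatch(gesture):
    """Return the command mapped to a recognized gesture, or None."""
    return SWIPE_COMMANDS.get(gesture)

print(dispatch("circle_cw"))  # movie_scene
```

A table like this is why a single panel can drive lights, thermostats and security devices at once: each gesture triggers a whole scene rather than a single switch.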