Google Project Soli Controls A Smartwatch With Radar-Guided Hand Gestures

Out of nowhere, Google has unveiled a project that's bound to make the industry at large stand up and take notice. Its new "Soli" platform is exactly the kind of project we need to spearhead a more "futuristic" tomorrow, one where we can simply wave our hands in the air to take care of business on mobile devices, or virtually anywhere else.

The brains behind Soli are hidden in a small chip that can be built into a variety of devices; it uses a miniature radar operating in the 60GHz ISM band to detect touchless gestures. Imagine changing the time or the volume simply by rubbing your index finger and thumb together. That's Soli in a nutshell.
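To make that idea a little more concrete, here's a minimal sketch of how an app might react to one of those micro-gestures. Soli's actual SDK isn't shown here, so the GestureEvent type, the "thumb_rub" label, and the delta field are purely hypothetical stand-ins for whatever the radar pipeline would report.

```python
# Hypothetical sketch: the GestureEvent type and the "thumb_rub" event name
# are assumptions for illustration, not Soli's real API.
from dataclasses import dataclass


@dataclass
class GestureEvent:
    name: str      # assumed gesture label, e.g. "thumb_rub"
    delta: float   # signed magnitude of the motion, normalized to [-1, 1]


class VolumeControl:
    def __init__(self, level: float = 0.5) -> None:
        self.level = level  # 0.0 = mute, 1.0 = max

    def handle(self, event: GestureEvent) -> None:
        # A forward rub nudges the volume up, a backward rub nudges it down.
        if event.name == "thumb_rub":
            self.level = min(1.0, max(0.0, self.level + 0.05 * event.delta))


if __name__ == "__main__":
    volume = VolumeControl()
    # Simulate a few rub gestures streaming in from the gesture recognizer.
    for delta in (1.0, 1.0, -0.5):
        volume.handle(GestureEvent("thumb_rub", delta))
        print(f"volume: {volume.level:.2f}")
```

The point of the sketch is simply that the radar and recognition work happens on the chip, while the app only has to map a stream of small gesture events onto whatever control it exposes.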

Various Soli hardware prototypes from 2014 and 2015

In a quick video, we can see people using Soli for object recognition (which, based on the video, is very accurate), 3D imaging, predictive drawing, in-car remote control, security, visualization, and even music, where you hover your hand over a Soli device to create different tones and melodies.
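The music demo is easy to picture in code terms: the sensor reports how far your hand is hovering above it, and that distance gets mapped onto a pitch. The sketch below assumes a simple range reading in centimeters and a two-octave mapping; both are illustrative guesses, not details from the video.

```python
import math


# Hypothetical sketch: the distance readings stand in for whatever range
# estimate a Soli-style radar would report for a hovering hand, and the
# pitch mapping is an assumption chosen purely for illustration.
def distance_to_frequency(distance_cm: float,
                          near_cm: float = 5.0,
                          far_cm: float = 30.0) -> float:
    """Map hand height above the sensor onto a two-octave pitch range."""
    # Clamp the reading to the usable hover range.
    d = min(max(distance_cm, near_cm), far_cm)
    # Closer hand -> higher pitch, spanning 220 Hz (A3) up to 880 Hz (A5).
    t = 1.0 - (d - near_cm) / (far_cm - near_cm)
    return 220.0 * math.pow(2.0, 2.0 * t)


if __name__ == "__main__":
    for reading in (28.0, 20.0, 12.0, 6.0):  # simulated hand heights in cm
        print(f"{reading:5.1f} cm -> {distance_to_frequency(reading):6.1f} Hz")
```

Lowering your hand toward the sensor sweeps the output smoothly up the scale, which is roughly the theremin-like effect the demo shows.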

The reason Soli is said to be so accurate is that it's purpose-built for these kinds of interactions. And while it's hard to gauge Soli's real performance from a canned demo, it does look like this simple 8mm x 10mm package (which includes both the sensor and antennas) can offer some seriously accurate interactivity.

As with most young projects, there's no telling when we can expect Soli to make a big impact on the market, but if you're keenly interested in working with Soli, you should hit up the URL below and sign up for the official newsletter. We'd also encourage checking out the video above if you want to see some real-world uses where Soli could benefit us.