Google’s Project Soli – bringing gesture control to wearables
Catching everyone’s attention at Google I/O, Google’s ATAP division teased its session with talk of a wearable that would ‘literally’ blow your socks off. The division, tasked with coming up with cool new things, has been working on what it calls Project Soli – a wearable that is not a watch (as you may think it to be). It’s you!
Understanding that your hands are the best means you have of interacting with devices, and that not everything can be a device, Project Soli aims to make your hands and fingers the ONLY user interface you’ll ever need.
How is this actually possible? Let’s take a brief look at Project Soli.
Project Soli is really a miniature radar, small enough to fit easily into a wearable like a smartwatch. The radar picks up your movements in real time, and those movements alter its signal. Even at rest, the hand is actually moving, though not visibly to the naked eye; this slight movement shows up as a baseline response on the radar. Crossing or moving your fingers changes the signal as well.
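To get a feel for why even barely visible finger motion registers on the radar, here is a minimal back-of-the-envelope Python sketch. The 60 GHz carrier is an assumption based on public descriptions of Soli-style millimetre-wave radar, not a figure from this article.

```python
# Back-of-the-envelope sketch: Doppler shift of a slowly moving fingertip.
# The 60 GHz carrier is an assumption; exact radar parameters are not
# given in the article.

C = 3.0e8            # speed of light, m/s
CARRIER_HZ = 60e9    # assumed millimetre-wave carrier
WAVELENGTH_M = C / CARRIER_HZ  # ~5 mm

def doppler_shift_hz(radial_velocity_m_s: float) -> float:
    """Doppler shift of a reflector moving toward or away from the radar."""
    return 2.0 * radial_velocity_m_s / WAVELENGTH_M

# Even a fingertip creeping at 1 cm/s shifts the reflected signal by ~4 Hz,
# which is resolvable over a fraction of a second of observation.
print(f"{doppler_shift_hz(0.01):.1f} Hz")  # 4.0 Hz
```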
To make these signals meaningful to an app or service, ATAP will provide APIs that tap into Project Soli’s deep machine learning, which is still in its early days. Even so, it has us excited. Hands-free is one thing; turning your hands into the user interface is quite another. And it could well turn out to be way cooler than voice control ever was.
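To illustrate what such an API might look like from an app’s point of view, here is a purely hypothetical Python sketch. The real Soli APIs had not been published at the time of writing, so every name here (SoliSession, GestureEvent, on_gesture) is invented for illustration.

```python
# Hypothetical sketch only: the real Soli APIs were not public at the time of
# writing, so every name here (SoliSession, GestureEvent, on_gesture) is
# invented to illustrate how an app might consume recognized gestures.

from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class GestureEvent:
    name: str          # e.g. "button_tap", "dial_turn"
    confidence: float  # classifier confidence in the range 0..1

class SoliSession:
    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[GestureEvent], None]] = {}

    def on_gesture(self, name: str, handler: Callable[[GestureEvent], None]) -> None:
        """Register a callback for a named gesture."""
        self._handlers[name] = handler

    def dispatch(self, event: GestureEvent) -> None:
        """Invoke the handler when the recognizer is sufficiently confident."""
        handler = self._handlers.get(event.name)
        if handler is not None and event.confidence > 0.8:
            handler(event)

# Usage: treat a thumb-to-index-finger tap like pressing a button.
session = SoliSession()
session.on_gesture("button_tap", lambda e: print("button pressed"))
session.dispatch(GestureEvent("button_tap", 0.95))  # simulated recognizer output
```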
Hand motion vocabulary
Project Soli replaces the physical controls of smartwatches with hand gestures, using a broad-beam radar to capture your movements. The system measures the Doppler image, IQ signal and spectrogram, and the chip recognizes motion, velocity and distance and can be preset to change the input based on that distance. The idea is to take some of the gestures you already use to interact with your mobile devices and enhance them with hand motions that feel natural on a Soli-friendly device. Proponents say this hand-motion vocabulary makes user interfaces much more intuitive and easy to use, and opens up new avenues for designers and developers to deliver far better user experiences.
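As a rough, assumption-laden illustration of the quantities named above, the following Python sketch turns complex IQ samples into a simple Doppler spectrogram and shows a distance-based preset that changes how input is interpreted. The window size and mode names are invented; only the 10,000 frames-per-second figure comes from the article.

```python
# Sketch under stated assumptions: Soli's real processing is not public. This
# only illustrates the ideas named above: complex IQ samples -> a Doppler
# spectrogram, plus a distance-based preset that changes the input mode.

import numpy as np

FRAME_RATE_HZ = 10_000  # frames per second, the figure cited for Soli
WINDOW = 256            # samples per spectrogram column (assumed)

def doppler_spectrogram(iq: np.ndarray) -> np.ndarray:
    """Short-time FFT of complex IQ samples; each row is one Doppler profile."""
    n_cols = len(iq) // WINDOW
    frames = iq[: n_cols * WINDOW].reshape(n_cols, WINDOW)
    return np.abs(np.fft.fftshift(np.fft.fft(frames, axis=1), axes=1))

def input_mode(distance_m: float) -> str:
    """Preset: interpret input differently depending on hand distance."""
    return "fine_fingertip" if distance_m < 0.10 else "broad_hand"

# Usage with a synthetic 40 Hz Doppler tone standing in for a real capture.
t = np.arange(2048) / FRAME_RATE_HZ
iq = np.exp(2j * np.pi * 40 * t)
print(doppler_spectrogram(iq).shape)  # (8, 256)
print(input_mode(0.05))               # fine_fingertip
```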
Radar to gestures
Project Soli takes a fundamentally different approach to motion tracking. It relies on radar, which detects objects in motion using high-frequency radio waves: the sensor beams out a continuous signal that is reflected back by the hand. Because the reflected signal is extremely complex, Project Soli applies signal processing and machine learning methods to detect gestures. Since Soli’s sensor can capture motion at 10,000 frames per second, it is said to be much more accurate than camera-based systems; with Soli you can do things you could never do with a camera, and gestures are said to be detected with very high accuracy.
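The article only says that signal processing and machine learning turn the reflected signal into gestures. As one possible, deliberately simplified sketch of that idea, the Python below reduces each Doppler profile to two hand-crafted features and classifies them with a nearest-centroid rule on synthetic data; none of this is Soli’s actual pipeline.

```python
# Deliberately simplified sketch, not Soli's actual pipeline: reduce each
# Doppler profile to two hand-crafted features and classify gestures with a
# nearest-centroid rule; all data below is a synthetic stand-in.

import numpy as np

def frame_features(doppler_profile: np.ndarray) -> np.ndarray:
    """Summarize one Doppler profile as (total energy, spectral centroid)."""
    energy = doppler_profile.sum()
    bins = np.arange(len(doppler_profile))
    centroid = (bins * doppler_profile).sum() / max(energy, 1e-9)
    return np.array([energy, centroid])

class NearestCentroidGestures:
    """Store one mean feature vector per gesture and pick the closest one."""

    def __init__(self) -> None:
        self.centroids: dict[str, np.ndarray] = {}

    def fit(self, labeled_frames: dict[str, list[np.ndarray]]) -> None:
        for label, frames in labeled_frames.items():
            feats = np.stack([frame_features(f) for f in frames])
            self.centroids[label] = feats.mean(axis=0)

    def predict(self, frame: np.ndarray) -> str:
        feat = frame_features(frame)
        return min(self.centroids, key=lambda k: np.linalg.norm(feat - self.centroids[k]))

# Toy usage: "micro" taps put energy in low Doppler bins, swipes in higher ones.
rng = np.random.default_rng(0)
micro = [np.concatenate([rng.random(8) + 2, rng.random(8)]) for _ in range(20)]
swipe = [np.concatenate([rng.random(8), rng.random(8) + 2]) for _ in range(20)]
clf = NearestCentroidGestures()
clf.fit({"micro_tap": micro, "swipe": swipe})
print(clf.predict(np.concatenate([rng.random(8) + 2, rng.random(8)])))  # micro_tap
```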
When something as simple as tapping your index finger against your thumb can stand in for tapping a button, the possibilities opened up by the Project Soli technology become clear. The Soli team is still working to finalize the board, which has shrunk from the size of a pizza box to roughly the size of an SD card in just 10 months. More information about this awaited revolution is expected to be revealed later this year.
Dhruvil is a Writer & Marketeer for Nimblechapps who joined in December 2014 and is based out of Sydney, Australia. He worked briefly as a Branding and Digital Marketing Manager before moving to Australia. At Nimblechapps, he works on Social Media Marketing, Branding, Email Marketing and Blogging. Dhruvil studies Business at the University of Western Sydney and also handles Operations for the company in Australia.