At this year's Google I/O conference, Ivan Poupyrev, the Technical Program Lead for the project, demonstrated a technology that lets us control devices such as a smartphone or smartwatch without touching the hardware itself, using only hand gestures or, eventually, our clothes. The project is codenamed Project Soli.

Project Soli uses radar to capture the movement of your hand in the air with extreme precision: it can resolve motions smaller than a millimeter, and it works even with obstructions between the gesturing hand and the sensor that translates the motion into a command. For example, during the conference Poupyrev was seen rubbing his thumb against his index finger as if adjusting the time on a smartwatch, but without ever touching a screen.
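To give a feel for the idea, here is a toy sketch of how a micro-gesture like that thumb rub might be turned into "dial ticks". Everything here is hypothetical: Soli's actual pipeline and APIs are not public, and the `dial_ticks` function, its threshold, and the sample signal are all made up for illustration. The only assumption borrowed from radar sensing is that a back-and-forth finger rub shows up as an oscillating radial-velocity (Doppler) signal.

```python
# Illustrative sketch only; not Soli's real API or algorithm.
# A thumb rubbed back and forth appears (roughly) as a radial-velocity
# signal that alternates in sign. We count sign flips above a noise
# threshold and treat each flip as one "tick" of a virtual dial.

def dial_ticks(velocities, threshold=0.5):
    """Count sign flips in a radial-velocity trace (hypothetical units),
    ignoring samples below the noise threshold."""
    ticks = 0
    prev = 0.0
    for v in velocities:
        if abs(v) < threshold:
            continue  # too slow to be a deliberate rub; skip as noise
        if prev and (v > 0) != (prev > 0):
            ticks += 1  # direction reversal = one rub segment
        prev = v
    return ticks

# A made-up trace for three back-and-forth rubs:
signal = [0.1, 2.0, 1.5, -1.8, -2.1, 1.9, 2.2, -2.0, 0.2]
print(dial_ticks(signal))  # → 3
```

A real system would of course work with raw range-Doppler data and a trained gesture recognizer rather than a hand-tuned zero-crossing counter, but the shape of the problem, continuous fine motion in, discrete UI events out, is the same.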

It seems this project will come to smartwatches first, as the initial board designs are circular and would fit a watch nicely. On top of that, it is quite amazing that in just ten months Google was able to shrink the radar emitter from a PC-sized unit down to a chip no bigger than a dime. Developer-ready test boards, as well as the APIs, are expected later this year, with the APIs fully accessible to the developer community.

As for Google's announcement that the same control can work through our clothes, the company is working with Levi's to make this happen. Well, we will just have to wait and see, won't we?