Google developing touchless screen that you control with your fingers in mid-air


Google is developing a radar-based technology that allows for no-touch screens that are manipulated by your fingers in mid-air.

Project Soli is Google’s latest technology aimed at solving the problem of fat, fumbling fingers on a tiny touchscreen, a frustration for many users prone to erroneous swipes.

“We want to break the tension between the ever-shrinking screen sizes used in wearables, as well as other digital devices, and our ability to interact with them,” the company’s website boasts.

The demonstration videos show the microchip-based technology being manipulated with twiddling finger motions, as if you were pulling invisible strings in mid-air, or perhaps playing the world’s smallest imaginary violin.

One can imagine the implications, especially for platforms like Tinder, where instead of swiping left or right, one can twiddle clockwise or counter-clockwise.

According to the Google announcement, “The Soli sensor can track sub-millimeter motions at high speed and accuracy. It fits onto a chip, can be produced at scale, and can be used inside even small wearable devices.”

Apart from solving the problem of greasy fingerprints smeared across shiny surfaces, the implications for potential future patents on touchless technology are vast.

This could mean the elimination or alteration of keys, remote controls, telephones, radios, etc., as well as potential military applications (i.e. trigger-less guns). Remote detonations with the wave of a hand? The sky is the limit.

“Capturing the possibilities of the human hand was one of my passions,” said Ivan Poupyrev from Google’s Advanced Technology and Projects (ATAP) group.

Project Soli’s Lead Research Engineer, Jaime Lien, added, “The reason why we’re able to interpret so much from this one radar signal is because of the full gesture recognition pipeline that we’ve built. The various stages of this pipeline are designed to extract specific gesture information from this one radar signal that we receive at a high frame rate.”
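The staged pipeline Lien describes can be sketched, purely as an illustration. Everything below is hypothetical: the stage names, the motion-energy feature, and the threshold are invented for the example and are not Soli’s actual code.

```python
# Hypothetical sketch of a staged gesture-recognition pipeline of the kind
# Lien describes: raw radar frames -> feature extraction -> gesture label.
# Stage logic and numbers are illustrative only.

from dataclasses import dataclass
from typing import List

@dataclass
class RadarFrame:
    """One high-frame-rate radar reading (amplitudes are made-up numbers)."""
    amplitudes: List[float]

def extract_features(frame: RadarFrame) -> float:
    # Stage 1: reduce a raw frame to a single motion-energy feature.
    return sum(abs(a) for a in frame.amplitudes) / len(frame.amplitudes)

def classify_gesture(energies: List[float]) -> str:
    # Stage 2: map a window of per-frame features to a coarse gesture label.
    avg = sum(energies) / len(energies)
    if avg > 0.5:
        return "finger_rub"  # e.g. the "invisible violin" twiddle
    return "idle"

def pipeline(frames: List[RadarFrame]) -> str:
    # Run each stage in sequence over the incoming frame stream.
    return classify_gesture([extract_features(f) for f in frames])

frames = [RadarFrame([0.8, 0.9, 0.7]) for _ in range(10)]
print(pipeline(frames))  # high-energy window -> "finger_rub"
```

The point of the staged design, as the quote suggests, is that each stage extracts one kind of information from the same radar signal, so later stages work with progressively more abstract data.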

Google’s developers hope to capture human intent by having your fingers perform the same motions they would use on a physical object, only with no object physically present between them.


Tim Hinchliffe
The Sociable editor Tim Hinchliffe covers tech and society, with perspectives on public and private policies proposed by governments, unelected globalists, think tanks, big tech companies, defense departments, and intelligence agencies. Previously, Tim was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. These days, he is only responsible for articles he writes and publishes in his own name. [email protected]