Students Modified This Chip to Give Smartphones a Sense of Touch

Project Soli, which debuted at Google I/O in 2015, is a tiny chip that uses radar to detect discrete hand and finger motions. It was designed as a new way to interact with mobile devices, but students at the University of St Andrews found a way to use the chip to give electronics an actual sense of touch.

The chip, developed by Google’s Advanced Technology and Projects group, or ATAP, uses the same kind of radar that airports use to track arriving and departing planes. As radio waves bounce off your hand and back to the Project Soli chip, the reflected signals can be used to decipher even the tiniest of motions.

But the computer science students at St Andrews, including Hui-Shyong Yeo, Gergely Flamich, Patrick Schrempf, David Harris-Birtill and Aaron Quigley, discovered that different materials produce unique signals too, and that, with machine learning, a computer can be trained to determine what the Project Soli chip is touching.
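
To give a rough sense of what that training step might involve, here is a minimal sketch in Python using scikit-learn. The feature vectors, labels, and choice of model are all illustrative assumptions; the team's actual signal-processing pipeline and classifier are not spelled out here.

```python
# A minimal sketch of a material classifier of the kind RadarCat describes,
# assuming each radar reading has already been reduced to a fixed-length
# feature vector (the real system's features and model may differ).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training data: 300 radar readings, 64 features each,
# labelled with the material the sensor was touching.
X = rng.normal(size=(300, 64))
y = rng.choice(["metal", "plastic", "wood", "glass"], size=300)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Train a classifier on the labelled radar signatures...
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# ...then predict the material for an unseen reading.
new_reading = rng.normal(size=(1, 64))
print("predicted material:", clf.predict(new_reading)[0])
print("held-out accuracy:", clf.score(X_test, y_test))
```

In practice, the more labelled readings a system like this collects, the better its predictions become, which is why the researchers emphasize that performance improves with continued use.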

The research, called RadarCat, isn’t limited to figuring out what an object is made of, be it metal, plastic, or wood. In the video demonstration, the RadarCat software correctly identifies an empty drinking glass, and then recognizes when that same glass is filled with water. It might not always make an accurate prediction when faced with a new material, but because it relies on machine learning, it will get better over time with further use.
While the Project Soli hardware isn’t quite small enough to squeeze inside a smartphone just yet, there are still useful applications for RadarCat in its current form. For example, instead of relying only on high-speed cameras to sort waste at a recycling facility, machines could actually feel an object to determine what it’s made of and how it should be handled.

The potential for improving how robots interact with the world using this technology is also exciting. A robot could immediately know when it’s touching human skin and needs to be extra gentle to prevent injury, or recognize that it has grabbed a metal object that’s probably quite heavy and brace for the weight before lifting it.

The RadarCat technology will be demonstrated at the 2016 ACM Symposium on User Interface Software and Technology, being held in Tokyo, Japan, from October 16 to 19.