Intelligent machines that can parse their surroundings for categorizable information are a staple of science fiction, from benevolent bots like EVE from WALL-E, to killer Terminators, to whatever those Westworld synths turn out to be.
Now, the Computer Human Interaction research group based at the University of St. Andrews in Scotland has produced a fascinating working demonstration of this time-honored sci-fi trope.
Called RadarCat (Radar Categorization for Input & Interaction using Soli), the system enables computers to scan and classify entities with Google's Project Soli sensing technology. Some of the results are compiled in a new video released by the St. Andrews team on Thursday, complete with a suitably whimsical musical score.
Soli uses miniature radar to track movements near its sensors, for the purpose of pioneering touchless interactions with electronic devices—waving a hand to turn an application on, for instance.
The St. Andrews team built on this system by introducing Soli to objects like copper, mouse pads, keyboards, and glasses of water, allowing it to build a database of entities that it could instantly recognize through radar sweeps.
"Our novel sensing approach exploits the multi-channel radar signals, emitted from a Project Soli sensor, that are highly characteristic when reflected from everyday objects," the team noted on its website. "Our studies include everyday objects and materials, transparent materials, and different body parts."
Beyond its uncanny object recognition, RadarCat can also be used as a "tangible painting app" in which the sensors pick up brush size and color from external materials and palettes.
It can also be trained to recognize various body parts with which it comes into contact, so that it can launch programs it has been wired to associate with those anatomical features. For instance, the team programmed a device to open food-related applications when held against a user's stomach, or mapping applications when held against a pant leg.
The final application the team outlines for RadarCat is an automatic alert system for beverage refills at restaurants. Naturally, an adorable robot waiter does the honors of delivering the next drink to the researcher testing out this platform.
The team suggested that future incarnations of the sensing technology could be developed to help people with visual impairments, or to sort through materials in recycling plants.
"Once Soli is deployed in products, our RadarCat solution can revolutionise how people interact with a computer, using everyday objects that can be found in the office or home, for new applications and novel types of interaction," said Aaron Quigley, chair of Computer Human Interaction at St. Andrews, according to The Courier.
"Beyond human computer interaction, we can also envisage a wide range of potential applications ranging from navigation and world knowledge to industrial or laboratory process control."