As a concept, autonomous behavior analysis makes facial and body recognition seem quaint. Think about it: in the latter case, the machine merely identifies you as you, or at least as some database version of you, however accurate. When the computer is instead analyzing your actions and movements, it's making a deeper and arguably more dangerous determination. In ascribing behavior to you, the program is doing more than recalling information; it is creating information. The software is adding to the profile. This is the difference between profiling and profile-creating, the opening of an entire new world of fiction.
Autonomous behavior analysis is a real thing, of course. The Center for Engineering and Industrial Development (CIDESI) just unveiled a recently installed prototype in central Mexico. The test system consists of four cameras installed around a laboratory, all connected to a central server. Together, those cameras track an individual as they enter the facility and move within it, making determinations as to whether that person is behaving "unusually." If the software determines they are, the event is logged and a notification goes out.
“In CIDESI’s design, all surveillance cameras work as a data gathering point and decide whether something is relevant or not; when one of the cameras detects something unusual it sends a signal, focuses, and analyzes the event,” says Hugo Jiménez Hernández, the Center’s head of research. “When uncommon things happen the response time is very fast and the number of events it detects increases.” The advantages this might have over a person, beyond not needing to pay one, should be self-evident: the ability to interpret many situations at once, do it without any “extra” bias, and do it very, very fast from a central location.
It’s not an especially crazy idea. This is the sort of task artificial neural networks were created for: learning patterns and sequences, and either classifying new inputs based on that or identifying true deviations (“novelty detection”). Way down deep, it’s just statistics. Math problems.
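To make the "just statistics" point concrete, here's a minimal sketch of novelty detection: learn what normal looks like, then flag anything too many standard deviations away from it. This is an illustration of the general idea, not CIDESI's actual method; the walking-speed feature, the numbers, and the three-sigma threshold are all assumptions for the example.

```python
import statistics

def fit_baseline(samples):
    """Learn the mean and standard deviation of a scalar
    motion feature (here, hypothetically, walking speed)."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_novel(value, mean, std, threshold=3.0):
    """Flag values more than `threshold` standard deviations
    from the learned baseline -- a crude novelty detector."""
    return abs(value - mean) > threshold * std

# Made-up baseline: typical walking speeds (m/s) the cameras
# might observe during a training period.
baseline = [1.2, 1.4, 1.3, 1.1, 1.5, 1.3, 1.2, 1.4]
mean, std = fit_baseline(baseline)

print(is_novel(1.3, mean, std))  # ordinary pace -> False
print(is_novel(4.0, mean, std))  # someone sprinting -> True
```

A real system would track many features at once and use a learned model rather than a single Gaussian, but the shape of the decision is the same: deviation from a statistical baseline, nothing more mysterious than that.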
The CIDESI press release is modest about applying the technology, suggesting that such a system might be used at traffic intersections to identify accidents and such. But the questions are endless: could AI-sourced behavior analysis be used as evidence in court? Are people going to start getting picked off in airports because some software determines they started walking funny in the security line? Just imagine how deep a behavior database might get, each passing month or year enabling a deeper, longer-term look. I suppose the big moment would come when the computer gets better at detecting behavior than a person. That seems not just logical, but inevitable.