So, you're on a first date. You really like this person, and you have no idea whether or not they're into you. As you casually pretend to check your phone, you record some video of your date (non-consensually—don't actually do this), and receive an instant message that describes how they're feeling, even if they're trying to hide it: pleasure, excitement. Nice.
This is the future one Toronto start-up, NuraLogix, is proposing. Their image-processing software, called Transdermal Optical Imaging, purportedly decodes hidden emotions and doubles as a lie detector. The mind behind the project is University of Toronto developmental neuroscientist Kang Lee, whose twenty-plus years in the field have focused on human face processing and the science of deception.
According to Lee, NuraLogix's tech works by detecting changes in blood flow in the face. The idea is that basic human emotions create specific facial blood-flow patterns that are beyond our control, and that these patterns differ depending on whether we're telling the truth or lying.
The software operates by measuring the concentration of hemoglobin, the oxygen-carrying protein in blood. Previous work on the topic found a relationship between emotions and blood flow to regions like the cheeks and forehead. Those differences in blood flow can be picked up based on changes in skin colour—basically, the redder your skin, the greater the blood flow. (This has been tested among people of different races and skin hues.) A 2015 study found that an angry emotional state was associated with increased blood flow and redness, compared to a neutral emotional state. Sadness was the opposite: decreased blood flow and redness compared to neutral.
As for lie detection, psychologists have a long history of researching markers of deception, from verbal cues to facial microexpressions.
Lee's team built on this research by developing machine learning algorithms to monitor changes in blood flow over time from videos of human faces. Using footage from any conventional camera source—including your smartphone—the software analyses changes in skin colour to track emotions. It then compares those measurements to standardized data on emotional blood-flow signatures. The result is kind of like seeing the world through Predator's eyes.
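NuraLogix's actual algorithm is proprietary, but the basic signal it starts from—skin redness tracked frame by frame as a proxy for blood flow—can be sketched in a few lines. The sketch below is purely illustrative: it invents a crude "redness index" (mean red channel relative to overall brightness) and runs it on synthetic frames, standing in for a real face-detection and hemoglobin-estimation pipeline.

```python
import numpy as np

def redness_series(frames):
    """Crude per-frame 'redness' index: mean red channel divided by
    mean overall intensity. frames: iterable of HxWx3 RGB arrays,
    assumed to be cropped to the face region already."""
    series = []
    for frame in frames:
        f = frame.astype(float)
        red = f[:, :, 0].mean()          # average red-channel value
        total = f.mean() + 1e-9          # average over all channels
        series.append(red / total)
    return np.array(series)

# Toy example: 10 synthetic "frames" whose red channel climbs over time,
# mimicking increased facial blood flow (e.g. flushing with anger).
rng = np.random.default_rng(0)
frames = []
for t in range(10):
    frame = rng.integers(80, 120, size=(64, 64, 3)).astype(np.int64)
    frame[:, :, 0] += 5 * t              # simulate growing redness
    frames.append(np.clip(frame, 0, 255).astype(np.uint8))

series = redness_series(frames)
print(series[-1] > series[0])            # the redness index rises
```

A real system would also need to isolate the face, correct for lighting changes, and match the resulting time series against per-emotion reference signatures, none of which this toy captures.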
The applications range from first dates to law enforcement, marketing, and education, he told me. "It could be very useful, for example, for teachers," said Lee, whose academic lab is based at the Ontario Institute for Studies in Education. "A lot of our students have math anxiety but they do not want to tell us, because that's embarrassing." If teachers were able to identify students experiencing anxiety in classrooms, they could help address those students' needs individually, and early on.
Other researchers have attempted to decode emotions and detect deception by measuring patterns in facial temperature with thermal imaging cameras, but the results haven't been as promising.
"Our method turns out to be very useful, and we discovered it by sheer luck," said Lee. "By having this method, now we can do it non-invasively, and remotely, and sometimes it can be covertly."
But it's hard to tell how accurate technologies like this one can be, and how readily we can apply lab results to real world situations. (How to use these technologies ethically, and whether that's even possible, should be fodder for future conversations.) In scenarios like border control or law enforcement, will the results hold water?
This isn't the first time neuroscience has claimed to offer a 21st century polygraph, although lie detector tests have not been proven to work on an individual basis. In 2008, lie detection via electroencephalography was used as evidence in an Indian court to convict a woman of murder (to the horror of scientists who'd tested the technology on college students committing mock crimes). As recently as this year, one tech company proposed eye-tracking tech for detecting terrorists among groups of refugees.
Lee assured me that NuraLogix's tech wouldn't pass muster in a court of law. "They want the accuracy to be extremely high, like genetic tests, so a one-in-a-million error rate. Our technique won't be able to achieve an extremely high accuracy level, so because of that I don't think it's useful for the courts."
Lee projects that NuraLogix will have its first usable iteration in the next six months, including cloud-hosted software that users can access to process video from their devices. "In about 12 months we'll have something more sophisticated," said Lee. "For a cell phone app, it will be a few years."
Just waiting for the in-app Tinder feature now.