At a factory in Hangzhou, China, production line workers are allegedly being outfitted with brain-reading hats and helmets. The devices monitor workers’ brain activity and use artificial intelligence algorithms to “detect emotional spikes such as depression, anxiety or rage,” according to the South China Morning Post.
Hangzhou Zhongheng Electric is one of many Chinese companies using government-sponsored technologies to monitor the brain activity of employees while they’re on the job, the Post reports.
MIT Technology Review notes that the details in the story (or the lack thereof) raise suspicions: It’s still fairly unclear what over-the-scalp EEG sensors can reliably detect about human emotions, and it seems implausible that one company gathered enough useful information to contribute millions of dollars in profits, as the Post claims one power company spokesperson said:
"The technology is also in use in Hangzhou at State Grid Zhejiang Electric Power, where it has boosted company profits by about 2 billion yuan (US$315 million) since it was rolled out in 2014, according to Cheng Jingzhou, an official overseeing the company’s emotional surveillance programme."
But the basic dystopian conceit here—that your employer would force you to wear a mind-reading device so it could increase productivity—is not impossible. When smartwatches were booming, one startup was devoted to making a smartwatch app that bosses could use to track employees’ productivity. In February, GeekWire spotted an Amazon patent for smart wristbands to be given to warehouse employees that would track where workers’ hands were and send radio signals between the wristband and the inventory item.
“The type of invasive, ongoing surveillance described in the [South China Morning Post] article is probably more likely to damage morale than to improve it, since it undermines employees' autonomy and basic human dignity,” Natasha Duarte, policy analyst at the Center for Democracy and Technology, told me in an email.
It’s part of a trend toward “fetishizing data and analytics” to optimize our lives, in terms of workplace productivity especially, she said. The problem with this, according to Duarte, is that there’s almost always a power imbalance between those designing the experiments and the subjects whose behavior is being targeted for monitoring or change—in this case, the employer and the production line workers.
“These data analytics programs are often targeted at populations such as people who use social services, students, workers, and people who have been subject to law enforcement or criminal justice systems, and the systems are almost never designed or evaluated by those groups,” Duarte said.
Whatever changes an employer makes to the workforce based on behavioral data may be overthinking problems that a much simpler solution—like better benefits, paid leave, or increased wages or flexibility—could fix. “Data analytics should not take the place of thoughtful policy making in the workplace,” Duarte said.