In Chicago, a government watchdog says ShotSpotter alerts have led to evidence of gun crime in only 9 percent of cases.
Experts say medical images like X-rays and CT scans allow algorithms to determine a patient's race—and warn it could lead to bias and discrimination.
Experts say ShotSpotter is unreliable and disproportionately calls armed police into Black and brown neighborhoods.
Prosecutors in Chicago are being forced to withdraw evidence generated by the technology, whose alerts led to the police killing of 13-year-old Adam Toledo earlier this year.
A Motherboard investigation found that ShotSpotter frequently generates false alerts—and it's deployed almost exclusively in non-white neighborhoods.
Republicans want to stop discussions about racism by filling classrooms with cameras. We're already halfway there.
Using a new technique, researchers say they can make AI systems misidentify people by adding small bits of data to the images.
Clerks at 7-Eleven and other convenience stores are being constantly monitored by a "voice of god" that can intervene from thousands of miles away.
ID.me's CEO says unemployment fraud is costing taxpayers $400 billion, but his own company is denying claims because of problems with its tech, users say.
Local police used $150,000 in COVID relief funds to purchase Boston Dynamics' four-legged robot, Spot.