The new documentary is an essential introduction to algorithmic bias—and the systems that gave rise to it.
Groups like the ACLU are pushing a program to rein in surveillance tech. But all it does is expand and strengthen police surveillance powers.
Police across Canada are increasingly adopting algorithmic technology to predict crime. The authors of a new report say the practice threatens human rights.
Critics say that predictive models will lead to false positives and could disproportionately affect vulnerable communities.
Motherboard is publishing several hundred pages of documents obtained from police departments through Freedom of Information requests.
PredPol uses an algorithm based on earthquake prediction to “predict crime.” Academics say it’s simplistic and harmful.
According to documents obtained by Motherboard, the use of the predictive policing software PredPol is far more widespread than previously reported.
Documents obtained by Motherboard using public information requests verify previously unconfirmed police department contracts with predictive policing company PredPol.
PredPol, a tool that police departments use to algorithmically predict crime, quietly created login portals for police departments in at least seventeen US communities.
Hartford is embracing a sophisticated surveillance apparatus that some civil liberties advocates and residents fear marks an ominous trend.
Crime-predicting software PredPol perpetuates discrimination, just like the discredited “broken windows” policing strategy, say digital rights advocates.
The bill is believed to be the first in the country to push for open sourcing of the algorithms used by courts, police, and city agencies.