AI experts call for an end to research on crime-prediction algorithms

Brian Adam
Professional blogger, vlogger, traveler and explorer of new horizons.

The use of artificial intelligence and algorithms to predict crime, a practice that lacks an adequate scientific foundation, is controversial among many AI experts. For this reason, a coalition of researchers has asked academia to stop publishing studies in favour of this technology.

The group, called the Coalition for Critical Technology, has appealed to fellow researchers to abandon such studies for good. Research on crime and the algorithms that attempt to predict it is often subject to biases that end up treating different social groups unequally on the basis of skin colour or other prejudices.

As the coalition wrote in a letter on Medium signed by 1,700 experts, "there is no way to develop a system capable of predicting crime without that system being subject to bias, precisely because the notion of crime is itself shaped by prejudice. Research in this area cannot be neutral."

The letter was written in response to an announcement by Springer, the world's largest publisher of academic books, that it would also publish studies in this field. Among them, the one that drew the coalition's attention is A Deep Neural Network Model to Predict Criminality Using Image Processing, in which several researchers claim they can create a facial-recognition algorithm able to predict whether someone is a criminal with up to 80% accuracy and without any bias.

Following the Coalition for Critical Technology's request, Springer has reportedly decided not to publish the research, which proved unreliable once subjected to peer review. The coalition's decision to act during the Black Lives Matter protests, which have also involved companies such as Google, is probably coincidental; even so, the timing may have amplified the media coverage of these events.