Surveillance: The US police rely on facial recognition – now Amazon and IBM are getting cold feet (06/11/2020)

Recognize every suspect immediately and follow them every step of the way: facial recognition is extremely attractive to police authorities around the world. But some providers are now taking the criticism of the technology to heart and pulling the emergency brake.

A person in a crowd looks around suspiciously, appears nervous. A camera immediately reports this to the security forces and supplies the name of the potential suspect. What was science fiction a few years ago has been steadily built out in recent years. Tech companies such as Amazon have sold their facial recognition systems to security agencies, and since last year Amazon's software has even been able to recognize emotions. Given the protests in the USA, however, some manufacturers are having second thoughts. For good reason: the algorithms do not treat everyone equally.

“Now is the time to begin a national dialogue as to whether and how facial recognition should be used by law enforcement agencies,” said IBM CEO Arvind Krishna in a letter to the US Congress on Tuesday. The occasion was the numerous protests against police violence. It is not only many US police officers who disproportionately target dark-skinned people; facial recognition programs also carry clear biases. Computing veteran IBM is drawing the strongest possible conclusion from this: the company will no longer offer facial recognition software and firmly rejects the use of such programs by police, Krishna explained.

Racist Algorithms

The problem has long been known. Facial recognition programs now work extremely well – but above all when they are confronted with fair-skinned men. Several detailed studies have shown this. The success rate drops for people with darker skin as well as for female test subjects, and the algorithms perform worst on dark-skinned women. While one study found that several facial recognition programs identified fair-skinned men correctly at rates of up to 100 percent, the software misidentified roughly every third dark-skinned woman. What would be an annoying mistake in the software lab quickly becomes a serious problem when the technology is used by law enforcement agencies and private security services.

The subconscious prejudices of software developers are assumed to be the reason for the high error rate. Even if the proportion of women and non-whites is now higher than it used to be, young, fair-skinned men are still the predominant social group in Silicon Valley. At least indirectly, this also plays a role in the development of artificial intelligence such as face recognition.

The programs train their skills by repeatedly comparing gigantic data sets with one another. If social groups are underrepresented in the image databases, for example because the developers use their own photo collections, this imbalance shows up in the recognition rate. That is why, at the beginning of 2019, IBM tried to minimize these blind spots in AI training: the group was the first company to create two photo databases in which all ethnicities, genders and age groups were represented in equal numbers, and it made the databases available to other companies free of charge.
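
To make that mechanism concrete, here is a minimal, hypothetical Python sketch of the kind of disaggregated audit such studies perform: it measures a face matcher's accuracy separately for each demographic group, which is exactly where an imbalanced training set becomes visible. All names and numbers below are illustrative assumptions, not code from IBM, Amazon or any of the studies mentioned.

    from collections import defaultdict

    def accuracy_by_group(results):
        # results: iterable of dicts with "group", "predicted_id" and "true_id" keys
        hits, totals = defaultdict(int), defaultdict(int)
        for r in results:
            totals[r["group"]] += 1
            if r["predicted_id"] == r["true_id"]:
                hits[r["group"]] += 1
        # accuracy per group: correct matches divided by all attempts for that group
        return {group: hits[group] / totals[group] for group in totals}

    # Made-up example echoing the disparities described above:
    # 99 of 100 correct matches for one group, only 66 of 100 for the other.
    sample = (
        [{"group": "lighter-skinned men", "predicted_id": 1, "true_id": 1}] * 99
        + [{"group": "lighter-skinned men", "predicted_id": 2, "true_id": 1}] * 1
        + [{"group": "darker-skinned women", "predicted_id": 1, "true_id": 1}] * 66
        + [{"group": "darker-skinned women", "predicted_id": 2, "true_id": 1}] * 34
    )
    print(accuracy_by_group(sample))
    # e.g. {'lighter-skinned men': 0.99, 'darker-skinned women': 0.66}

An audit of this kind only works if every group appears in the test data in meaningful numbers – which is the gap IBM's balanced photo databases were meant to close.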

Amazon is taking a break

Another big player has now announced its withdrawal, if only a temporary one, from the business with faces. Amazon's controversial facial recognition service Rekognition will not be available to law enforcement agencies for one year, the company announced on Wednesday. Civilian uses, such as a program that helps find missing children, may continue to use the service. Unlike IBM, however, the retail giant drew no connection to the current protests in its announcement post. The company said it sees movement in Congress toward discussing an ethical use of the technology and that it supports these efforts.

So far, the US government has not been able to bring itself to regulate the use of the technology by security forces. The protests have given the subject fresh momentum in recent days, but it is nothing new in US politics. Back in 2018, the civil rights organization ACLU resorted to a clever trick to put pressure on lawmakers: it had the 535 members of the US Congress compared against a database of 25,000 arrest photos – and found 28 supposed matches among the politicians. Most of the members who were wrongly identified as criminals were dark-skinned.
