Amazon Sells Facial Recognition Technology to Police Departments

Has the retail giant Amazon crossed an indefensible ethical line by selling its facial recognition program to a police department that could use it to spy on all the inhabitants of a big city? The story is likely to be repeated around the world, if it has not been already.

Let's go back to East Germany at the end of the 1980s. In his book on the secret police of one of the most repressive political regimes in history (Stasi: The Untold Story of the East German Secret Police), the journalist John Koehler estimated that over the 40 years of the East German communist regime, the Stasi had woven an extraordinarily dense network of spies.

While the Soviet Union employed 480,000 spies for a population of 280 million (about 1 spy per 6,000 inhabitants), Koehler estimates that the German Democratic Republic maintained a chilling network of 1 spy for every … 6.5 citizens!

There were even occasional spies among … the teenagers. Some 10,000 young people under the age of 18 appeared on the Stasi lists discovered after the fall of the Berlin Wall.

Hello police!

Thirty years later, any agent of the defunct East German secret police could only have dreamed of some of the services Amazon offers. The American giant's hidden ace: Amazon Web Services, its business cloud division. AWS, to insiders, holds 32% of the market share in its field, far ahead of number 2, Microsoft's Azure, which sits at 15%.

It was learned this week that Amazon has sold its facial recognition program, Rekognition, to several police forces. A name with a "k" straight out of a sci-fi movie, let's face it. The high-performance software is hosted on AWS servers.

The Orlando police allegedly used it, together with surveillance cameras installed in the city, to track "persons of interest". The news was confirmed by one of Amazon's executives at a conference held in South Korea a few weeks ago.

The Orlando Police Department has indicated that this is a "pilot project" and that the service has been deployed in compliance with the law. Except that the legal framework surrounding the use of these new technologies is full of holes. I'm not the one saying it: it's one of the greatest specialists in the field, Clare Garvie, whom I saw speak at the last South by Southwest festival (SXSW).

In a letter made public Tuesday, the American Civil Liberties Union (ACLU) and some forty human rights organizations call for the immediate termination of business relations between Amazon and the police, out of respect for fundamental rights. Nothing less.

One of their arguments? The many possible abuses in the surveillance of citizens once the police have both powerful software and cameras.

“Who will supervise the supervisors?” This question, posed by Edward Snowden following the revelations about the National Security Agency's (NSA) surveillance programs, comes back to haunt us. For the moment, it is difficult to answer.

According to the ACLU, the constitutionality of the use of such services has never been examined by the high courts in the United States. Despite the lack of any legal opinion worthy of the name, it seems that Amazon has put a lot of pressure on police forces to connect to Rekognition not only surveillance cameras, but also the body cameras that officers wear.

Amazon, however, withdrew the mention of the algorithm's possible use on police body cameras following discussions with the ACLU.

As any good researcher in artificial intelligence will tell you, even if the error rates of facial recognition software keep shrinking, the fact remains that it produces the famous "false positives". Indeed, we learned this winter that facial recognition programs produce markedly higher error rates for sociodemographic groups that are traditionally disadvantaged.
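Even a small error rate becomes a problem at the scale of a city. The back-of-the-envelope sketch below illustrates the base-rate arithmetic; every number in it is hypothetical, chosen only to show the effect, not drawn from any vendor's published figures.

```python
# Base-rate illustration for face recognition at city scale.
# All numbers are hypothetical, for the sake of the arithmetic only.

population = 1_000_000        # faces scanned by city cameras in a day
watchlist_hits = 100          # people actually on a watchlist
false_positive_rate = 0.001   # 0.1% -- an optimistically low error rate
true_positive_rate = 0.99     # the system catches 99% of real matches

innocent = population - watchlist_hits
false_alarms = innocent * false_positive_rate        # innocents flagged
real_matches = watchlist_hits * true_positive_rate   # actual hits flagged

# Of everyone flagged, what fraction is actually on the watchlist?
precision = real_matches / (real_matches + false_alarms)

print(f"False alarms per day: {false_alarms:.0f}")   # ~1000
print(f"Real matches per day: {real_matches:.0f}")   # 99
print(f"Chance a flagged person is a real match: {precision:.1%}")  # 9.0%
```

With these assumptions, roughly 9 out of 10 people flagged by the system would be innocent, which is exactly the kind of abuse the ACLU letter warns about.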

A passport for repression?

China is already using facial recognition algorithms provided by Megvii to conduct a vigorous crackdown in the west of the country.

The civil liberties organizations that sounded the alarm this week have rightly pointed out the legal vacuum in which police forces are conducting these first "tests" of their new robotic toy. And the potential of algorithms to amplify existing abuses and prejudices is well established.

Will we one day have to apply makeup in precise patterns to avoid being identified by the facial recognition algorithms that could eventually blanket our cities? In 2010, the artist Adam Harvey imagined hairstyles and makeup designed to make faces unidentifiable to surveillance cameras, a rather pretty, futuristic look whose name recalls the dazzle camouflage used on ships during the First World War: CV Dazzle.

But maybe one day, it will also be forbidden to wear makeup. By order of the police.
