Defense of Amazon's face recognition tool undermined by its only known police client

By Bryan Menegus

Faced with two independent studies that found its facial recognition software returns inaccurate or biased results, Amazon has repeatedly claimed that the researchers failed to use the software, called Rekognition, in the way the company has instructed police to use it.

However, the only law enforcement agency Amazon has acknowledged as a client says it also does not use Rekognition in the way Amazon claims it recommends, Gizmodo has learned. That admission undermines the very argument Amazon uses to discredit critical research into Rekognition.

Rekognition is a software suite that purports to serve various functions, ranging from identifying specific facial features to comparing similarities across a large volume of photos. This gives the software potential value to law enforcement customers seeking, for example, to match a mugshot to security camera footage of a robbery. Critics are concerned that automating the policing process could have dire consequences if the software displays racial or other biases, especially as Amazon has pitched Immigration and Customs Enforcement on using the product. So far, Amazon’s frontline defense against advocates and academics has been that researchers are setting the search parameters incorrectly.