Amazon’s facial recognition technology mistakenly matched Duron Harmon with a mugshot — and he’s not the only one

"This technology should not be used by the government without protections."

New England Patriots safety Duron Harmon leaves the field after a game against the New York Giants earlier this month in Foxborough. Elise Amendola / AP

Playing for the New England Patriots and winning not one, not two, but three Super Bowls brings with it a certain degree of fame and recognition.

Yet, apparently not enough for Amazon’s facial recognition software.

In a test conducted by the ACLU of Massachusetts, Patriots safety Duron Harmon was among more than two dozen professional New England athletes falsely matched to individuals in a mugshot database by Amazon’s controversial cloud-based Rekognition program — and he’s speaking out in support of a proposed moratorium on government agencies in Massachusetts using facial recognition products.

“This technology is flawed,” Harmon said in a statement Monday.


“If it misidentified me, my teammates, and other professional athletes in an experiment, imagine the real-life impact of false matches,” he said. “This technology should not be used by the government without protections.”

Harmon, who has spoken out before on education reform and social justice issues, was arguably not even the most high-profile athlete mistakenly identified as a criminal suspect by Rekognition.

The ACLU says it compared the official headshots of 188 local sports pros with a database of 20,000 public arrest photos. In the test, a total of 27 — nearly one out of every six — were falsely matched with a mugshot, from Harmon’s fellow Patriots, like center David Andrews and kicker Stephen Gostkowski, to Bruins forward Brad Marchand, Red Sox ace Chris Sale, and Celtics teammates Tacko Fall and Gordon Hayward.

The civil liberties group released the results of the experiment as Amazon and others face increasing resistance to facial recognition technology, which the tech giant has marketed to local and federal law enforcement agencies — including Immigration and Customs Enforcement. In June, Somerville became the first community on the East Coast to pass a facial recognition ban.

Massachusetts legislators are scheduled to hold a State House hearing Tuesday afternoon on a bill sponsored by Sen. Cynthia Stone Creem and Rep. David Rogers to implement a statewide moratorium on facial recognition and emerging biometric surveillance programs.


Proponents are concerned that the technology poses civil liberties threats, particularly for minorities. In January, MIT researchers found that Rekognition had much more difficulty identifying female and darker-skinned faces in photos than similar programs, adding to a growing body of evidence of how racial bias has seeped into the developing technology.

However, the ACLU says that, even when it does work correctly, facial recognition technology, which remains largely unregulated, can violate individuals’ due process rights if misused by law enforcement. The group’s Massachusetts affiliate has launched a statewide “Press Pause on Face Surveillance” campaign calling for at least a temporary ban.

“There are currently no rules or standards in place in our state to ensure the technology isn’t misused or abused,” Kade Crockford, the director of the Technology for Liberty Program at the ACLU of Massachusetts, said in a statement. “Massachusetts must pass a moratorium on government use of face surveillance technology until there are safeguards in place to keep people safe and free.”

Amazon argues the technology offers a number of benefits, such as helping to find and reunite a missing child with their parents or identify and rescue victims of human trafficking. The tech company has also objected to the ACLU’s methods.


In July, the ACLU ran a similar test that found Rekognition incorrectly matched 28 members of Congress — about 5 percent of the body — with mugshots, including Massachusetts Sen. Ed Markey. And another test by the group’s Northern California arm in August found the program did the same to more than one in five California state lawmakers.

In all three tests, the ACLU used Rekognition’s default 80 percent similarity threshold, meaning the probabilistic program reports a match when it is at least 80 percent confident it has found one. However, Amazon recommends that law enforcement and public safety agencies use Rekognition with a 99 percent confidence threshold and do so in combination with human judgment.

“This is often a key first step to help narrow the field and allow humans to expeditiously review and consider options using their judgment,” says the company’s website.
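The effect of the threshold setting can be sketched in a few lines of Python. This is not Rekognition’s actual API, and the mugshot IDs and similarity scores below are made up purely for illustration — the point is simply how raising the cutoff from the 80 percent default to 99 percent prunes borderline matches:

```python
# Illustrative sketch (hypothetical data, not Rekognition's API):
# how a similarity threshold changes which candidate matches a
# face search returns.

def filter_matches(candidates, threshold):
    """Keep only candidates whose similarity score meets the threshold."""
    return [name for name, similarity in candidates if similarity >= threshold]

# Hypothetical scores from a face search against a mugshot database.
candidates = [
    ("mugshot_0412", 83.5),   # a weak match that clears the 80% default
    ("mugshot_1177", 81.2),   # another borderline match
    ("mugshot_2903", 99.4),   # a high-confidence match
]

print(filter_matches(candidates, 80.0))  # default: all three returned
print(filter_matches(candidates, 99.0))  # stricter: only the strongest
```

At the default setting, both borderline candidates come back as “matches”; at 99 percent, only the strongest survives — which is why the choice of threshold, left to each agency, matters so much in the ACLU’s argument.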

Still, without any regulations, the ACLU says there’s nothing enforcing appropriate use.

“We use the technology as provided to police departments, with the default threshold setting,” Crockford told Boston.com in a statement. “Amazon could require a 99% threshold as a technical matter if they cared to. They could write it into the code, but they’ve chosen not to. There is no guarantee that a police agency would use the stricter setting, and there is no law that requires this.”

