Boston isn’t currently using facial recognition technology.
And the city council is looking to keep it that way for the foreseeable future.
During a video conference hearing Wednesday afternoon, Councilors Michelle Wu and Ricardo Arroyo introduced an ordinance to ban the city and its police department from using any face surveillance system — or information derived from such technology — due to concerns about racial bias and the infringement of civil liberties.
“Technology can supplement efforts for safety and public health, but ultimately it has to be built on a foundation of trust,” Wu said during the hearing.
The proposal follows similar bans approved over the past year in Somerville, Cambridge, Brookline, and Springfield. And as Wu noted, it comes at a time when governments around the world are increasingly turning to contact-tracing technology in response to the coronavirus; Russia has even used facial recognition cameras to help enforce its shutdown order.
Of course, American cities aren’t there yet. But as such technology becomes more pervasive in everyday life, Boston city councilors — nearly all of whom asked to sign onto the ordinance Wednesday — and civil liberties advocates are concerned that face surveillance systems are particularly flawed and prone to misuse in the absence of government regulation.
“This is an issue that has been raised in our sister municipalities,” Wu told Boston.com in an interview. “So we’re looking to make sure Boston stands with our region.”
San Francisco became the first major American city to ban facial surveillance last May, and the Massachusetts chapter of the American Civil Liberties Union has campaigned for a statewide moratorium on the government use of facial recognition technology until additional legislation has been passed to protect privacy rights and implement standards around its use.
During the hearing, Arroyo cited the growing body of evidence that the technology is less accurate in identifying women and people of color, including research out of MIT that found some programs misidentified up to 35 percent of darker-skinned women. A study released this past December by the National Institute of Standards and Technology found that the majority of commercial facial recognition programs exhibit bias, misidentifying African-American and Asian faces at rates many times higher than the error rate for white faces.
“It furthers inequity,” said Arroyo, who noted that COVID-19 is already hitting communities of color disproportionately hard.
The ordinance would make it illegal for city officials to use facial recognition technology for surveillance themselves, or to contract with any third party to do so on their behalf. (The ban exempts face recognition features, like Face ID, on smartphones and tablets that are used solely for user authentication.)
Wu said that Boston police had already committed to not using facial recognition, at least until it is proven to be more accurate.
According to the ACLU, the police department currently has a contract with the private company BriefCam for surveillance analytics software. While the version currently used by Boston police, BriefCam 4.3, does not include face surveillance features, the contract expires next Thursday, May 14. An upgraded version of the software, BriefCam 5.3, does offer facial recognition technology that allows officials to track individuals and set “watchlist” alerts for certain faces.
Wu says they’re hoping to schedule a hearing on the face surveillance ordinance sometime this month.
“We’re on the same page with Boston police leadership and the commissioner in agreeing that Boston should not be using discriminatory technology,” she said. “This provision would just codify that.”
While there was no opposition to the ordinance during Wednesday’s hearing, Councilor Michael Flaherty — who also asked to have his name added to the ordinance in support — did say that facial recognition technology is something the city should consider if its accuracy improves. Flaherty noted that general video surveillance had been a “useful tool” in solving certain homicides and other crimes, referring to the abduction of a Boston woman last year.
While he was “happy” that the city wasn’t using the technology given its imperfections, Flaherty suggested it could be something the city eventually considers.
“In the event that it does get perfected, in the interest of public safety, we’re going to have to take a long hard look at it,” he said.
Wu, Arroyo, and Council President Kim Janey also introduced a separate, related ordinance Wednesday that would give the city council approval and oversight authority over any new municipal surveillance technologies. However, they feel the face surveillance ban should be implemented first.
“We just want to make sure that there’s proactive action taken now to require that there would be a ban, and then continue with the larger conversation about surveillance oversight overall,” Wu said.