Boston Police Commissioner William Gross says his department does not currently use facial recognition technology, and he plans to keep it that way.
At least for now.
During a City Council hearing Tuesday afternoon, Gross said he supports an ordinance to ban Boston police and other city officials from using facial recognition technology, due to concerns about privacy and evidence that the currently available programs misidentify people of color at an increased rate.
“Until this technology is 100 percent, I’m not interested in it,” Gross said.
However, city councilors aren’t waiting for that to change.
The ordinance, which was first introduced last month by Councilors Michelle Wu and Ricardo Arroyo, would make Boston the largest U.S. city after San Francisco to ban the use of the currently unregulated technology until additional safeguards have been passed at the municipal, state, or federal level. The proposed moratorium follows similar bans in Somerville, Brookline, Cambridge, Northampton, and Springfield.
It also comes at a particularly “urgent” time for the city of Boston, according to Kade Crockford, the director of the Technology for Liberty Program at the ACLU of Massachusetts, which is campaigning for a statewide moratorium on facial recognition technology.
“In a free society, we should not be subject to constant government tracking and cataloguing of our every movement, habit, and association,” Crockford said. “At its logical conclusion, that is exactly the threat.”
Last month, the Boston Police Department’s contract with the surveillance software company BriefCam expired. And while the current software used by Boston police does not include face surveillance features, an upgraded version does — including tracking technology and “watchlist” alerts for certain faces.
According to Crockford, the department has not yet responded to a public records request for the latest update on the contract.
“Frankly, there’s no official directive to prevent it from being used,” Arroyo said during a conference call with reporters.
Gross said that he anticipates the department will update its BriefCam software, but — even without a ban — would not use the face recognition components. The commissioner did express interest in eventually using facial recognition for police investigations, but only after improvements were made to make the technology “more reliable” and after a “rich dialogue” with the community about regulations.
Wu and Arroyo said their ordinance, which received unanimous support from fellow councilors Tuesday, ensures that conversation will happen. Last month, the two councilors, along with City Council President Kim Janey, also filed an ordinance to create a regulatory process for the adoption of any surveillance technology by a city department.
“What we’re doing here at the city level is drawing a firm line in the sand to say this technology is not going to creep into government use in Boston without democratic debate and oversight,” Crockford told reporters.
During the hearing, Gross requested a working session with city councilors within the next 60 days to refine the ordinance’s language, noting that “video has proven to be one of the most effective tools for collecting evidence of criminal offenses, solving crimes, and locating missing and exploited individuals.”
And while Gross said the department has “no desire to employ a facial surveillance system to generally surveil the citizens of Boston,” he did draw a distinction between general surveillance systems and using facial recognition technology for specific investigations. Gross said facial recognition could “greatly reduce the hours necessary to review video evidence,” allow investigators to move more quickly, and generally make the department more efficient.
“Some areas we consider this technology may be beneficial are in identifying the route and locations of missing persons, including those suffering from Alzheimer’s [disease] and dementia, as well as missing children, kidnapped individuals, human trafficking victims, and suspects of assaults and shootings,” Gross said.
Still, he stressed that the department wanted to work with the City Council and community members to “clearly articulate the language that would best describe these uses.”
“We have to make sure everybody is comfortable with this type of technology,” Gross said.
Sam Ormsby, a spokeswoman for Boston Mayor Marty Walsh, told Boston.com that the mayor “shares the same concerns as the community around privacy,” but also wants to leave room for limited use by law enforcement.
“He agrees with Boston Police that it’s important we have a broader conversation to ensure that this ordinance does not prevent the Department from using any tools in the future that would be helpful in their investigations, such as locating missing persons with Alzheimer’s disease, kidnapped individuals or human trafficking victims,” Ormsby said.
Crockford even acknowledged that there could be possible uses for facial recognition that would be acceptable in “limited circumstances subject to a very high level of judicial authorization” and regulation at the state or federal level. But the reality, she added, is the technology remains “entirely unregulated.”
“The Boston Police Department could take a photo of a group at a Black Lives Matter protest or the Women’s March and run that image through a facial recognition database to try to identify everyone who was protesting police violence or standing up against the Trump administration,” Crockford said.
The City Council’s ordinance would not affect the use of facial recognition by the FBI. Though the federal agency has said it requires agents to take additional steps to confirm matches, the technology has yielded ruinous results when misidentifications occur. The technology also failed to identify the two perpetrators of the Boston Marathon bombing in 2013.
Additionally, mounting evidence has shown that the current facial recognition technology has particular trouble identifying people with darker skin. During the hearing, the council also heard concerns from immigrant, student, and public health advocates that the use of the technology would exacerbate existing racial inequities.
Joy Buolamwini, a researcher at MIT, testified Tuesday that her studies of commercial technologies released by Microsoft, Amazon, and IBM (which announced Tuesday that it would get out of the facial recognition business) found that error rates, which were no more than 1 percent for lighter-skinned men, soared to more than 30 percent for darker-skinned women. Amazon also announced Wednesday afternoon that it was placing a one-year moratorium on police use of its facial recognition technology, in the hopes of giving Congress “enough time to implement appropriate rules.”
“I never forget that I’m African American, and I could be misidentified as well,” Gross said.