(TNS) — San Francisco police investigators may have circumvented the city’s ban on facial recognition technology by building a gun case, in part, on facial recognition software used by another law enforcement agency, according to interviews and documents obtained by The Chronicle.
The revelation has raised serious questions among city officials about whether the Police Department bypassed a city law intended to curb the use of certain surveillance technologies by law enforcement and most other city agencies.
San Francisco District Attorney Chesa Boudin said he would consider dismissing the case if police can’t prove they could have identified the suspect through means other than facial recognition.
Lee Hepner, a legislative aide to Supervisor Aaron Peskin, who authored the ban, said Peskin was “troubled by what appears to be a violation of the ordinance we passed last year.” When the ordinance took effect in May 2019, only the city’s port and airport — which are federally regulated — were exempted from the ban. “We’re going to be looking into this with the city attorney’s office, and we’ll be taking every necessary step to ensure the city ordinance is complied with,” Hepner said.
The case involves an alleged illegal gun discharge earlier this month, according to the suspect’s defense attorney, Ean Vizzi. Evidence of the use of facial recognition technology came to light this week when Vizzi received information through the discovery process.
San Francisco police Sgt. Michael Andraychak said a photo taken from a San Francisco surveillance camera was used in a crime alert bulletin that asked neighboring agencies for assistance locating the suspect.
One of the agencies that received the alert was the Northern California Regional Intelligence Center, or NCRIC, a multi-jurisdictional law enforcement agency that conducts crime analyses and major investigations and coordinates counterterrorism and drug enforcement programs.
NCRIC, which uses facial recognition technology, ran the photo through its software, produced a match from its database and sent it to San Francisco police, Andraychak said.
“We didn’t use or request NCRIC,” Andraychak said. “They did this on their own.”
Andraychak said, however, that officers had already identified the suspect through other means before receiving the NCRIC results.
Andraychak said it’s common for officers from San Francisco and other local agencies to send out crime alert bulletins. He could not say how often, since the ban took effect last year, responses to those bulletins have included facial recognition results from other agencies.
The discovery materials provided to Vizzi included three pages of records or slides suggesting the use of facial recognition in the case: The first appears to be a cover page titled “Facial Recognition Results,” and below, “San Francisco Police Dept.,” the case number and “assault with firearm investigation.”
The document or slide goes on to say that NCRIC’s system compares suspect photographs against more than 500,000 photographs in databases “used by law enforcement agencies in San Mateo County.”
Another page or slide contains what appears to be a still of the suspect from a surveillance video, next to a mug shot of the defendant. In copies provided to The Chronicle, faces were redacted on both photos.
In an interview with The Chronicle, Vizzi said he has handled numerous criminal cases in which officers have been able to personally identify a suspect from photos or surveillance footage. Vizzi said he has always suspected the use of facial recognition software.
“It’s too convenient that police officers are so good at identifying photographs from surveillance videos. ... (I thought) there has to be something else going on there,” he said. “Lo and behold in this case, I see there’s something going on there.”
Vizzi said the alleged gun discharge incident occurred Sept. 9, the photographs were matched on Sept. 10 and his client was arrested Sept. 11.
Facial recognition has been credited for leading police across the country to wanted suspects.
But the technology has come under increased scrutiny from privacy advocates and others, many of whom fear a profound erosion of civil liberties. Facial recognition software and related technologies have been criticized for inaccuracies, particularly in attempting to identify minorities.
Oakland imposed its own ban on the technology shortly after San Francisco, citing bias concerns.
Mike Sena, executive director of NCRIC, said employees commonly used facial recognition technology after receiving requests from law enforcement or viewing an all-points bulletin. The software can match images pulled from security cameras or other sources to databases of mug shots.
The database does not include driver’s license photos or any types of photos other than mug shots, Sena said.
Sena said he couldn’t provide information on specific agencies or cases, but said an employee runs facial recognition searches from bulletins “quite often, (when) someone needs quick response.”
Boudin said in order to successfully prosecute cases, his office needs to “rely on the integrity of the investigations and the evidence that we are presented.”
“San Francisco prohibits the use of facial recognition software, with very narrow exceptions,” he said. “We cannot rely on evidence gathered in violation of that law.”
San Francisco Police Commissioner John Hamasaki said he will request that Police Department officials investigate to determine the scope of the problem, address any misconduct and ensure the violation won’t happen again.
“While identifying suspects is an important law enforcement concern, bending or breaking the rules to do so puts the entire investigation and prosecution in jeopardy,” he said. “Police cannot simply choose which laws to follow (and) which to ignore and still retain credibility with the communities they serve.”
©2020 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.