On Tuesday, the city of Berkeley became the fourth city in the country to ban its agencies from using facial recognition technology. While the move was largely seen as a check on surveillance overreach, internal city communications made public reveal that the city had already acquired facial recognition technology.
The ordinance, passed unanimously by Berkeley City Council members, aimed to rein in a technology still prone to errors and subject to little regulatory oversight. The ordinance aims to keep the “genie” in the “bottle,” according to District 5 Berkeley City Councilmember Sophie Hahn.
“I think of it really as proactive,” said District 4 Berkeley City Councilmember Kate Harrison, who penned the ban. “I want to make sure cameras aren’t used for unintended purposes.”
But documents obtained by data privacy proponents show that city agencies have already acquired facial recognition capabilities with a pilot camera program.
Brian Hofer, chair and executive director for the nonprofit Secure Justice, was a central architect of the city’s facial recognition ban. A California Public Records Act, or CPRA, request Hofer filed with the city turned up emails from the Berkeley Police Department, or BPD, and the city’s information technology department. The emails present a timeline showing how staff sought out and implemented, and later attempted to exempt from the ban, cameras at San Pablo Park capable of sophisticated data analysis.
A contract for Avigilon security equipment at San Pablo Park shows that the city acquired a combination of equipment and software capable of a feature called “appearance search,” which “sorts through hours of video with ease, to quickly locate a specific person or vehicle of interest.”
City spokesperson Matthai Chakko said that while city staff and BPD discussed acquiring facial recognition technology, the city neither uses facial recognition nor has the ability to do so.
Scrutiny of the city’s use of advanced surveillance technology centers on a camera system at San Pablo Park. The $35,000 security system, including six cameras, was installed in March after an August 2018 daytime drive-by shooting injured three bystanders at the park.
The cameras circumvented the public approval process required under the city’s existing surveillance ordinance, which predates the recent ban, through an exception for surveillance equipment on city-owned property.
The city purchased the camera system, described as a “pilot camera program” in a survey administered by city manager Dee Williams-Ridley, from Avigilon, a large industry player in video analytics, security cameras and video management software.
City staff did not seek out facial recognition technology, according to an email Williams-Ridley sent to council members who sit on the public safety committee. An email thread released through the CPRA, however, documents city interest in and plans to obtain a surveillance system with facial recognition capabilities.
“I have been looking into replacing the camera controller for some time now,” wrote Gregory Marwick, the city’s lead communications technician, in an October 2018 email to BPD Lt. Kevin Reece. “This new system is one we are looking into for the parks and other areas around the city. It uses facial recognition and smart AI to track people and incidents.”
Before the contract was secured, a Q&A email thread from December 2018 shows Avigilon representatives educating top-ranking city information technology and security staff on the system’s features. Avigilon indicated that the system builds artificial intelligence data models that reside both on a remote workstation and in the cameras themselves.
Avigilon also clarified to staff in the emails that “we can perform Appearance Search across multiple sites.”
According to Avigilon’s marketing materials, the appearance search feature can cross-reference old footage to help identify an individual and can use a “good quality face image” as “additional information for reinforcing search accuracy and reliability.”
An Avigilon software manual shows that the software version necessary for the appearance search was the same one the city had installed. On March 12, Marwick emailed city information technician Geraldo Guinto requesting that Guinto install the program, Avigilon Control Center 6, on his console.
During the facial recognition ordinance drafting process, information provided to city council members concerning the use of the camera system shifted multiple times. The mix-up started when Tom Ray, the city’s chief information security officer, asked Harrison to exempt the San Pablo Park cameras from the ban. Ray said the system is capable of using artificial intelligence, including lip reading and voice recognition.
A week later, Savita Chaudhary, director for the city of Berkeley’s information technology department, followed up with Harrison to roll back some of Ray’s points. Chaudhary said the system does not use facial recognition technology but rather machine learning “which provide(s) high-quality video images, which could theoretically be submitted or utilized in conjunction with facial recognition software or services.”
In an attempt to clear the air, Williams-Ridley emailed council members on the city’s public safety committee. According to Williams-Ridley’s emails, the city has “not purchased the software and services required for facial recognition,” and city staff was not seeking to use it.
Avigilon was not able to respond to questions about the surveillance system’s capabilities by press time.
Although Chakko asserted that the chosen surveillance system was a response to gun violence at San Pablo Park, released documents show how the city borrowed and used Avigilon equipment almost two weeks prior to the Aug. 18 shooting. Chakko said the decision to install cameras was the product of multiple community meetings held by Councilmember Cheryl Davila and Mayor Jesse Arreguín in the aftermath of the shooting.
BPD borrowed Avigilon cameras from the Northern California Regional Intelligence Center, or NCRIC, in preparation for an anti-Marxism protest Aug. 5. The protest saw Trump supporters pitted against counterprotesters, with tensions running high after a series of violent protests put the city of Berkeley onto a national stage of polarized political conflict.
Mike Sena, executive director for NCRIC, said the organization occasionally loans equipment to its member law enforcement agencies, of which BPD is one. Sena added that some police departments loan equipment before deciding to acquire it.
NCRIC is a counterterrorism organization that facilitates information sharing between local law enforcement member agencies and federal ones. CPRA documents also show that NCRIC contracts with Palantir, a company that works closely with U.S. Immigration and Customs Enforcement, or ICE, to help with real-time surveillance. Palantir was also the target of recent student protests on campus.
Chakko and Sena both said the equipment loaned to the city does not use facial recognition technology. During the anti-Marxism protest, signs were posted informing protesters that they were being recorded. According to Chakko, BPD did not share any video feeds from the protest with NCRIC.
“We are dealing with an explosive and potentially violent situation where people are coming to commit violence in Berkeley,” Chakko said. “If you put officers in a crowd just to monitor them, the mere presence can escalate tensions greatly, so what we’re looking to do is make sure people are safe by using the least amount of force.”
Oakland Privacy, a citizens’ coalition advocating for privacy rights, submitted a letter to the mayor and council members on Sept. 17 alleging that the lack of disclosure on the San Pablo Park security measures violated the city’s surveillance ordinance. The letter also states that in July, city staff was asked to produce a report on the capabilities of the camera system, which has yet to be delivered.
Still waiting to hear back on additional public records requests, Hofer said he plans on suing the city of Berkeley.
“I think the San Pablo Park cameras are and were a step in the direction of normalizing AI surveillance in public spaces and the commons without any individualized suspicion of wrongdoing and without consent or notification to the public,” said Tracy Rosenberg, a member of Oakland Privacy, in an email.