Facial Recognition Tech: Challenges for Criminal Defense & Civil Liberties


The Board of Supervisors in San Francisco broke new surveillance-regulation ground a couple of weeks ago by banning city departments, including law enforcement, from using facial recognition technology. The city regulations additionally require all city departments to disclose the surveillance methods they currently use and to get approval from the Board of Supervisors for future technologies that gather or store a person’s data.

The San Francisco regulation is part of a larger body of legislation there aimed at giving the city control over surveillance technology and personal data, including license-plate readers and information gathered by third parties that use facial recognition technology.

The San Francisco International Airport and the Port of San Francisco are under federal jurisdiction, so the ban does not apply there, and facial recognition is currently in use at both locations within the city. Some airlines also routinely use facial surveillance systems when passengers board planes.

Unlike many major law enforcement agencies, San Francisco police had not deployed facial recognition, but the new rules would prevent them from deploying it in the future without approval by the Board of Supervisors. The cities of Los Angeles and San Diego are already using facial recognition technologies for law enforcement investigations.

Facial recognition is on the “hot wish list” for local and federal law enforcement agencies, but these systems currently pose challenges to civil liberties and rights. The technology is completely unregulated at the moment, most official oversight agencies have no policies about its use, and there is virtually no transparency around how it is employed.

While police agencies use the technology to identify suspects, many try to keep its use secret, and few have any policies for disclosure to the public – or to suspects. It is often difficult for defense attorneys to find out how facial recognition technologies are used, and because the technology is almost always shielded as evidence in court, it has not been subject to judicial rulings or proceedings. Suspects and defense attorneys are often unaware that police have used facial recognition in individual investigations.

The FBI routinely runs facial scans and, according to The Perpetual Line-Up, a report housed at Georgetown University, sixteen or more states allow the FBI to use facial recognition technology to compare photos with driver’s license and ID photos. The report says at least one in four state or local agencies can run searches directly or through a partner agency, and it estimates that some 117 million Americans are included in current law enforcement face recognition networks. Unfortunately, it also finds that agencies do little to ensure the systems are accurate and do not have procedures in place to ensure proper implementation of the systems.

Amazon’s facial recognition system, called Rekognition, is sold to police departments and taps into body cameras and other city systems. Amazon also sells the technology to a broad range of private-sector companies to identify objects as well as people. Rekognition has made news headlines in recent months for its inaccuracy, especially for racial and gender disparities, misidentifying women and people with dark skin at far greater rates than white males. Amazon advises law enforcement to use a “confidence threshold” score of 99% or more to make decisions – and that a human should confirm the software’s prediction to ensure civil rights are not violated. However, according to a Washington Post investigative report, officers don’t see the confidence metric when they use the system.
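To make that “confidence threshold” concrete, here is a minimal sketch of what the 99% floor means in practice, using Amazon Rekognition’s CompareFaces API via the boto3 Python library. The image file names are hypothetical placeholders, and this is an illustration of the threshold concept only, not a depiction of any agency’s actual workflow.

```python
import boto3

# Minimal sketch (hypothetical images, not any agency's real workflow):
# call Rekognition's CompareFaces and apply the 99% similarity floor
# that Amazon recommends for law enforcement decisions.
rekognition = boto3.client("rekognition")

with open("probe_photo.jpg", "rb") as probe, open("candidate_photo.jpg", "rb") as candidate:
    response = rekognition.compare_faces(
        SourceImage={"Bytes": probe.read()},
        TargetImage={"Bytes": candidate.read()},
        SimilarityThreshold=99.0,  # Amazon's recommended minimum for law enforcement use
    )

# Even above the threshold, the similarity score is a prediction,
# not an identification; a human should review every match.
for match in response["FaceMatches"]:
    print(f"Possible match at {match['Similarity']:.2f}% similarity – requires human review")

if not response["FaceMatches"]:
    print("No candidate reached the 99% similarity threshold")
```

The point of the sketch is that the threshold is simply a filter on a numeric score the software reports; if officers never see that score, as the Washington Post reporting suggests, the recommended safeguard does little in practice.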

Defense attorneys like me are concerned that police are increasingly relying on error-prone technology, built on potentially faulty algorithms, to bring criminal charges that can impact people’s lives so dramatically. This unproven technology, and the in-the-shadows practices around it by law enforcement, holds big risks for wrongful arrests, convictions, or criminal records. There currently is no process for suspects, attorneys, and courts to know that facial recognition has been a factor in identifying a suspect. Furthermore, once a person has been identified – or predicted – by software, an investigation is at risk of becoming about finding evidence to support a flawed facial prediction.

Congress has begun hearings in the House Committee on Oversight and Reform, where the lack of regulation concerns both sides of the aisle. As one representative said, no elected official has approved the use of this technology by governments or agencies, even though it affects citizens’ First and Fourth Amendment rights and due process rights.

So it seems right that we take time to look at restrictions on facial recognition use in law enforcement – and elsewhere – even as law enforcement lobbies for the benefits of facial recognition in fighting crime and aggressively moves to employ this technology.

David A. Stein is a skilled criminal defense attorney with a track record of obtaining very successful outcomes for his clients. If you have been accused of a crime or need help with any criminal matter, contact our law offices today at (949) 445-0040 for a consultation, or contact us online by email here.
