As Cities Ban Face Recognition, Body-Cam Firm Axon Also Nixes It

This week’s decision by police equipment manufacturer Axon to forgo facial recognition software in its body cameras follows a string of other actions against the use of face-matching technology by public agencies. They include a pioneering ban in San Francisco last month and another passed Thursday by the city council of Somerville, MA.

What makes Axon’s new limit on law enforcement’s access to facial recognition tools unusual is that the decision was made by a private technology company whose paying customers are primarily police departments.

Over the past few years, Congress and local government bodies have been weighing the potential harms to society from certain technologies, such as the lack of data privacy protections in an era of social media and proliferating connected devices and cameras. Facial recognition is the latest technology to come under the microscope, amid fears that mass surveillance will lead to a police state in which government intrudes into the daily lives of innocent citizens and mistakenly identifies members of minority groups as criminals.

The creators of tech products have rarely taken the lead in guarding against such unintended consequences of their inventions before putting them on the market. Amazon, for example, has been making the case that its Rekognition facial recognition service is being used appropriately for benign purposes, such as fighting human trafficking. The company has resisted shareholders’ moves to limit the sale of the product to public agencies, and their demands that Amazon ask an independent commission to evaluate its impact on civil rights, as The New York Times reported.

Scottsdale, AZ-based Axon decided to seek out that kind of guidance.

Axon, formerly named Taser International, has drawn scrutiny from civil rights groups that fear its body cameras and other police IT tools could be used to worsen discriminatory law enforcement practices. The company apparently remained open to hearing such concerns as it planned its future product lineup. Last year, Axon set up an outside AI and Policing Technology Ethics Board to weigh in on its potential use of facial recognition capabilities and other artificial intelligence functions in new product offerings.

That ethics board concluded in a report this month that the use of facial recognition software with body-worn police cameras was “a complete non-starter.” It found that the technology isn’t yet reliable enough to identify individuals, and is particularly inaccurate when it comes to women, young people, and people of non-white ethnicities. These disparities “would only perpetuate or exacerbate the racial inequities that cut across the criminal justice system,” the board wrote.

The technology’s failings could be worse under law enforcement conditions such as a police pursuit, with shaking body cameras and poor lighting, according to the board report. Such failings might be relatively unimportant if the purpose is to find a missing elder in a crowd, the board says. But real-time face recognition to flag a suspect could “prime officers to perceive individuals as more dangerous than they really are and to use more force than the situation requires.”

Axon announced Thursday it would accept the board’s conclusions and would not equip its body cameras with face-matching technology for the time being. (Face matching is the process of matching a face to a specific individual’s image in a database, such as a set of mug shots.)
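
To make that definition concrete, here is a minimal sketch of how face matching generally works, using the open-source face_recognition Python library rather than Axon’s software; the file names and the 0.6 threshold are illustrative assumptions, not details from the company. Each face is reduced to a numeric encoding, and two faces are declared a match when the distance between their encodings falls below a threshold.

```python
# Illustrative only: a generic face-matching sketch using the open-source
# face_recognition library, not Axon's system. File names are hypothetical.
import face_recognition

# Encode one known face (e.g., a mug shot) and one unknown face.
known_image = face_recognition.load_image_file("mugshot.jpg")
unknown_image = face_recognition.load_image_file("camera_frame.jpg")

known_encodings = face_recognition.face_encodings(known_image)
unknown_encodings = face_recognition.face_encodings(unknown_image)

if known_encodings and unknown_encodings:
    # Each encoding is a 128-dimensional vector; lower distance means
    # more similar. The library's conventional match threshold is 0.6.
    distance = face_recognition.face_distance(
        known_encodings, unknown_encodings[0])[0]
    print(f"distance={distance:.3f}, match={distance < 0.6}")
else:
    print("No face found in one of the images.")
```

That fixed threshold is where the board’s accuracy concerns bite: if the underlying model encodes some demographic groups less distinctly, the same cutoff produces more false matches for those groups.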

“Current face matching technology raises serious ethical concerns. In addition, there are technological limitations to using this technology on body cameras,” Axon CEO Rick Smith says in a company blog post announcing the decision. But Axon will work with the ethics board as it continues research “to better understand and solve for the key issues identified in the (board) report, including evaluating ways to de-bias algorithms as the board recommends,” Smith writes.

The company didn’t rule out the use of facial recognition with its body cameras in the future. Its ethics board also plans to reconsider the issue as the technology improves. Axon currently uses face detection software to scan police videos for the faces of bystanders so it can blur them out to protect their privacy before the videos are released to the public.
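
That detect-and-blur step is far simpler than identification, which is why Axon can ship it today. Below is a rough sketch of the general technique, using OpenCV’s stock Haar-cascade face detector rather than anything Axon-specific; the file names are hypothetical placeholders.

```python
# Illustrative only: generic detect-and-blur redaction with OpenCV's
# built-in Haar-cascade face detector, not Axon's software.
import cv2

frame = cv2.imread("frame.jpg")  # hypothetical video frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Overwrite each detected face region with a heavily blurred copy.
    frame[y:y + h, x:x + w] = cv2.GaussianBlur(
        frame[y:y + h, x:x + w], (51, 51), 0)

cv2.imwrite("frame_redacted.jpg", frame)
```

Note the distinction the article turns on: detection only asks whether a region contains a face, while matching asks whose face it is. The first is tractable enough for production redaction tools; the second carries the accuracy and bias risks the ethics board flagged.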

Axon’s eleven-member ethics board wasn’t fully independent—Axon picked the first members, and paid each member as much as $15,000 a year plus travel expenses to attend meetings, where Smith and other Axon executives also participated in the discussions, according to the board’s report.

Even so, the board, which was made up of experts in fields such as AI, law enforcement, privacy, public policy, and civil rights enforcement, says it felt free to push back on some of the early assumptions the company brought to the table.

Axon originally held that “it could not (and should not) dictate customer policies, nor patrol misuse of its products,” once they were in the hands of police agencies and other buyers, according to the ethics board report.

But the ethics board is urging Axon and other companies to design body cameras so they can only be used for legitimate purposes, and equip them to block buyers from customizing the equipment to achieve illicit goals—such as tweaking algorithms to make facial recognition results more biased.

As a global company seeking new markets, Axon should consider how its products could be used under a variety of governmental regimes and legal systems, the board advised the company.

Axon hasn’t announced any commitments on that score. But so far, the board says, “These conversations have been very productive.”

The members of Axon’s ethics board include Barry Friedman, director of the Policing Project at New York University School of Law; Chief Vera Bumpers of the Houston Metro Police Department; Stanford privacy researcher Tracy Ann Kosa, an adjunct faculty member at the Seattle University School of Law; community organizer Mecole Jordan, who has focused on police reform and racial equity; former Seattle police chief and lawyer Kathleen O’Toole; Christy Lopez, co-leader of Georgetown University’s Program on Innovative Policing; and University of Oxford research fellow Miles Brundage, lead author of the recent report, “The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation.”

Axon’s ethics board says its review process is ongoing as the company continues its research into facial recognition and other AI capabilities.

Author: Bernadette Tansey

Bernadette Tansey is a former editor of Xconomy San Francisco. She has covered information technology, biotechnology, business, law, environment, and government as a Bay Area journalist. She has written about edtech, mobile apps, social media startups, and life sciences companies for Xconomy, and tracked the adoption of Web tools by small businesses for CNBC. She was a biotechnology reporter for the business section of the San Francisco Chronicle, where she also wrote about software developers and early commercial companies in nanotechnology and synthetic biology.