TL;DR – Axon Facial Recognition Pilot
- The gist: Axon has deployed live facial recognition on police body cameras for the first time, ending a six-year self-imposed ban with a Canadian pilot.
- Key details: The “Silent Mode” trial equips up to 50 officers with cameras that passively scan faces against a strict watch list of 7,065 individuals flagged for safety risks or outstanding warrants.
- The conflict: While Axon frames this as field research, Alberta’s privacy commissioner contends the initiative launched without necessary regulatory approval or completed impact assessments.
- How it works: Officers receive no real-time alerts; matches are reviewed retrospectively at headquarters to preserve human oversight and prevent immediate field confrontations.
Ending a six-year moratorium, Axon Enterprise has deployed facial recognition on police body cameras for the first time in a live operational setting. The pilot, launched this week with Canada’s Edmonton Police Service (EPS), marks a significant shift for the dominant US law enforcement technology provider.
The Scottsdale-based company develops weapons and technology for military, law enforcement, and civilian markets, most notably the Taser, a line of electroshock devices designed to temporarily incapacitate people.
Operating in “Silent Mode,” the body cam system scans faces against a watch list of over 7,000 individuals flagged for safety risks or warrants. Yet the rollout has triggered immediate friction, with Alberta’s privacy commissioner asserting the initiative launched without necessary regulatory approval.
Described by Axon as “field research,” this test challenges ethical boundaries the company established in 2019 to prevent real-time mass monitoring.
From Moratorium to ‘Field Research’: The Strategic Pivot
The live operational test that ends Axon’s moratorium is confined to a single agency, the Edmonton Police Service (EPS) in Alberta, Canada. The company characterizes the pilot not as a commercial rollout but as “early-stage field research.”
Rick Smith, Axon’s founder and CEO, framed the initiative as a necessary step to validate safety protocols before broader adoption. “This is not a launch. It’s early-stage field research focused on understanding real-world performance, operational considerations, and the safeguards needed for responsible use.”
Operational parameters for the trial are strictly bounded: beginning December 3 and running through December 31, 2025, up to 50 officers will use the technology during their shifts.
Unlike the mass scraping models used by controversial firms like Clearview AI, which have faced legal challenges globally, the system operates on a closed loop. It matches faces only against a predefined “watch list” controlled by the agency.
To preempt concerns about scope creep, the Edmonton Police Service has disclosed exact figures: the watch list contains 7,065 individuals, comprising 6,341 people flagged for “safety risks” and 724 with active warrants for serious crimes such as murder or aggravated assault.
The initiative represents a testing ground for Axon’s updated ethical framework: the company aims to prove that biometric tools can coexist with privacy rights when governed by strict oversight. Defining success for the program involves more than technical accuracy:
“Success at this stage is not a product; it is proving that the technology can provide real benefits to community safety with safeguards that deliver very low rates of harmful error.”
Smith suggested that international testing is a prerequisite for eventual US deployment.
“By testing in real-world conditions outside the U.S., we can gather independent insights, strengthen oversight frameworks, and apply those learnings to future evaluations, including within the United States.”
Regulatory Collision: Launching Without Approval
Despite Axon’s emphasis on responsibility, the rollout has immediately triggered a conflict with provincial regulators. Diane McLeod, the Information and Privacy Commissioner of Alberta, has confirmed that her office has not yet approved the Privacy Impact Assessment (PIA) submitted by the EPS.
While the police service argues that the pilot is merely a proof-of-concept trial that does not require prior regulatory sign-off, privacy advocates contend this bypasses critical oversight mechanisms.
Compounding the friction is the secrecy surrounding the AI model itself. Axon has admitted to using a third-party vendor for the algorithm but refuses to name the provider.
Barry Friedman, a law professor and former chair of Axon’s AI Ethics Board, criticized this lack of public engagement, saying:
“A pilot is a great idea. But there’s supposed to be transparency, accountability… None of that’s here. They’re just going ahead.”
Central to the EPS defense are the operational mechanics of the trial. Officials argue that because officers receive no real-time data, the risk of an immediate, algorithm-driven confrontation or escalation is eliminated. The official Edmonton Police Service announcement outlines the specific operational parameters:
“Starting Wednesday, Dec. 3, 2025, up to 50 police officers who are currently using BWV cameras will begin to use facial recognition-enabled BWV cameras on their shifts for the remainder of the month.”
“This Proof of Concept will test the technology’s ability to work with our database to make officers aware of individuals with safety flags and cautions from previous interactions. It also includes individuals who have outstanding warrants for serious crimes, such as murder, aggravated assault and robbery.”
“Officers will conduct their duties as usual. When these body-worn cameras are actively recording, the facial recognition feature will run automatically in what’s called ‘Silent Mode.’ Officers won’t get any alerts or notifications about facial resemblance while on duty.”
This “Silent Mode” architecture is designed to insulate officers from algorithmic bias during interactions. By withholding match data until a post-shift review, the agency claims it preserves the “human in the loop” requirement essential for ethical policing.
Kurt Martin, Acting Superintendent of the Edmonton Police Service, emphasized the agency’s intent to balance utility with civil liberties.
“We really want to respect individuals’ rights and their privacy interests.”
Inside ‘Silent Mode’: Technical Safeguards and Limitations
Architecturally, the Edmonton pilot differs significantly from the “Live Alert” systems currently deployed in London and other UK jurisdictions. In those environments, officers receive immediate notifications of potential matches, a practice that stricter facial recognition laws increasingly seek to restrict.
In Edmonton, the facial recognition engine runs passively in the background. Officers on the street are not notified of matches, and all “resemblance notifications” are processed retrospectively by a dedicated unit at police headquarters.
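To make the architectural contrast concrete, the sketch below shows how a match might be routed under each mode. The names, types, and structure are hypothetical illustrations, not Axon’s actual design.

```python
from enum import Enum, auto

class RecognitionMode(Enum):
    LIVE_ALERT = auto()  # UK-style deployments: notify the officer immediately
    SILENT = auto()      # Edmonton pilot: no field notification at all

def route_match(mode: RecognitionMode, match_event: dict,
                review_queue: list, device_alerts: list) -> None:
    """Route a potential watch-list match according to the active mode."""
    if mode is RecognitionMode.LIVE_ALERT:
        # Live systems push the match straight to the officer's device,
        # where it can shape a street-level encounter within seconds.
        device_alerts.append(match_event)
    else:
        # Silent Mode never signals the officer; a dedicated unit at
        # headquarters reviews the queued matches after the shift.
        review_queue.append(match_event)
```

In this framing, the ethical distinction reduces to which queue a match lands in: one path can trigger an immediate stop, the other guarantees a human review before any action is taken.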
Targeting only the most severe threats is a core component of the pilot’s design. Ann-Li Cooke, Axon’s Director of Responsible AI, clarified the scope of the watch list.
“We really want to make sure that it’s targeted so that these are folks with serious offenses.”
That retrospective review process enforces the “human in the loop” requirement, designed to prevent algorithmic errors from leading to wrongful field detentions.
Axon says it engineered the system with a “privacy-first” logic that explicitly prioritizes precision over recall. In technical terms, the algorithm is tuned to minimize false positives – instances where an innocent person is flagged as a suspect – even if this conservative threshold results in missing actual targets. This configuration is a deliberate technical hedge against the civil rights risks often associated with real-time biometric scanning.
Functionally, the system acts as a high-speed filter rather than a dragnet. It is programmed to scan strictly for dangerous offenders or missing persons while ignoring the broader public. Any facial scan that does not generate a high-confidence match against the localized watch list is instantaneously purged from the system, preventing the accumulation of biometric data on citizens not involved in criminal investigations.
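As a rough illustration of that logic, the sketch below screens a single face embedding against a local watch list using cosine similarity. The threshold value, function names, and data layout are assumptions for illustration, not Axon’s disclosed implementation.

```python
import numpy as np

# Hypothetical, deliberately high threshold: tuning for precision over
# recall means borderline resemblances are dropped rather than flagged.
MATCH_THRESHOLD = 0.92

def screen_face(embedding: np.ndarray,
                watchlist: dict[str, np.ndarray]) -> str | None:
    """Compare one face embedding against the local watch list.

    Returns a watch-list entry ID only on a high-confidence match;
    otherwise returns None and the caller purges the embedding.
    """
    best_id, best_score = None, -1.0
    for entry_id, ref in watchlist.items():
        # Cosine similarity between the probe and reference embeddings.
        score = float(np.dot(embedding, ref) /
                      (np.linalg.norm(embedding) * np.linalg.norm(ref)))
        if score > best_score:
            best_id, best_score = entry_id, score

    if best_id is not None and best_score >= MATCH_THRESHOLD:
        return best_id
    # No high-confidence match: the biometric template is discarded,
    # accepting missed targets (lower recall) for fewer false positives.
    return None
```

Raising the threshold shrinks the pool of false positives at the direct cost of recall, which is precisely the trade-off the company describes.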
Axon also says it has committed to a “very low tolerance” for misidentification, underscoring that missed detections are the accepted cost of shielding innocent citizens from false matches.
Acting Superintendent Martin reinforced that technology remains subservient to officer judgment: “This technology will not replace the human component of investigative work.” Data retention policies for the pilot are similarly strict: non-matching biometric data is discarded immediately after processing, while the underlying video footage is retained per standard EPS policy.
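A minimal sketch of that dual disposition rule, using hypothetical labels rather than EPS’s actual configuration:

```python
def scan_disposition(matched: bool) -> dict[str, str]:
    """Decide what survives a single facial scan (illustrative only)."""
    return {
        # The derived biometric template survives only when it matched,
        # and then only long enough for the headquarters review.
        "face_embedding": "hold_for_hq_review" if matched else "purge_immediately",
        # Raw body-cam video follows ordinary evidence retention and is
        # unaffected by the facial recognition feature.
        "video_footage": "standard_eps_retention",
    }
```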
Market Realities: Why Axon is Moving Now
The timing of this pilot coincides with significant financial and competitive pressures facing the Scottsdale-based technology giant.
Axon’s stock has declined roughly 30% over the trailing month (November to December 2025), creating urgency to demonstrate new revenue avenues.
Competitors are already encroaching on the biometric space. Motorola Solutions faces class-action litigation over the biometric privacy implications of its “FaceSearch” capabilities, despite its public claims of abstaining from proactive identification.
Global normalization is also a factor. With the UK Home Office aggressively expanding live facial recognition, the North American market appears increasingly like an outlier in its resistance. Yet deploying these tools remains contentious.
Temitope Oriola, a criminology professor at the University of Alberta, described the city’s role in this global experiment. “Edmonton is a laboratory for this tool. It may well turn out to be an improvement, but we do not know that for sure.”
Ultimately, the goal is entry into the United States market, where police demand for biometric tools remains high despite a patchwork of local bans in cities like San Francisco and Boston.
Securing public buy-in, however, requires moving beyond theoretical promises to demonstrable results. Legal scholars and ethicists argue that the profound risks associated with biometric surveillance, ranging from privacy erosion to potential bias, are too high to justify deployment without compelling evidence of efficacy.
The burden of proof now rests on law enforcement to demonstrate that these tools provide clear, quantifiable safety benefits that significantly outweigh their societal costs.

