The summer of 2025 exposed a predictable but preventable failure in how we govern automated license plate readers. A Texas sheriff ran a nationwide search of Flock Safety’s network to locate a woman after a family reported she had self-administered an abortion. The query reached camera networks in states where abortion remains protected and triggered an Illinois review into whether local police broke state law by sharing plate data across jurisdictions.

This is not an isolated PR problem for a single vendor. It is a systems problem created by design choices, contract structures, and operational defaults that make broad, cross-jurisdictional searches easy and accountability slow. Public reporting and state inquiries show three concrete failures: permissive national lookup features that let agencies query far beyond their borders; insufficient guardrails requiring only a freeform "reason" field rather than verified legal authorization; and weak oversight of how vendors implement or enforce state privacy rules.

Flock’s initial responses matter but do not fix the root causes. The company says it blocked dozens of out-of-state agencies from accessing Illinois data and introduced keyword flagging to block searches that include terms such as "abortion" and "immigration." Flock also issued public clarifications about specific incidents, but keyword filters and post-hoc blocks are reactive and brittle compared with durable policy and technical controls that prevent misuse in the first place.

We need to be precise about the harms. ALPR networks collect location metadata on virtually every vehicle that passes a camera. When those records can be searched across states without a validated legal basis, the databases become de facto trackers of people seeking lawful health care or traveling across state lines. That risk is not hypothetical: it was raised by the Illinois inquiry and by the public reporting that prompted it.

What local agencies and vendors must do today

1) Immediately revoke or tightly constrain any national lookup feature. If an agency needs out-of-state data, it should be required to make a documented, auditable legal request and to route that request through a human-reviewed interagency process. The convenience of a cross-jurisdictional search box should not replace warrants, mutual-aid agreements, or other lawful processes.

2) Replace freeform reason fields with structured authorization checks. Require searchable reason codes that map to statutes or open warrants. Build automated blocks that stop queries that do not match a stored legal justification. Logs must be immutable, time-stamped, and retained under an independent custodian so auditors can reconstruct who searched what and why.
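
To make this concrete, here is a minimal sketch of what a structured authorization gate with a tamper-evident log could look like. The reason codes, field names, and class are illustrative assumptions, not any vendor's actual schema; a real deployment would ship log entries to an external custodian rather than hold them in memory.

```python
import hashlib
import json
import time

# Hypothetical reason codes mapped to the legal basis each one requires.
# These names are illustrative, not a real vendor's schema.
REASON_CODES = {
    "STOLEN_VEHICLE": "active stolen-vehicle record number",
    "AMBER_ALERT": "active AMBER alert identifier",
    "WARRANT": "warrant or court-order identifier",
}

class QueryGate:
    """Blocks queries lacking a structured reason code plus a stored
    legal reference, and hash-chains every attempt into an append-only log."""

    def __init__(self):
        self._log = []              # append-only; each entry chains to the last
        self._prev_hash = "0" * 64  # genesis value for the hash chain

    def authorize(self, user, plate, reason_code, legal_ref):
        # Deny by default: the code must be known AND a legal reference supplied.
        allowed = reason_code in REASON_CODES and bool(legal_ref)
        entry = {
            "ts": time.time(),
            "user": user,
            "plate": plate,
            "reason_code": reason_code,
            "legal_ref": legal_ref,
            "allowed": allowed,
            "prev": self._prev_hash,
        }
        # Chaining each entry's hash to the previous one makes after-the-fact
        # edits detectable, which is the property auditors need.
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = digest
        self._prev_hash = digest
        self._log.append(entry)  # in production: stream to an external custodian
        return allowed
```

The point of the sketch is the deny-by-default shape: a freeform reason never reaches the data, and even refused attempts leave an auditable trace.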

3) Enforce stronger authentication and vendor responsibility. Vendors should require multi-factor authentication, session timeout defaults, per-user access scopes, and a contractual obligation to lock down or suspend access when audits reveal noncompliance. Technical defaults must favor least privilege.
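
A least-privilege session check is simple to express in code. The sketch below assumes a per-user scope list and a session timeout; the timeout value and scope naming are illustrative placeholders, not a vendor default.

```python
import time
from dataclasses import dataclass, field

# Illustrative default; the real value belongs in agency policy, not code.
SESSION_TIMEOUT_SECONDS = 15 * 60

@dataclass
class Session:
    user: str
    mfa_verified: bool
    scopes: set = field(default_factory=set)  # e.g. {"county:cook"}
    last_active: float = field(default_factory=time.time)

    def can_query(self, network_scope: str) -> bool:
        """Least privilege: deny unless MFA passed, the session is fresh,
        and the target network is explicitly in this user's scopes."""
        fresh = (time.time() - self.last_active) < SESSION_TIMEOUT_SECONDS
        return self.mfa_verified and fresh and network_scope in self.scopes
```

Note that the default scope set is empty: a user with no explicit grants can query nothing, which is what "defaults must favor least privilege" means in practice.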

4) Contractually mandate transparency and audit rights. Municipalities must insist that vendors provide regular, machine-readable audit logs and grant independent researchers or designated state auditors access under confidentiality protections. Contracts should include penalties for improper data sharing and explicit limits on federated or national indexing.
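
"Machine-readable" can be as simple as one JSON line per query. The record below is a hypothetical shape with illustrative field names; the contractual point is that every query exports a complete, parseable row an auditor can ingest without the vendor's help.

```python
import datetime
import json

def export_audit_record(query: dict) -> str:
    """Serialize one query event as a single NDJSON line.
    Field names are illustrative, not a real vendor's export format."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "agency_id": query["agency"],          # originating agency identifier
        "user_id": query["user"],
        "plate_queried": query["plate"],
        "reason_code": query["reason_code"],
        "legal_reference": query["legal_ref"],
        "networks_searched": query["networks"],
        "results_returned": query["hits"],
    }
    # sort_keys keeps the export stable so diffs and audits are reproducible
    return json.dumps(record, sort_keys=True)
```

A contract clause can then specify the delivery cadence and the custodian, rather than leaving "audit logs" undefined.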

Policy and legislative fixes

States that wish to protect reproductive privacy and civil liberties should codify basic guardrails for ALPRs: retention limits on raw imagery, narrow permitted uses, mandatory public notice, mandated audits, and civil remedies for misuse. Illinois’s effort to audit and block impermissible searches is a start. But statutes need to be consistent and interoperable across states; otherwise, law enforcement in restrictive states will simply look elsewhere.

Technical best practices for vendors and integrators

  • Enforce per-network access control so a searcher cannot pivot from one jurisdiction to a nationwide sweep without an auditable, human-reviewed escalation.
  • Implement query intent validation that can refuse searches containing flagged reasons until a verified legal authorization is provided.
  • Publish privacy impact assessments and threat models and invite third parties to test for access control weaknesses.
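
The first two bullets can be combined into a single validation step. This is a sketch under stated assumptions: the flagged-term list echoes the terms reporting attributes to Flock's keyword blocking, and the function signature is hypothetical, not any product's API.

```python
# Illustrative blocklist; mirrors terms cited in public reporting, not a real config.
FLAGGED_TERMS = {"abortion", "immigration"}

def validate_query(home_networks, target_networks, reason_text,
                   escalation_approved, legal_authorization):
    """Refuse cross-network sweeps without human-reviewed escalation,
    and refuse flagged reasons without verified legal authorization.
    Returns (allowed, message)."""
    # Per-network scoping: any network outside the searcher's home set
    # requires an explicit, auditable escalation approval.
    out_of_scope = set(target_networks) - set(home_networks)
    if out_of_scope and not escalation_approved:
        return False, f"escalation required for networks: {sorted(out_of_scope)}"
    # Intent validation: flagged reasons are held until a verified
    # legal authorization is attached, rather than silently dropped.
    words = set(reason_text.lower().split())
    if words & FLAGGED_TERMS and not legal_authorization:
        return False, "flagged reason requires verified legal authorization"
    return True, "ok"
```

Keyword matching alone is easy to evade, which is why the sketch pairs it with the scoping check: the harder control is that the sweep itself needs a human approval, whatever the stated reason.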

Why community oversight matters

Vendors and agencies often promise "responsible use" while building features that assume unlimited trust. That trust is misplaced. Independent oversight bodies, community review panels, and civil society audits can catch emergent abuses long before they become crises. Communities that host cameras should have standing to demand logs, redress, and opt-out options for data sharing beyond narrowly defined local public-safety uses.

Final note on tradeoffs

ALPR systems provide clear public-safety benefits when used narrowly: stolen car recovery, AMBER alerts, and targeted suspect tracking based on validated probable cause. Those uses are worth supporting, but the operational defaults must reflect the privacy costs. Design choices matter: make broad searchability an opt-in governed by higher legal bars rather than the default. Otherwise we will continue to see abuses that harm the people most vulnerable to surveillance, including people seeking reproductive health care.

If you are a procurement official, vendor, or local advocate reading this, start with three actions: demand machine-readable audit logs, require per-network access controls in contracts, and institute quarterly third-party compliance reviews. These are practical, implementable steps that reduce risk without throwing away legitimate public-safety capabilities.