ACLU Cautions: Increasing Adoption of Generative AI by Law Enforcement Endangers Americans’ Civil Rights


The American Civil Liberties Union (ACLU) is raising concerns about law enforcement agencies' growing reliance on **artificial intelligence (AI)** across the United States.

In a six-page [white paper](https://www.aclu.org/documents/aclu-on-police-departments-use-of-ai-to-draft-police-reports) published on December 10, the nation's largest civil liberties organization warns that deploying generative AI technologies, such as chatbots and automated writing tools, in policing represents a significant overreach with serious consequences for civil rights. The paper focuses on **Draft One**, Axon's generative AI tool built on OpenAI's GPT-4 model, which helps officers draft police reports from body camera audio. Over the past year, numerous police departments have begun testing tools like Draft One, and their use is [growing](https://www.zdnet.com/article/police-are-using-ai-to-write-crime-reports-what-could-go-wrong/#ftag=RSSbaffb68) as cities promote generative AI as a remedy for budget and staffing shortfalls.

### The Dangers of AI in Law Enforcement

Critics argue that bringing AI into police work could compromise the integrity of the justice system. Police reports are central to judicial proceedings, shaping everything from investigations to sentencing. Legal scholars, including Andrew Guthrie Ferguson, have warned that AI-generated reports could weaken the accountability mechanisms built into traditional, human-authored reports.

“The act of composing a justification, attesting to its accuracy, and making that record accessible to other legal professionals—such as prosecutors and judges—works as a safeguard against police authority,” Ferguson wrote in one of the [first law review articles](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4897632) to examine AI-generated police reports, which the ACLU's white paper cites.

The ACLU highlights four primary concerns regarding AI-generated police reports:

1. **Accountability**: Human-authored reports require officers to explain their actions and observations in their own words, a process that AI drafting could bypass.
2. **Bias and Errors**: Generative AI tools are prone to bias and “hallucinations” (fabricated or incorrect outputs), which could misrepresent the facts of a case.
3. **Transparency**: The opacity of AI systems makes it difficult for the public to scrutinize how these tools actually work.
4. **Privacy**: Deploying AI in policing raises substantial data security and privacy concerns.

The organization argues that relying on an AI's interpretation of events, rather than an officer's own recollection and observations, erodes the fairness of the judicial process.

### A Plea for Caution

The ACLU advises that if AI is used in policing, it should enhance, not replace, human input. For example, it suggests that AI tools could transcribe officers' audio-recorded narratives, which could then be reviewed against the original recordings. This approach would preserve the human element while using AI to improve efficiency.

Despite these recommendations, AI companies continue to invest heavily in police and [military applications](https://mashable.com/article/open-ai-military-technology-pentagon-partnership) of their technologies.

“Police reports are fundamental to our justice system,” the ACLU asserts. “They play a crucial role in establishing innocence, guilt, and punishment, often acting as the sole official record of an occurrence. AI report-writing technology strips away essential human factors from police processes and is too novel, too unproven, too unreliable, too opaque, and too biased to be woven into our criminal justice system.”

As debate over AI's role in law enforcement intensifies, the ACLU's warning serves as a reminder of the risks of prioritizing technological efficiency over accountability and fairness.