LOGOS Bound
Reason Binding Structures


When AI Hallucinations Become Public Decisions

Our analysis of West Midlands Police's use of AI in an operational decision

Executive Summary

On 6 November 2025, West Midlands Police banned travelling supporters from a Europa League fixture on the basis of intelligence that included a fabricated football match generated by Microsoft Copilot. The Chief Constable twice denied AI involvement before Parliament, then admitted the error. He resigned. The IOPC is investigating.¹

This paper argues that the failure was not technological but architectural — a failure of governance, not of AI. Five structural failures are identified: no verification protocol, no provenance marking, no competence framework, no decision audit trail, and no institutional honesty. A seven-principle governance framework is proposed, together with five implementation tools (Appendices A–E).


Key findings:

  • The same AI tool (Microsoft Copilot) is deployed across hundreds of UK public bodies through Microsoft 365 enterprise licences, largely without governance frameworks
  • Existing statutory obligations — PACE 1984, the Equality Act 2010, the Human Rights Act 1998, UK GDPR, and administrative law — already require governance of AI outputs but are not being applied
  • The Bridges v South Wales Police [2020] EWCA Civ 1058 judgment confirms that AI use by public authorities is subject to equality duties
  • The real risk is not spectacular fabrication but quiet, plausible errors accumulating across thousands of decisions affecting millions of people
  • We provide a framework for future governance


The cost of governance is low. Provenance marking is procedural. Verification is existing professional discipline applied to a new source. Training requires hours, not infrastructure. Audit trails require logging, not system rebuilds. These are governance refinements, not technology investments.


This is not an AI problem. It is a governance gap.

Analysis of the West Midlands Police Copilot failure

logos_bound_wmp_governance_v7 (pdf)

Download

Copyright © 2026 Logos Bound - All Rights Reserved.


