Air Canada Chatbot Hallucination (2024)
Post-Incident Forensic Analysis
Executive Brief
Air Canada's customer-support chatbot told a customer he could buy full-fare tickets and claim a bereavement fare discount retroactively, contradicting the airline's published policy. British Columbia's Civil Resolution Tribunal found the corporation liable for the chatbot's "hallucinated" policy statement, rejecting the defense that the chatbot was a "separate legal entity" responsible for its own actions.
Layered Failure Timeline
Design (Structural)
The architecture allowed the chatbot to generate policy statements in free text rather than constraining its answers to a locked, authoritative knowledge base; its response even contradicted the bereavement fare page it linked to.
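A minimal sketch of what such a constraint could look like, assuming a hypothetical `POLICY_KB` store and `answer_policy_question` helper (neither reflects Air Canada's actual system). The idea is that the bot may only quote locked policy text and must fail closed when no entry exists:

```python
# Illustrative design-layer guardrail: policy answers come only from a locked,
# versioned knowledge base, never from free-form generation.
POLICY_KB = {
    # Authoritative text maintained by the policy owner (hypothetical content).
    "bereavement_fare": (
        "Bereavement fares must be requested before travel; "
        "refunds cannot be claimed retroactively after the flight."
    ),
}

def answer_policy_question(topic: str) -> str:
    """Return only locked policy text; escalate when no authoritative entry exists."""
    snippet = POLICY_KB.get(topic)
    if snippet is None:
        # Fail closed instead of letting the model improvise a policy.
        return "I can't answer that from our published policy. Connecting you to an agent."
    return f"Per our published policy: {snippet}"
```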
Test (Operational)
Failure to stress-test the chatbot's responses to specific, high-stakes policy inquiries, such as bereavement fares, against the airline's published policies before deployment.
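One form such a stress test might take, sketched with a hypothetical `chatbot_reply` callable and invented prompts and forbidden-claim strings (not the airline's actual test suite):

```python
# Regression-check sketch for high-stakes policy intents: flag any reply that
# makes a claim the published bereavement policy forbids.
HIGH_STAKES_PROMPTS = [
    "Can I get the bereavement fare refunded after my flight?",
    "My grandmother died. Can I apply the bereavement discount retroactively?",
]

# Phrases that would contradict the published policy (illustrative only).
FORBIDDEN_CLAIMS = ["retroactive", "within 90 days", "after your flight"]

def find_policy_contradictions(chatbot_reply) -> list[str]:
    """Return the prompts whose replies contain forbidden policy claims."""
    failures = []
    for prompt in HIGH_STAKES_PROMPTS:
        reply = chatbot_reply(prompt).lower()
        if any(claim in reply for claim in FORBIDDEN_CLAIMS):
            failures.append(prompt)
    return failures
```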
Runtime / Oversight (Control)
Lack of real-time monitoring or a "human-in-the-loop" for financial or policy-related claims made by the agent.
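A rough illustration of a runtime checkpoint for such claims; `route_reply` and its patterns are assumptions made for this sketch, not the airline's actual pipeline:

```python
import re

# Runtime guard sketch: hold draft replies that make financial or policy
# commitments for human review instead of sending them to the customer.
COMMITMENT_PATTERNS = [
    r"\brefunds?\b",
    r"\breimburse\w*\b",
    r"\bbereavement\b",
    r"\bwithin \d+ days\b",
    r"\$\d+",
]

def route_reply(draft_reply: str) -> str:
    """Escalate replies containing financial or policy commitments; send the rest."""
    if any(re.search(p, draft_reply, re.IGNORECASE) for p in COMMITMENT_PATTERNS):
        return "HOLD_FOR_HUMAN_REVIEW"
    return "SEND_TO_CUSTOMER"
```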
Technical Autopsy
The incident revealed a "decoupled accountability" pattern: the chatbot's technical implementation was treated as legally and operationally distinct from the corporate policy it spoke for, leaving no single owner responsible for the accuracy of the claims it made to customers.