After its chatbot hallucinates and lies, Air Canada is sued and loses the case

admin | Last update: 19 February 2024

If a chatbot lies to you, it is not its fault, it is the company that “employed” it that is responsible. So give the poor chatbot a bit of a break (but sue its employer and go to court, if necessary)!

A few years ago, a man named Jake Moffatt asked Air Canada's chatbot a question and received flatly wrong information that cost him time and money. He took the matter to court, and the decision has now gone in his favor (via Crushable).

The story begins with Moffatt, who was trying to find out how to qualify for a bereavement fare for a last-minute trip to attend a funeral. Air Canada's chatbot told him he could retroactively request a refund of the difference between the cost of a regular ticket and the bereavement fare, provided he applied within 90 days of purchase.

However, Moffatt later learned that this was completely false. The airline's actual policy, as stated on its website, reads: “Air Canada's Bereavement Travel Policy provides an option for our customers who must travel due to the imminent death or death of an immediate family member. Please note that our bereavement policy does not allow refunds for trips already taken.”

When Air Canada refused to provide the refund its chatbot had promised, Moffatt took the airline to court. Air Canada tried to argue that it was not responsible for its chatbot's “misleading comments” and that the chatbot was a “separate legal entity” that should be held responsible for its own actions, claiming the airline was not liable for information provided by “agents, servants or representatives — including a chatbot.”

Good try.

“Although a chatbot has an interactive component, it is only part of the Air Canada website,” a member of the Canadian tribunal wrote in the decision. “It should be obvious to Air Canada that it is responsible for all information on its website. It doesn’t matter whether the information comes from a static page or a chatbot.”

Does this mean that tricking a given company's chatbot into disclosing false information could result in easy money through a lawsuit? Interesting times ahead.