Airline Ordered To Pay Passenger Attending Funeral After Chatbot Lies About Fare

ChatGPT and other large language models are incredibly useful for a wide variety of tasks, but acting as a customer service representative on behalf of a company is not one of them. Whether the models are selling Chevy Tahoes for $1 or hallucinating information about a shipping company, this technology is simply not built with the relevant risks in mind, nor does it ship with the guardrails these tasks require. Air Canada has just learned this lesson the hard way, following a recent ruling by a Canadian tribunal.

On Valentine’s Day, a small claims decision was published on Canada’s Civil Resolution Tribunal website in the case of Moffatt v. Air Canada. For context, in November 2022, Jake Moffatt’s grandmother passed away, and he needed to fly to the funeral. While shopping around for airfare, he used a chatbot on Air Canada’s website, which suggested that he could “apply for bereavement fares retroactively.” That turned out not to be the case: Air Canada would not apply bereavement fares retroactively. Moffatt therefore claimed that Air Canada owed him a partial refund of $880, the claimed difference between the regular fare and the alleged bereavement fare.


Of course, Air Canada told Moffatt that the chatbot had misled him and that it could not provide a retroactive fare refund. Further, Air Canada argued that “it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot.” Tribunal Member Christopher C. Rivers, who wrote the decision, noted that “in effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.”

That argument did not hold up, and the Tribunal sided with Moffatt. Air Canada has been ordered to pay Mr. Moffatt a total of $812.02, which includes damages, interest, and tribunal fees. This ruling also sets a rather interesting precedent regarding chatbots and the information they provide. Perhaps we will see not only more cases of the same nature, such as people trying to get $1 cars, but also more careful implementations of chatbots and large language models.