
Air Canada ordered to refund customer after chatbot provides incorrect information

Feb 19, 2024 Hi-network.com

Air Canada has been ordered by a civil court to refund a customer who received incorrect information about airfare refunds from its AI-powered chatbot. The incident occurred in 2022, when the customer, Jake Moffatt, used the chatbot to enquire about a bereavement fare for a last-minute trip to attend a funeral.

The chatbot provided Moffatt with incorrect information, stating that he could retroactively apply for a refund within 90 days of purchase. However, Air Canada's website clearly states that its bereavement travel policy does not allow refunds for travel that has already taken place.

When Air Canada refused to issue the reimbursement despite the misinformation provided by its chatbot, Moffatt took the airline to court. Air Canada argued that the chatbot was a separate legal entity and should be held solely responsible for its own actions, and that the airline should not be held liable for information provided by the chatbot or other representatives.

However, the court rejected Air Canada's argument and ruled that the airline is responsible for all the information on its website, whether it comes from a chatbot or a static page. The court found that the chatbot, while interactive, is still part of Air Canada's website, and that it should be obvious to the airline that it is accountable for all the information presented there.

Why does it matter?


This decision is the first of its kind in a Canadian court and may have implications for other companies using AI or machine-learning-powered customer service agents. It emphasises that companies are responsible for the accuracy and reliability of the information provided by their AI systems.

Hot Tags: Artificial Intelligence, Consumer protection
