Air Canada's argument that its AI-powered customer chatbot was solely liable for its own actions didn't hold up in civil court (thank goodness), and now the airline must refund a customer who was given incorrect information about being comped for his airfare.
The 2022 incident involved one Air Canada customer, Jake Moffatt, and the airline's chatbot, which Moffatt used to get information on how to qualify for bereavement fare for a last-minute trip to attend a funeral. The chatbot explained that Moffatt could retroactively apply for a refund of the difference between a regular ticket cost and a bereavement fare cost, as long as it was within 90 days of purchase.
SEE ALSO: Reddit has reportedly signed over its content to train AI models

But that's not the airline's policy at all. According to Air Canada's website:
Air Canada’s bereavement travel policy offers an option for our customers who need to travel because of the imminent death or death of an immediate family member. Please be aware that our Bereavement policy does not allow refunds for travel that has already happened.
When Air Canada refused to issue the reimbursement because of the misinformation mishap, Moffatt took the airline to court. Air Canada's argument against the refund included claims that it was not responsible for the "misleading words" of its chatbot. Air Canada also argued that the chatbot was a "separate legal entity" that should be held responsible for its own actions, claiming the airline is also not responsible for information given by "agents, servants or representatives — including a chatbot." Whatever that means.
"While a chatbot has an interactive component, it is still just a part of Air Canada’s website," responded a Canadian tribunal member. "It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."
The first case of its kind, the Canadian tribunal's decision may have down-the-road implications for other companies adding AI- or machine-learning-powered "agents" to their customer service offerings.
Topics: Artificial Intelligence