AC forced to pay man who received incorrect information from its chatbot
#1
Original Poster
A FlyerTalk Posting Legend
Join Date: Sep 2012
Location: SFO
Programs: AC SE MM, BA Gold, SQ Silver, Bonvoy Tit LTG, Hyatt Glob, HH Diamond
Posts: 45,193
https://bc.ctvnews.ca/air-canada-s-c...take-1.6769454
My favorite quote: There is no reason why Mr. Moffatt should know that one section of Air Canada's webpage is accurate, and another is not.
#2
Join Date: Oct 2015
Location: Canada
Programs: Aeroplan E50/MM, HH gold, Nat Exec Elite, Kimpton Karma
Posts: 2,467
https://bc.ctvnews.ca/air-canada-s-c...take-1.6769454
My favorite quote: There is no reason why Mr. Moffatt should know that one section of Air Canada's webpage is accurate, and another is not.
This is priceless. Thanks for sharing.
My recollection from the one time I used a bereavement fare was they required a copy of the death certificate. Not available prior to travel since the funeral was within 24 hrs of death.
#3
Join Date: Oct 2022
Location: Ottawa, Canada + Edinburgh, Scotland
Programs: AC SE, Star Alliance Gold
Posts: 992
https://bc.ctvnews.ca/air-canada-s-c...take-1.6769454
My favorite quote: There is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.
Last edited by flyingcrooked; Feb 15, 2024 at 7:16 am
#4
Join Date: Jan 2010
Location: YYZ
Programs: AC SE 100K MM; Marriott Lifetime Titanium, Avis Presidents Club
Posts: 1,587
https://bc.ctvnews.ca/air-canada-s-c...take-1.6769454
My favorite quote: There is no reason why Mr. Moffatt should know that one section of Air Canada's webpage is accurate, and another is not.
In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.
It's not us, it's our chatbot, you should be suing it!
Priceless!
#5
FlyerTalk Evangelist
Join Date: Jan 2002
Location: Canada
Programs: UA*1K MM SK EBG LATAM BL AC*E50
Posts: 23,515
That is a remarkable story. One has to wonder how clueless and morally bankrupt an organization is to spend months arguing with this guy, rather than just acknowledging the obvious error on its part and keeping its word to him. I wonder if the good people working for AC get worn down by this. Like, doesn't the human being who sent him all the "go pound sand" emails see that the guy is right, and despair at having to perform the grim task of gaslighting him for months?
#6
He should've sued the bot, not AC. Clearly.
#7
Join Date: Sep 2011
Location: Ideally YOW, but probably not
Programs: AC SE*MM
Posts: 1,923
Yet again, AC lawyers are reaching conclusions the technology doesn't support, and punishing customers for it. Of course the chatbot is some third-party tool ("separate legal entity") that AC buys, but it is trained on text that AC provides to that chatbot company. Probably some out-of-date process documentation led it to provide incorrect information. I can't imagine sitting there with a straight face and claiming that this was not AC's responsibility. Brutal.
#8
Join Date: Jan 2017
Location: Halifax
Programs: AC SE100K, Marriott Lifetime Platinum Elite. NEXUS
Posts: 4,702
I have no idea what a "Civil Resolution Tribunal" is, but it sounds like small claims court. Here, it would be an adjudicator, not a judge. And not allowed to make punitive judgements.
#9
Join Date: Oct 2013
Location: YOW
Programs: AC SE, FOTSG Platinum
Posts: 5,895
That is a remarkable story. One has to wonder how clueless and morally bankrupt an organization is to spend months arguing with this guy, rather than just acknowledging the obvious error on its part and keeping its word to him. I wonder if the good people working for AC get worn down by this. Like, doesn’t the human being who sent him all the “go pound sand” emails see that the guy is right, and despair at having to perform the grim task of gaslighting him for months?
One wonders how much they spent trying to avoid paying Mr. Moffatt $650.
On the other hand, if AC wants to stick to the idea that customers are stuck with whatever its chatbots say, I'm sure we can find the airfare equivalent of a $1 truck sooner or later...
#10
Join Date: Jan 2019
Posts: 211
#11
Join Date: Aug 2010
Location: Formerly Box 350, Boston Mass, Oh two one three four. Now near Beverly Hills 90210
Programs: Loyal Order of Water Buffalos
Posts: 4,108
Considering the cost to AC, who thinks there is ANY chance they will update the 'Bot? I expect they have made tens of thousands of dollars (even CAD can add up) from folks who accepted the "fooled ya!" reply. And will continue to do so.
#12
Join Date: Aug 2013
Location: YVR - MILLS Waypoint (It's the third house on the left)
Programs: AC*SE100K, wood level status in various other programs
Posts: 6,436
Air Canada was trying to play a clever game here.
Effectively, they asserted that the chatbot was acting as an agent, and that the agent lacked authority. In basic contract law, an agent who purports to make a contract on behalf of a principal, but who in fact has no authority to do so, is liable to the other party (in this case Mr. Moffatt).
For a variety of reasons, the BC CRT adjudicator was having none of that, and said that Air Canada had not exercised reasonable care to ensure the information provided by the chatbot was accurate, despite also serving up a link to the actual policy. Sloppiness and laziness on AC's part, which will no doubt be mitigated by some kind of an opt-in disclaimer pop-up in the future. It's interesting that this laziness and sloppiness extended to the hearing itself, as noted by this gem in section 32 of the order:
Air Canada is a sophisticated litigant that should know it is not enough in a legal process to assert that a contract says something without actually providing the contract.
But expect to see more crap just like this in the future, as more and more entities take the easy way out and abdicate responsibility to AI-based tools for 'customer service'.
#13
Join Date: Apr 2016
Posts: 653
I'm sorry, I did not understand your message. Please try rephrasing your message. Short sentences work best while I'm still learning.
#14
Moderator, Air Canada; FlyerTalk Evangelist
Join Date: Feb 2015
Location: YYC
Programs: AC SE MM, FB Plat, WS Plat, BA Silver, DL GM, Marriott Plat, Hilton Gold, Accor Silver
Posts: 17,244
#15
Join Date: Jul 2022
Posts: 198
Air Canada was trying to play a clever game here.
Effectively, they asserted that the chatbot was acting as an agent, and that the agent lacked authority. In basic contract law, an agent who purports to make a contract on behalf of a principal, but who in fact has no authority to do so, is liable to the other party (in this case Mr. Moffatt).
For a variety of reasons, the BC CRT adjudicator was having none of that, and said that Air Canada had not exercised reasonable care to ensure the information provided by the chatbot was accurate, despite also serving up a link to the actual policy. Sloppiness and laziness on AC's part, which will no doubt be mitigated by some kind of an opt-in disclaimer pop-up in the future. It's interesting that this laziness and sloppiness extended to the hearing itself, as noted by this gem in section 32 of the order:
Air Canada is a sophisticated litigant that should know it is not enough in a legal process to assert that a contract says something without actually providing the contract.
But expect to see more crap just like this in the future, as more and more entities take the easy way out and abdicate responsibility to AI-based tools for 'customer service'.