
AC forced to pay man who received incorrect information from its chatbot

Old Feb 14, 2024, 11:52 pm
  #1  
A FlyerTalk Posting Legend
Original Poster
 
Join Date: Sep 2012
Location: SFO
Programs: AC SE MM, BA Gold, SQ Silver, Bonvoy Tit LTG, Hyatt Glob, HH Diamond
Posts: 44,354
AC forced to pay man who received incorrect information from its chatbot

https://bc.ctvnews.ca/air-canada-s-c...take-1.6769454

My favorite quote: There is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.
canadiancow is offline  
Old Feb 15, 2024, 12:01 am
  #2  
 
Join Date: Oct 2015
Location: Canada
Programs: Aeroplan E50/MM, HH gold, Nat Exec Elite, Kimpton Karma
Posts: 2,354
Originally Posted by canadiancow
https://bc.ctvnews.ca/air-canada-s-c...take-1.6769454

My favorite quote: There is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.

This is priceless. Thanks for sharing.

My recollection from the one time I used a bereavement fare is that they required a copy of the death certificate, which was not available prior to travel since the funeral was within 24 hours of the death.
Bartolo is offline  
Old Feb 15, 2024, 12:28 am
  #3  
 
Join Date: Oct 2022
Location: Ottawa, Canada + Edinburgh, Scotland
Programs: AC SE, Star Alliance Gold
Posts: 815
Originally Posted by canadiancow
https://bc.ctvnews.ca/air-canada-s-c...take-1.6769454

My favorite quote: There is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.
That is a remarkable story. One has to wonder how clueless and morally bankrupt an organization has to be to spend months arguing with this guy, rather than just acknowledging the obvious error on its part and keeping its word to him. I wonder if the good people working for AC get worn down by this; like, doesn't the human being who sent him all the “go pound sand” emails see that the guy is right, and despair at having to perform the grim task of gaslighting him for months?

Last edited by flyingcrooked; Feb 15, 2024 at 7:16 am
flyingcrooked is offline  
Old Feb 15, 2024, 3:53 am
  #4  
 
Join Date: Jan 2010
Location: YYZ
Programs: AC SE 100K MM; Marriott Lifetime Titanium, Avis Presidents Club
Posts: 1,082
Originally Posted by canadiancow
https://bc.ctvnews.ca/air-canada-s-c...take-1.6769454

My favorite quote: There is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.
This one is pretty good too:

In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.

It's not us, it's our chatbot, you should be suing it!

Priceless!
billdokes is offline  
Old Feb 15, 2024, 4:02 am
  #5  
FlyerTalk Evangelist
 
Join Date: Jan 2002
Location: Canada
Programs: UA*1K MM SK EBG LATAM BL
Posts: 23,314
Originally Posted by flyingcrooked
That is a remarkable story. One has to wonder how clueless and morally bankrupt an organization has to be to spend months arguing with this guy, rather than just acknowledging the obvious error on its part and keeping its word to him. I wonder if the good people working for AC get worn down by this; like, doesn't the human being who sent him all the “go pound sand” emails see that the guy is right, and despair at having to perform the grim task of gaslighting him for months?
Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions

He should've sued the bot not AC. Clearly.
rankourabu is offline  
Old Feb 15, 2024, 5:35 am
  #6  
 
Join Date: Mar 2016
Programs: AC SE
Posts: 1,506
Crazy. I wish the judge had added a couple of 0s to the judgment just to punish AC for being so embarrassingly obstinate.
TheCanuckian is offline  
Old Feb 15, 2024, 5:46 am
  #7  
 
Join Date: Sep 2011
Location: Ideally YOW, but probably not
Programs: AC SE*MM
Posts: 1,827
Yet again, AC lawyers are reaching conclusions the technology doesn't support, and punishing customers for it. Of course the chatbot is some third-party tool (the "separate legal entity") that AC buys, but it is trained on text that AC provides to that chatbot company. Probably some out-of-date process documentation led it to provide the incorrect information. I can't imagine sitting there with a straight face and claiming that this was not AC's responsibility. Brutal.
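
For illustration only, and obviously nothing to do with whatever AC's vendor actually runs: here is a minimal sketch of that failure mode, where the bot answers from a stored snapshot of the airline's policy pages rather than the live site. Every name, date, and figure in it is made up.

Code:
# Hypothetical sketch, not AC's or any vendor's real system: a bot that answers
# from a stored snapshot of policy pages keeps serving whatever was true at
# ingestion time, even after the airline changes the live page.
from dataclasses import dataclass
from datetime import date

@dataclass
class PolicyDoc:
    topic: str
    text: str
    snapshot_date: date  # when this page was last copied from the airline

# Invented example content; the real policy text and dates are not these.
KNOWLEDGE_BASE = [
    PolicyDoc(
        topic="bereavement",
        text="You may apply for a reduced bereavement fare within 90 days of ticket purchase.",
        snapshot_date=date(2022, 6, 1),
    ),
]

def answer(question: str) -> str:
    """Naive keyword retrieval: returns the stored policy text, however stale it is."""
    for doc in KNOWLEDGE_BASE:
        if doc.topic in question.lower():
            # Nothing here re-checks the live webpage, so the reply can silently
            # contradict the current policy the customer is later held to.
            return doc.text
    return "Sorry, I don't know."

print(answer("How do bereavement fares work?"))

The point being: keeping that snapshot current is entirely on the airline and its vendor, not on the customer.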
RatherBeInYOW is offline  
Old Feb 15, 2024, 5:47 am
  #8  
 
Join Date: Jan 2017
Location: Halifax
Programs: AC SE100K, Marriott Lifetime Platinum Elite. NEXUS
Posts: 4,570
Originally Posted by TheCanuckian
Crazy. I wish the judge had added a couple of 0s to the judgment just to punish AC for being so embarrassingly obstinate.
I have no idea what a "Civil Resolution Tribunal" is, but it sounds like small claims court. Here, it would be an adjudicator, not a judge. And not allowed to make punitive judgements.
TheCanuckian likes this.
RangerNS is offline  
Old Feb 15, 2024, 7:21 am
  #9  
 
Join Date: Oct 2013
Location: YOW
Programs: AC SE, FOTSG Platinum
Posts: 5,732
Originally Posted by flyingcrooked
That is a remarkable story. One has to wonder how clueless and morally bankrupt an organization has to be to spend months arguing with this guy, rather than just acknowledging the obvious error on its part and keeping its word to him. I wonder if the good people working for AC get worn down by this; like, doesn't the human being who sent him all the “go pound sand” emails see that the guy is right, and despair at having to perform the grim task of gaslighting him for months?
It's the Disney approach - fight everything, never admit fault - and I agree that it's reprehensible.

One wonders how much they spent trying to avoid paying Mr. Moffatt $650.

On the other hand, if AC wants to stick to the idea that customers are stuck with whatever its chatbots say, I'm sure we can find the airfare equivalent of a $1 truck sooner or later...
m.y, wrp96, kalderlake and 4 others like this.
YOWgary is online now  
Old Feb 15, 2024, 8:35 am
  #10  
 
Join Date: Jan 2019
Posts: 205
Originally Posted by rankourabu
Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions

He should've sued the bot not AC. Clearly.
I'm guessing AC will now be suing the bot to get back the money they were forced to pay.
eqeqeqx is offline  
Old Feb 15, 2024, 10:02 am
  #11  
 
Join Date: Aug 2010
Location: Formerly Box 350, Boston Mass, Oh two one three four. Now near Beverly Hills 90210
Programs: Loyal Order of Water Buffalos
Posts: 3,938
Considering the cost to AC, who thinks there is ANY chance they will update the 'Bot? I expect they have made tens of thousands of dollars (even CAD can add up) from folks who accepted the "fooled ya!" reply. And will continue to do so.
Out of my Element is offline  
Old Feb 15, 2024, 10:05 am
  #12  
 
Join Date: Aug 2013
Location: YVR - MILLS Waypoint (It's the third house on the left)
Programs: AC*SE100K, wood level status in various other programs
Posts: 6,232
Air Canada was trying to play a clever game here.

Effectively, they asserted that the chatbot was acting as an agent, and that the agent lacked authority. In basic contract law, an agent who purports to make a contract on behalf of a principal, but who in fact has no authority to do so, is liable to the other party (in this case, Mr. Moffatt).

For a variety of reasons, the BC CRT adjudicator was having none of that and said that Air Canada had not exercised reasonable care to ensure the information provided by the chatbot was accurate, despite also serving up a link to the actual policy. Sloppiness and laziness on AC's part, which will no doubt be mitigated by some kind of an opt-in disclaimer pop-up in the future. It's interesting that this laziness and sloppiness extended to the hearing itself, as noted by this gem in section 32 of the order:

Air Canada is a sophisticated litigant that should know it is not enough in a legal process to assert that a contract says something without actually providing the contract.
But expect to see more crap just like this in the future, as more and more entities take the easy way out and abdicate responsibility for 'customer service' to AI-based tools.
Bohemian1 is online now  
Old Feb 15, 2024, 10:37 am
  #13  
 
Join Date: Apr 2016
Posts: 616
Originally Posted by rankourabu
Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions

He should've sued the bot not AC. Clearly.
I'm sorry, I did not understand your message. Please try rephrasing your message. Short sentences work best while I'm still learning.
zappy312 is offline  
Old Feb 15, 2024, 10:58 am
  #14  
Moderator, Air Canada; FlyerTalk Evangelist
 
Join Date: Feb 2015
Location: YYC
Programs: AC SE MM, FB Plat, WS Plat, BA Silver, DL GM, Marriott Plat, Hilton Gold, Accor Silver
Posts: 16,778
Originally Posted by Bohemian1
It's interesting that this laziness and sloppiness extended to the hearing itself, as noted by this gem in section 32 of the order
Perhaps AC had the chatbot handle the filings for the case...
uanj, ZenFlyer, YOWgary and 2 others like this.
Adam Smith is offline  
Old Feb 15, 2024, 11:25 am
  #15  
 
Join Date: Jul 2022
Posts: 181
Originally Posted by Bohemian1
Air Canada was trying to play a clever game here.

Effectively, they asserted that the chatbot was acting as an agent, and that the agent lacked authority. In basic contract law, an agent who purports to make a contract on behalf of a principal, but who in fact has no authority to do so, is liable to the other party (in this case, Mr. Moffatt).

For a variety of reasons, the BC CRT adjudicator was having none of that and said that Air Canada had not exercised reasonable care to ensure the information provided by the chatbot was accurate, despite also serving up a link to the actual policy. Sloppiness and laziness on AC's part, which will no doubt be mitigated by some kind of an opt-in disclaimer pop-up in the future. It's interesting that this laziness and sloppiness extended to the hearing itself, as noted by this gem in section 32 of the order:



But expect to see more crap just like this in the future, as more and more entities take the easy way out and abdicate responsibility for 'customer service' to AI-based tools.
A while ago I saw on X some screenshots of an AI chatbot for a car dealership "agreeing" to sell a new vehicle for $1,000, and commenters discussing the agent/principal contract questions and how a judge would interpret it. This one is an interesting case.
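
If companies keep deploying these things, the fix people keep sketching (purely hypothetical here, not any vendor's actual product) is to vet anything the bot "commits to" against a source of truth before it ever reaches the customer, roughly:

Code:
# Hypothetical guardrail sketch: block any dollar figure in a draft reply that
# doesn't match an authoritative price list. Names and prices are invented.
import re

AUTHORITATIVE_PRICES = {"2024 Example Truck": 52000}  # would come from the pricing system

def vet_reply(draft: str) -> str:
    """Refuse to send a reply quoting a price the pricing system doesn't recognize."""
    quoted = [int(m.replace(",", "")) for m in re.findall(r"\$([\d,]+)", draft)]
    if any(q not in AUTHORITATIVE_PRICES.values() for q in quoted):
        return "Let me connect you with a sales representative for pricing."
    return draft

print(vet_reply("Deal! The 2024 Example Truck is yours for $1."))    # blocked
print(vet_reply("The 2024 Example Truck is listed at $52,000."))     # allowed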
bigdog2 is offline  

