
Chatbots and contract law – liability for inaccurate information in light of Moffatt v. Air Canada and the position under Swiss law

The emergence of new technologies, including the widespread use of artificial intelligence in everyday life, has significantly affected the legal landscape. The use of chatbots in online contracting, coupled with the now well-known phenomenon of artificial intelligence hallucinations – incorrect or misleading output produced by AI – has resulted in novel challenges, notably in the realm of contract law. As a result, courts and tribunals must adapt to this new reality and grapple with issues such as determining liability where a customer contracts on the basis of inaccurate information provided by a chatbot on a company’s website.

On 14 February 2024, the Civil Resolution Tribunal (“CRT”) of British Columbia issued a decision in the Moffatt v. Air Canada case [1]. In essence, it considered that Air Canada was liable for the misleading information provided by its chatbot to a customer, Mr Moffatt. As a result, Air Canada was required to apply the relevant policy in the way the chatbot presented it to him.

A growing number of companies have begun using chatbots on their websites, with the aim of improving the user experience and increasing customer engagement. While this is advantageous in many respects, it is important to highlight the possible legal implications. We will briefly look at the above-referenced decision and the findings of the CRT (Section 1). We will then consider how a Swiss tribunal might analyse a similar situation (Section 2) and provide practical recommendations which our clients can consider implementing (Section 3).

1. The decision in Moffatt v. Air Canada [2]

In November 2022, Mr Moffatt, who lived in Vancouver, was looking at travel options to attend the funeral of a relative in Toronto. He inquired about Air Canada’s bereavement policy, which enables passengers travelling due to the death of an immediate family member to benefit from reduced fares. The Air Canada chatbot informed him that he could book his flight and then submit a reimbursement request within 90 days of the purchase.

Mr Moffatt, relying on the indications provided by the chatbot, booked his flights and applied for a partial refund upon his return to Vancouver. Air Canada refused to process the refund as its bereavement policy does not allow requests to be submitted after travel has been completed.

The CRT considered that Air Canada had made a negligent misrepresentation [3]. As a service provider, Air Canada owed a duty of care to Mr Moffatt, the consumer, under Canadian law [4]. In this respect, Air Canada was obliged to take reasonable care to ensure that its representations were accurate and not misleading [5].

One of Air Canada’s notable counterarguments was its suggestion that the chatbot was a separate legal entity, responsible for its own actions [6]. The CRT noted in response that Air Canada is responsible for all the information on its website, whether it comes from a static page or a chatbot [7]. It further considered that Air Canada had not taken reasonable care to ensure that the chatbot was accurate, and agreed with Mr Moffatt that a customer should not be expected to double-check information found in one part of the company’s website against another [8].

In light of these considerations, the CRT awarded damages to Mr Moffatt amounting to the difference between the full price paid by him and the hypothetical bereavement fare [9].

2. Swiss law and chatbots

Swiss tribunals may well arrive at the same conclusion as British Columbia’s CRT. Indeed, while the Swiss Federal Supreme Court has – to the best of our knowledge – not yet been called upon to rule on this issue, information provided by chatbots is likely to be considered as directly controlled by the legal entity that set them up or, in any event, attributable to that entity, including where the chatbot’s indications prove inaccurate [10]. One caveat is that the customer must not have deliberately tricked the chatbot into providing inaccurate information in bad faith [11].

Depending on the facts of the case at hand, a Swiss tribunal may consider that a contract was not validly entered into in the absence of a meeting of the minds. There may likewise be a discussion regarding the valid incorporation of general terms and conditions. Alternatively, and subject to all conditions being fulfilled, Mr Moffatt, or another customer as the case may be, might rely on a defect in consent, such as a fundamental error.

3. Practical implications

As people gravitate towards the path of least resistance, easy access to information via a chatbot on a company website is highly appealing. As a result, and given chatbots’ potential to save resources by reducing the need for human customer support, an increasing number of companies are implementing them.

Nevertheless, in trying to serve their clients, companies must ensure that they do not put themselves in situations that mislead customers and result in costly and time-consuming legal proceedings. Accurate programming and set-up of the tools used is key. This includes identifying and implementing trigger words or questions that require human intervention so as to avoid generating inaccurate information (as sketched below), as well as regular testing.
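By way of illustration only, the following minimal sketch shows one way such a trigger-word escalation could look in practice. It is written in Python, and every name in it (ESCALATION_TRIGGERS, handle_message and so on), as well as the sample trigger list, is a hypothetical placeholder of ours rather than a feature of any particular chatbot product:

    # Illustrative sketch of a trigger-word guardrail: legally sensitive
    # questions are routed to a human agent instead of being answered by
    # the chatbot. All names and triggers here are hypothetical examples.

    ESCALATION_TRIGGERS = (
        "refund",
        "bereavement",
        "compensation",
        "cancellation",
    )

    def requires_human(message: str) -> bool:
        """Return True if the message touches a topic the bot must not answer."""
        text = message.lower()
        return any(trigger in text for trigger in ESCALATION_TRIGGERS)

    def escalate_to_human(message: str) -> str:
        # Placeholder: a real system would open a ticket or start a live chat.
        return "We have forwarded your question to our support team."

    def bot_reply(message: str) -> str:
        # Placeholder for the chatbot's own answer generation.
        return "Automated answer to: " + message

    def handle_message(message: str) -> str:
        # Escalate rather than risk a misleading automated representation.
        if requires_human(message):
            return escalate_to_human(message)
        return bot_reply(message)

    print(handle_message("How do I claim the bereavement fare refund?"))
    # -> escalated, because "bereavement" and "refund" are trigger words

Which topics belong on the trigger list, and where the escalation leads (a ticket, a live agent or a verified policy page), will naturally depend on the policies most likely to give rise to disputes in a given business.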

In Moffatt v. Air Canada, the economic damage the defendant was ultimately required to pay was rather low. However, given the growing use of chatbots and the prevalence of online contracting, we may see more severe impacts in the future. Additionally, the risk of non-economic damage, including reputational damage, should not be taken lightly.

From a legal perspective, it is crucial to have appropriate safeguards in place, such as relevant disclaimers and limitation of liability clauses, and to comply with any other applicable legal requirements, including data protection considerations. Finally, companies operating on a global scale will need to pay particular regard to private international law rules, including consumer protection legislation, and also consider appropriate dispute resolution clauses, including online dispute resolution and/or alternative dispute resolution options.

If you wish to discuss any of the above, please do not hesitate to get in touch with the team.

[1] Moffatt v. Air Canada, 2024 BCCRT 149.
[2] The CRT is an online tribunal within the British Columbia public justice system. CRT decisions and orders are enforceable in court per Part 6 of the Civil Resolution Tribunal Act (“CRTA”), but do not create a legal precedent binding on the CRT or other tribunals.
[3] Moffatt v. Air Canada, para. 24 et seq.
[4] Ibid., para. 26.
[5] Ibid. in fine.
[6] Ibid., para. 27.
[7] Ibid.
[8] Ibid., para. 28.
[9] Ibid., para. 33 et seq.
[10] Grégoire Geissbühler/Michel José Reymond, Bots, spam et no-reply: une théorie de la réception à l’âge d’internet, ZSR 2022/1, p. 358.
[11] By way of example, in late 2023, a GM dealer chatbot agreed to sell a Chevy Tahoe worth over USD 75,000 for USD 1 after it was tricked into doing so by a hacker. Based on publicly available information, this case appears to have been a prank and is unlikely to end up in court. However, similar situations may arise in the future and will potentially be dealt with differently in different jurisdictions.
