12 results
1. Howse v. Coulton - 2026 BCCRT 153 - 2026-01-29
Small Claims Decisions - Final Decision
I find it likely this was a “hallucination”, where artificial intelligence generated false or misleading results.
2. Greenwood v. The Owners, Strata Plan LMS4102 - 2026 BCCRT 6 - 2026-01-05
Strata Property Decisions - Final Decision
I find these are likely “hallucinations”, meaning false or misleading results generated by artificial intelligence.
3. Obermann v. ICBC - 2025 BCCRT 1759 - 2025-12-19
Small Claims Decisions - Final Decision
I find the most likely explanation is that they are “hallucinations” generated by artificial intelligence. In AQ v. BW, 2025 BCCRT 907 at paragraph 16, a CRT vice chair found that CRT’s obligation to provide sufficient reasons did not require it to address arguments with no basis in law.
4. Heneghan v. The Owners, Strata Plan 187 - 2025 BCCRT 1681 - 2025-12-02
Strata Property Decisions - Final Decision
I find it likely that these are “hallucinations”, meaning false results generated by artificial intelligence. 33. Finally, Ms. Heneghan says the strata breached section 20(4) of the SPA’s Standard Bylaws.
5. Obermann v. Spring Financial Inc. - 2025 BCCRT 1669 - 2025-11-28
Small Claims Decisions - Final Decision
I find it likely that these are “hallucinations” where artificial intelligence generates false or misleading results.
6. Lockwood v. ICBC - 2025 BCCRT 1227 - 2025-09-03
Accident Benefits - Final Decision
40. Ms. Lockwood says pharmaceutical interventions for her anxiety and depression caused significant adverse effects, such as hallucinations and worsening psychological symptoms. She says that as a result, she pursued naturopathic treatment focused on stress reduction. [...] Ms. Lockwood responded by citing non-existent or inapplicable sections of regulations, likely a result of relying on artificial intelligence, as she did in her confusing reply submissions in this dispute.
7. Maxwell v. WestJet Airlines Ltd. - 2025 BCCRT 1146 - 2025-08-15
Small Claims Decisions - Final Decision
I find it likely that the chatbot “hallucinated” this case. The BC Supreme Court and the CRT have discussed the risks of relying on generative artificial intelligence tools, as the output is often inaccurate.
8. Hakemi v. ICBC - 2025 BCCRT 1035 - 2025-07-24
Accident Benefits - Final Decision
I infer these are hallucinations created by artificial intelligence. 13. Mr. Hakemi brought this dispute as an accident benefits claim under CRTA section 133.
9. Blaser v. Campbell - 2025 BCCRT 962 - 2025-07-15
Small Claims Decisions - Final Decision
I find it likely they are “hallucinations”, where artificial intelligence generates false or misleading results. [...] The courts, and previous CRT decisions, have discussed the inherent risk of relying on unregulated generative artificial intelligence tools, and caution that using such tools is not a substitute for professional advice (see, for example, Zhang v. Chen, 2024 BCSC 285, Floryan v. Luke et al., 2023 ONSC 5108, and Yang v. Gibbs
10. AQ v. BW - 2025 BCCRT 907 - 2025-07-04
Intimate Images - Final Decision - Under Judicial Review
Other issues the parties raised had no legal basis at all and appear to have been conjured by artificial intelligence. An example is that both parties rely on a hallucinated version of CRTA section 92. [...] I do not consider that obligation to include responding to arguments concocted by artificial intelligence that have no basis in law. I accept that artificial intelligence can be a useful tool to help people find the right language to present their arguments, if used properly. [...] However, people who blindly use artificial intelligence often end up bombarding the CRT with endless legal arguments.
11. Ng v. ICBC - 2025 BCCRT 708 - 2025-05-28
Accident Benefits - Final Decision
37. In any event, generative artificial intelligence, such as ChatGPT, is not so intrinsically reliable that I am prepared to accept it as evidence. [...] For example, in one recent CRT decision, a party submitted conversations they had with Microsoft CoPilot, a program similar to ChatGPT, that seemingly “hallucinated” cases that did not exist.[2] While the applicant here is relying on ChatGPT for a medical definition instead of a legal precedent, I have the same underlying
12. Geismayr v. The Owners, Strata Plan KAS 1970 - 2025 BCCRT 217 - 2025-02-14
Strata Property Decisions - Final Decision
The Geismayrs listed the source of these cases as a “Conversation with Copilot”, which is an artificial intelligence chatbot. I find it likely that these cases are “hallucinations” where artificial intelligence generates false or misleading results.