AI doesn’t hallucinate — why attributing human traits to tech is users’ biggest pitfall
Updated: 2024-09-19
This year, Air Canada lost a lawsuit brought by a customer who was misled by its AI chatbot into purchasing full-price plane tickets on the assurance that they would later be refunded under the company’s bereavement policy. The airline argued that the bot was “responsible for its own actions.” The court rejected this line of argument, and the company not only had to pay compensation but also drew public criticism for attempting to distance itself from the situation. It’s clear that companies are liable for their AI models, even when those models make mistakes beyond anyone’s control. The rapidly advancing world of AI,…

This story continues at The Next Web
Andrea Hak