The Digital Diet

Fact or Fiction: Debunking the Misinformation in ChatGPT’s Hallucinations

Update: 2023-12-12

Description

In this episode, I talk about:

  • The phenomenon of AI hallucinations: false or misleading information presented as fact
  • How and why AI hallucinations occur with platforms such as ChatGPT
  • Six hilarious and scary examples of ChatGPT hallucinations, including:
    • Lying about how many people survived the sinking of the Titanic
    • Fabricating scientific references to support the idea that cheese is bad for your health
    • Writing a New York Times opinion piece about why mayonnaise is racist
    • Making up a historical French King
    • Writing a rave review for the ill-fated Fyre Festival
    • Inventing a world record for a man walking on water
  • Why you should still fact-check ChatGPT’s responses, despite improvements in the AI’s accuracy

View show notes including links to all the resources and tools mentioned: https://thedigitaldietcoach.com/025


Get your free #TechTimeout Challenge 30-Day Digital Detox Guide

 

Keep in touch with me:

 

----------------------------------------

Music by FASSounds

Marisha Pink