Sector 6 | The Newsletter of AIM

Hallucinations Acche Hain
The Belamy

Analytics India Magazine
Sep 18, 2023
Three reasons why we shouldn’t use ChatGPT, Bard, or any other LLM-based chatbot: hallucinations, hallucinations, and hallucinations. And one of the most compelling reasons to use these platforms? Also hallucinations. Interestingly, trust is the only factor that makes hallucinations risky; otherwise we’d say hallucinations acche hain (hallucinations are good), since they make the models more creative.

Sebastian Berns, a doctoral researcher at Queen Mary University of London, is a big proponent of this quirk of chatbots that others abhor. He likes these chatbots precisely because they hallucinate, which turns them into valuable “co-creative partners”.
