Q. What is hallucination (in models like ChatGPT)?

Hallucination is the term used to describe situations where models like ChatGPT present false information as if it were true. Even though the AI may sound very confident, sometimes the answers it gives are just plain wrong.

Why does this happen? AI tools like ChatGPT are trained to predict what words should come next in the conversation you are having with them. They are really good at putting together sentences that sound plausible and realistic.

However, these AI models don't understand the meaning behind the words. They lack the logical reasoning to tell if what they are saying actually makes sense or is factually correct. They were never designed to be search engines. Instead they might be thought of as “wordsmiths”—tools for summarizing, outlining, brainstorming, and the like.
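To make the idea of next-word prediction concrete, here is a tiny, purely illustrative Python sketch. The words and probabilities are invented for this example; a real model works the same way in spirit, but over an enormous vocabulary, one small piece of a word at a time.

    # A toy "language model": for a given prompt it only knows how
    # likely a few candidate next words are.
    next_word_probabilities = {
        "The capital of Australia is": {
            "Canberra": 0.55,
            "Sydney": 0.35,    # plausible-sounding, but wrong
            "Melbourne": 0.10,
        }
    }

    def predict_next_word(prompt):
        candidates = next_word_probabilities[prompt]
        # Pick whichever word the model considers most likely to come next.
        # Nothing in this step checks whether that word is actually true.
        return max(candidates, key=candidates.get)

    print(predict_next_word("The capital of Australia is"))  # prints "Canberra"

The model is only choosing the statistically likeliest continuation. If a wrong word happened to have the higher probability, it would be stated just as confidently.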

So we can't blindly trust that everything they say is accurate, even if it sounds convincing. It's always a good idea to double-check important information against other reliable sources.

Here’s a tip: Models that are grounded in an external source of information (like web search results) hallucinate less often. That’s because the model searches for relevant web pages, summarizes the results, and links to the pages that each part of the answer came from. This makes it easier to fact-check the result.

Examples of grounded models are Microsoft Copilot, Perplexity, and ChatGPT Plus (the paid version). 
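To give a rough idea of how grounding works, here is a minimal Python sketch. The search_web and ask_model functions are hypothetical stand-ins, not the actual API of Copilot, Perplexity, or ChatGPT Plus.

    # Hypothetical stand-ins for a real web-search service and a real AI model.
    def search_web(question):
        return [{"url": "https://example.org/australia",
                 "text": "Canberra is the capital city of Australia."}]

    def ask_model(prompt):
        return "Canberra is the capital of Australia [source 1]."

    def answer_with_sources(question):
        # 1. Find relevant pages first.
        pages = search_web(question)

        # 2. Give the model the question AND the retrieved text, and ask it
        #    to cite which source each part of its answer came from.
        sources = "\n\n".join(page["text"] for page in pages)
        prompt = ("Answer the question using only the sources below, and cite "
                  "the source for each claim.\n\n"
                  "Sources:\n" + sources + "\n\nQuestion: " + question)
        answer = ask_model(prompt)

        # 3. Return the answer together with the links, so a reader can click
        #    through and fact-check each part of it.
        return answer, [page["url"] for page in pages]

    print(answer_with_sources("What is the capital of Australia?"))

Because the answer is tied back to specific pages, an unsupported claim is much easier to spot than it would be with an ungrounded model.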

Learn more

  • How can I fact check the information that ChatGPT and other language models give me?
  • I can’t find the citations that ChatGPT gave me. What should I do?
  • Which AI tool for your task?
Answered By: Ellen Hampton Filgo
Last Updated: Jun 07, 2024
