Originally Posted by bobbytables
ChatGPT will generally just hallucinate flight options if you ask it to make a search, or it will give the options it can find in Google web search results.
Far less true than a year ago, and improving every day. You can also explicitly tell the AI "Do not hallucinate and do not make information up" in your prompt, which can further reduce how often this happens.
Most people don't realize just how quickly ChatGPT, Claude, and others are evolving.