Originally Posted by bobbytables
Indeed, they are evolving very fast. Still, it's useful to understand why hallucinations happen. Generally, the risk of hallucination is higher when some combination of missing relevant input and highly constrained output leaves few non-hallucinated possibilities for generation. In this case it's the former: the model has nowhere to go to find the flights you're looking for, since publicly and freely accessible real-time flight search APIs don't exist. That's why computer use is the key to enabling this kind of use case.
To be fair, ChatGPT isn't reporting specific flight numbers for the referenced prompt, so it isn't hallucinating them; it's just reporting general availability and "starting from" pricing. And Google Flights and ITA Matrix are freely accessible and real-time, aren't they?