ChatGPT, Bard, GPT-4, and the like are often pitched as ways to retrieve information. The problem is they'll "retrieve" whatever you ask for, whether or not it exists.
Tumblr user @indigofoxpaws sent me a few screenshots where they'd asked ChatGPT for an explanation of the nonexistent "Linoleum harvest" Tumblr meme,
I didn’t mean to argue against the usefulness of LLMs entirely; they absolutely have their place. I was referring more to how everyone and their dog are building AI assistants for tasks that demand accurate data, without addressing how easily these models present you with bad data in total confidence.