I'm not so sure. LLMs don't have logical reasoning as we understand it, and don't have any grounding in the external world. As such, LLMs are well known to make up plausible nonsense (aka bullshit) at the drop of a hat, and, if you feed them sufficient amounts of nonsense, they will generate nonsense back at you.