That’s something people really have to get into their heads: an “answer” from an LLM is just a series of high-probability tokens. It’s only us humans who read reason and value into it. From the system’s standpoint it’s just numbers without any meaning whatsoever. And no amount of massaging will change that. LLMs are about as “intelligent” as a fancy database query.
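
To make “a series of high-probability tokens” concrete, here is a minimal sketch of the sampling loop at the core of generation. Everything in it is invented for illustration: the toy vocabulary, the hard-coded scores, and the function names. A real LLM computes the logits with a huge neural network, but the final step really is just drawing from a probability distribution:

    import math
    import random

    # Toy "model": maps a context string to raw scores (logits) over a
    # tiny made-up vocabulary. A real LLM computes these with a neural
    # network; the numbers here are hard-coded purely for illustration.
    def toy_logits(context):
        vocab = ["Paris", "London", "banana", "</s>"]
        scores = {"Paris": 4.0, "London": 2.0, "banana": -1.0, "</s>": 0.5}
        return vocab, [scores[t] for t in vocab]

    def sample_next_token(context, temperature=1.0):
        vocab, logits = toy_logits(context)
        # Softmax: turn logits into a probability distribution.
        exps = [math.exp(l / temperature) for l in logits]
        total = sum(exps)
        probs = [e / total for e in exps]
        # "Generation" is nothing more than drawing from this distribution.
        return random.choices(vocab, weights=probs, k=1)[0]

    context = "The capital of France is"
    print(sample_next_token(context))  # usually "Paris": high probability, not understanding

The point of the sketch: nothing in that loop assigns meaning to “Paris”. It is a number that came out on top; the interpretation happens entirely on the reader’s side.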