Because accuracy requires that you make a reasonable distinction between truth and fiction, and that requires context, meaning, understanding. Hell, full humans aren’t that great at this task. This isn’t a small problem; I don’t think you solve it without creating AGI.
Some “AI” LLMs resort to light hallucinations. And then ones like this straight-up gaslight you!
Factual accuracy in LLMs is “an area of active research”, i.e. they haven’t the foggiest how to make them stop spouting nonsense.
DuckDuckGo figured this out quite a while ago: just fucking summarize Wikipedia articles and link to the precise section the text was lifted from.
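For what it’s worth, here’s a rough sketch of that approach using the public MediaWiki action API. The function name, truncation length, and output shape are made up for illustration; this is not how DuckDuckGo actually implements it.

```python
# Sketch: summarize a Wikipedia article and return a link to where the text
# came from. Uses the public MediaWiki action API; everything else
# (grounded_summary, max_chars, the dict shape) is hypothetical.
import requests

API = "https://en.wikipedia.org/w/api.php"

def grounded_summary(title: str, max_chars: int = 500) -> dict:
    """Return a short plain-text extract of `title` plus its source URL."""
    resp = requests.get(API, params={
        "action": "query",
        "prop": "extracts",
        "exintro": 1,       # lead section only
        "explaintext": 1,   # plain text, no HTML
        "titles": title,
        "redirects": 1,
        "format": "json",
    }, timeout=10)
    resp.raise_for_status()
    pages = resp.json()["query"]["pages"]
    page = next(iter(pages.values()))
    extract = page.get("extract", "")[:max_chars]
    # The lead section has no anchor; for a deeper section you would
    # append "#Section_title" to the article URL.
    source = "https://en.wikipedia.org/wiki/" + page["title"].replace(" ", "_")
    return {"summary": extract, "source": source}

if __name__ == "__main__":
    result = grounded_summary("Hallucination (artificial intelligence)")
    print(result["summary"])
    print("Source:", result["source"])
```

No generation step at all: the text shown to the user is lifted verbatim from the cited page, so the citation is exact by construction.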
We can’t fleece investors with that, though; it needs more “AI”.
“Because accuracy requires that you make a reasonable distinction between truth and fiction, and that requires context, meaning, understanding. Hell, full humans aren’t that great at this task. This isn’t a small problem; I don’t think you solve it without creating AGI.”
Ha!
You buy this? (I believe it’s incredibly expensive.)
MFer accidentally got “plum” right and didn’t even know it…