Paul Berg
@prberg
True AGI won’t hallucinate like today’s LLMs. Unless, like some humans, it has schizophrenia :)
1 reply
1 recast
2 reactions

Quazia
@quazia
I mean... humans are just outright wrong all the time. Look at how inaccurate eyewitness testimony is. Human thought is inherently fallible. It's almost misleading to call them hallucinations, given that most humans will confidently state something outright wrong as fact given the chance.
1 reply
0 recasts
1 reaction