Content
chompk
@chompk
Imho, many LLMs suffer from hallucinations, and it is indeed an unsolved problem that hinders LLM capabilities on the path towards AGI. Just curious about your thoughts on:
- Will we solve hallucination within a 5-year span?
- Do you think hallucination is caused by the AR-LLM design?
1 reply
0 recast
0 reaction
chompk
@chompk
My take is that this problem will persist for more than 5 years. On the second question, I have two opinions:
- AR-LLMs can mitigate hallucination with sophisticated decoding methods (meta decoding, etc.). I assume that hallucination is caused by non-optimal decoded sequences (see the sketch below)
- AR-LLMs are flawed, as Yann said
0 reply
0 recast
0 reaction
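To make the first opinion concrete, here is a minimal, hedged sketch of "mitigating hallucination via decoding": sample N candidate continuations and rerank them by mean token log-probability, on the assumption that hallucinations tend to appear in low-confidence (non-optimal) decoded sequences. This is not chompk's actual method; the checkpoint, prompt, and scoring rule are placeholder assumptions.

```python
# Best-of-N reranking sketch: sample several continuations from a causal LM
# and keep the one whose generated tokens have the highest mean log-probability.
# Assumption: hallucinated outputs often correspond to low-confidence sequences.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder causal LM, not the model under discussion
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "The capital of Australia is"
inputs = tok(prompt, return_tensors="pt")

# Draw several samples instead of committing to a single greedy path.
out = model.generate(
    **inputs,
    do_sample=True,
    top_p=0.9,
    num_return_sequences=8,
    max_new_tokens=20,
    return_dict_in_generate=True,
    output_scores=True,
    pad_token_id=tok.eos_token_id,
)

# Log-probability of each generated token under the sampling distribution.
transition_scores = model.compute_transition_scores(
    out.sequences, out.scores, normalize_logits=True
)

# Rerank by mean log-prob over real (non-padding) generated tokens.
gen_tokens = out.sequences[:, inputs["input_ids"].shape[1]:]
mask = gen_tokens != tok.eos_token_id  # eos doubles as pad for gpt2
scores_masked = torch.where(
    mask, transition_scores, torch.zeros_like(transition_scores)
)
mean_logprob = scores_masked.sum(1) / mask.sum(1).clamp(min=1)

best = int(mean_logprob.argmax())
print(tok.decode(out.sequences[best], skip_special_tokens=True))
```

A more elaborate "meta decoding" scheme would swap the mean-log-prob score for cross-sample consistency checks or an external verifier, but the sample-then-rerank skeleton stays the same.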