Varun Srinivasan pfp
Varun Srinivasan
@v
Is there a credible solution to the LLM hallucination problem? Any interesting research papers or discussions on this?
9 replies
2 recasts
45 reactions

Stephan pfp
Stephan
@stephancill
How prevalent is it in coding applications? Feels like I never experience LLM hallucination but coding is pretty much my only use case
1 reply
0 recast
0 reaction

typeof.eth 🔵 pfp
typeof.eth 🔵
@typeof.eth
I get a lot of hallucinations for Solidity specifically, and the odd hallucination with newer libraries
2 replies
0 recast
1 reaction

Stephan pfp
Stephan
@stephancill
Actually you’re right, I also experience this. I think I’ve got a pretty decent idea of what the AI probably wouldn’t be good at these days, so I just avoid asking it about those things in the first place. Recency bias in my original reply
0 reply
0 recast
2 reactions