Varun Srinivasan pfp
Varun Srinivasan
@v
Is there a credible solution to the LLM hallucination problem? Any interesting research papers or discussions on this?
9 replies
2 recasts
45 reactions

Stephan pfp
Stephan
@stephancill
How prevalent is it in coding applications? Feels like I never experience LLM hallucinations, but coding is pretty much my only use case
1 reply
0 recast
0 reaction

typeof.eth πŸ”΅ pfp
typeof.eth πŸ”΅
@typeof.eth
I get a lot of hallucinations for Solidity specifically, and the odd one with newer libraries
2 replies
0 recast
1 reaction

typeof.eth πŸ”΅ pfp
typeof.eth πŸ”΅
@typeof.eth
Like, this isn’t how this works at all.
[screenshot: Cursor IDE]
1 reply
0 recast
2 reactions

Dean Pierce πŸ‘¨β€πŸ’»πŸŒŽπŸŒ pfp
Dean Pierce πŸ‘¨β€πŸ’»πŸŒŽπŸŒ
@deanpierce.eth
I have triaged ChatGPT-generated bug reports on Immunefi for SQL injections in the middle of Solidity smart contracts 😄 I do really like auditing code with Cursor, though; being able to feed the entire codebase into the context window and ask questions is super useful. It's like working with a hyper-observant five-year-old.
0 reply
0 recast
1 reaction
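
The workflow Dean describes above is easy to approximate outside Cursor. The sketch below is a minimal, hypothetical version: concatenate the source files into one prompt and ask a large-context model audit questions. It uses the OpenAI Python client; the model name, file extensions, and system prompt are illustrative assumptions, not anything from the thread or from Cursor's implementation.

```python
# Minimal sketch of "feed the entire codebase into the context window and
# ask questions" -- NOT Cursor's implementation. Assumes the `openai`
# Python package (v1+) and an OPENAI_API_KEY in the environment; the model
# name and file extensions are illustrative choices.
from pathlib import Path

from openai import OpenAI


def load_codebase(root: str, exts=(".sol", ".py", ".ts")) -> str:
    """Concatenate every matching source file under `root` into one string."""
    parts = []
    for path in sorted(Path(root).rglob("*")):
        if path.is_file() and path.suffix in exts:
            parts.append(f"// FILE: {path}\n{path.read_text(errors='ignore')}")
    return "\n\n".join(parts)


def ask_about_codebase(root: str, question: str) -> str:
    """Put the whole codebase in the system prompt, then ask one question."""
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o",  # any large-context model works here
        messages=[
            {
                "role": "system",
                "content": "You are auditing this codebase:\n\n" + load_codebase(root),
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(ask_about_codebase(".", "Where could reentrancy occur, and why?"))
```

Per the thread's caveat, answers from this kind of setup still need human triage: the model is a fast reader of the whole repo, not a trustworthy auditor.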