Varun Srinivasan
@v
Is there a credible solution to the LLM hallucination problem? Any interesting research papers or discussions on this?
9 replies
2 recasts
45 reactions
Stephan
@stephancill
How prevalent is it in coding applications? Feels like I never experience LLM hallucination, but coding is pretty much my only use case
1 reply
0 recast
0 reaction
typeof.eth
@typeof.eth
I get a lot of hallucinations for Solidity specifically, and the odd hallucination with newer libraries
2 replies
0 recast
1 reaction
typeof.eth
@typeof.eth
Like, this isn't how this works at all. Cursor IDE
1 reply
0 recast
2 reactions
Dean Pierce
@deanpierce.eth
I have triaged ChatGPT-generated bug reports on Immunefi claiming SQL injections in the middle of Solidity smart contracts. I do really like auditing code with Cursor though; being able to feed the entire codebase into the context window and ask questions is super useful. It's like working with a hyper-observant five-year-old.
0 reply
0 recast
1 reaction