AusaR🎩 pfp
AusaR🎩
@ausar
Stopping an LLM from hallucinating is harder than I thought. I suppose "if (going_to_hallucinate) then return false;" doesn't work, right?
2 replies
0 recasts
3 reactions

Leeward Bound pfp
Leeward Bound
@leewardbound
Unironically, telling it "if unsure, don't hallucinate, say you don't know" actually makes a meaningful impact with many models.
0 replies
0 recasts
1 reaction
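
For anyone who wants to try this, here is a minimal sketch of the prompt-level trick described above, assuming the OpenAI Python SDK; the model name, the exact prompt wording, and the ask helper are illustrative choices, not something from the thread:

```python
# Minimal sketch: give the model explicit permission to decline
# instead of guessing. Assumes OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "Answer only from information you are confident about. "
    "If you are unsure, do not guess or make anything up; "
    "reply exactly: \"I don't know.\""
)

def ask(question: str) -> str:
    """Send a question with the anti-hallucination instruction prepended."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
        temperature=0,  # lower temperature also tends to reduce confabulation
    )
    return response.choices[0].message.content

print(ask("Who won the 2031 World Cup?"))  # ideally comes back "I don't know."
```

The instruction doesn't eliminate hallucination, but it gives the model a sanctioned way to decline instead of inventing an answer, which is plausibly why it helps with many models.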