Justin Hunter
@polluterofminds
“In my opinion this is one of the reasons LLMs, while they already have all of humanity's knowledge in memory, haven't generated any new knowledge by connecting previously unrelated facts.” Is this actually true? https://x.com/thom_wolf/status/1897630495527104932?s=46
3 replies
0 recasts
5 reactions
SQX
@sqx
Or worse. Step-function advances come when probabilistically improbable ideas turn out to be true and become fact. AI will deem them not worth trying.
0 replies
0 recasts
1 reaction