Nat Emodi
@emodi
Saw an impressive demo yesterday of a locally run, fully offline LLM: 13B params, 8GB, running in the browser on an M2 MacBook. In the next couple years we'll see access to human-level intelligence across a lot of the world's information without need for internet
1 reply
0 recasts
2 reactions
π_π
@m-j-r.eth
think of the Mixture of (locally hosted) Experts we'll have as well. right now the trend is near-instantaneous, single-player, 0-shot output; what happens when we revert to async, git-style, multiplayer-edited output for common ideas?
0 reply
0 recasts
1 reaction