Nat Emodi
@emodi
Saw an impressive demo yesterday of a locally run, fully offline LLM: 13B params in about 8 GB, running in the browser on an M2 MacBook. In the next couple of years we'll see access to human-level intelligence across a lot of the world's information without needing an internet connection.
1 reply
0 recasts
2 reactions
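A 13B-parameter model fits in roughly 8 GB only when quantized to around 4 bits per weight (13e9 weights × ~0.5 bytes ≈ 6.5 GB, plus KV cache and runtime overhead). A minimal sketch of the same idea running fully offline, assuming a llama-cpp-python setup and a hypothetical quantized GGUF checkpoint rather than the browser runtime from the demo:

```python
# Sketch: run a ~8 GB, 4-bit-quantized 13B model fully offline with llama-cpp-python.
# The model filename is hypothetical; any Q4 GGUF checkpoint of a 13B model works the same way.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-2-13b-chat.Q4_K_M.gguf",  # ~7-8 GB on disk
    n_ctx=2048,       # context window; the KV cache grows with this
    n_gpu_layers=-1,  # offload all layers to the GPU (Metal on Apple silicon builds)
)

out = llm(
    "Summarize why a 13B-parameter model fits in about 8 GB at 4-bit quantization.",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```

Browser-based runtimes such as WebLLM take similarly quantized weights and run them through WebGPU, which is presumably how the in-browser M2 MacBook demo worked.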

𒂭_𒂭
@m-j-r.eth
Think of the Mixture of (locally hosted) Experts we'll have as well. Right now the trend is near-instantaneous, single-player, 0-shot output; what happens when we revert to async, git-style, multiplayer-edited output for common ideas?
0 replies
0 recasts
1 reaction