gm8xx8
@gm8xx8
Nous Research has released DisTrO, a family of distributed optimizers that drastically reduce inter-GPU communication, enabling efficient training of large neural networks over slow internet connections. It also reduces dependence on any single entity for compute, promoting a more secure and equitable environment for training LLMs. Pre-training is back bby! https://github.com/NousResearch/DisTrO
2 replies
0 recast
4 reactions
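The post above doesn't spell out how DisTrO cuts communication, and the repo's technical details aren't reproduced here. As a rough illustration only, the sketch below shows one well-known technique from the same family: top-k gradient sparsification, where each worker transmits only its largest-magnitude gradient entries before aggregation. The function names and the simulated all-reduce are hypothetical and are not DisTrO's actual method.

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep only the k largest-magnitude entries; zero the rest.
    Shrinks what each worker must send over the network."""
    flat = grad.ravel()
    if k >= flat.size:
        return grad.copy()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape)

def simulated_all_reduce(worker_grads, k):
    """Average sparsified gradients from several simulated workers,
    standing in for a bandwidth-limited all-reduce step."""
    compressed = [top_k_sparsify(g, k) for g in worker_grads]
    return sum(compressed) / len(compressed)

rng = np.random.default_rng(0)
grads = [rng.normal(size=(4, 4)) for _ in range(3)]
avg = simulated_all_reduce(grads, k=4)
# Each worker transmitted only 4 of its 16 gradient values.
```

In real systems this trades some gradient fidelity for far less traffic, often combined with error feedback to recover the dropped components over subsequent steps.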
Burckhard Wesack
@osman76
The DisTrO system from Nous Research enables efficient training over slow connections while promoting security and equity. A great development for decentralized computation.
0 reply
0 recast
0 reaction