orbulo.eth
@orbulo
we're on the brink of a new era in AI and web3. decentralized inference is no longer a dream, but it must be made accessible to everyone. let's democratize access to AI together: this is the DCI Network, our latest R&D project. https://mirror.xyz/orbulo.eth/svv4yQCUHiCY1vj4RYlYF7IpWMYtLc1ItjtXIr-D9t0
2 replies
3 recasts
33 reactions

Mo
@meb
Is this grid computing all over again? Also, given the multiple nodes, doesn’t this make the model fundamentally uncompetitive vs using a single centralised server, both due to latency between nodes and duplication of calculations between nodes in individual layers? Interesting read, I’m just puzzled by the questions above.
1 reply
0 recast
1 reaction

orbulo.eth
@orbulo
everything decentralized that involves computation can be considered grid computing, and almost everything is uncompetitive vs a single centralized server. an interesting point is that both of your questions could also be directed at blockchains: at the end of the day it's just a matter of what users want in terms of decentralization, privacy and costs. and like blockchains, latency can be improved over time, as can any redundant computation. this network is for people who want decentralized inference and its perks, not for everyone.
1 reply
0 recast
0 reaction

Mo
@meb
Thanks for sharing! Given what you know about the field, who are the people that want / need decentralized inference, and what are some unexpected / emergent use cases this gives rise to?
1 reply
0 recast
0 reaction

orbulo.eth
@orbulo
imagine someone develops something like dappnode for this network, so you get a low-cost physical device that you can call to perform inference on some models. a doctor working in their practice could use this device to run inference on LLMs (e.g. a 405B model) trained on medical data that requires high levels of privacy and low costs. this is unthinkable today, as the costs are insane, not to mention the privacy issues. this is just an example
1 reply
0 recast
0 reaction
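The privacy angle in the example above comes from sharding the model across nodes. A minimal toy sketch (hypothetical, not the actual DCI protocol), assuming a pipeline-parallel split where each node holds only its own slice of the layers and sees only intermediate activations, never the raw weights of the rest of the model:

```python
# Toy pipeline-parallel inference: the model's layers are sharded
# across nodes, so no single node holds the full weights, and only
# the first node ever sees the user's raw input.
import numpy as np

rng = np.random.default_rng(0)

class Node:
    """Holds only its own shard of layers and applies them in order."""
    def __init__(self, layers):
        self.layers = layers  # this node never sees the other shards

    def forward(self, activations):
        for w in self.layers:
            activations = np.tanh(activations @ w)  # toy "layer"
        return activations

# A toy 6-layer model split across 3 nodes, 2 layers each.
dim = 8
all_layers = [rng.normal(size=(dim, dim)) for _ in range(6)]
nodes = [Node(all_layers[i:i + 2]) for i in range(0, 6, 2)]

x = rng.normal(size=(1, dim))  # user input: only node 0 receives it
h = x
for node in nodes:             # each hop forwards intermediate activations
    h = node.forward(h)

print(h.shape)  # (1, 8): final activations from the last node
```

Downstream nodes only ever receive the nonlinear intermediate activations, which is the "partial data access" property discussed in this thread; a real network would add routing, verification and incentives on top.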

Mo
@meb
Makes sense! I’m often suspicious of technology for its own sake, but the paper was so well written I thought there must be a reason very smart people are spending so much time on this. One thing I like is that the decentralized nature of the inference opens up a whole new universe of privacy-focused inference, given that a single node has only partial data access. Even if it’s 100x more expensive than an OpenAI API call, that’s at most a few dollars vs hiring a radiographer. Very interesting primitives unlocked. Your example raises the idea of a network of networks of specialised inference models for high-value tasks.
0 reply
0 recast
0 reaction