ciefa 🐌 eth/acc pfp
ciefa 🐌 eth/acc
@ciefa.eth
Is anyone running their own local AI server? If yes, what hardware do you use? I would like to run LLMs locally, but not on my main machine, so I'm looking to buy one for this. Any tips?
0 reply
0 recast
41 reactions

Maks pfp
Maks
@maximus007
Why do you need to run it locally? You can use public cloud providers, select any model you want to run, and pay for compute resources only. It's much easier. E.g. Amazon Bedrock
1 reply
0 recast
0 reaction

ciefa 🐌 eth/acc pfp
ciefa 🐌 eth/acc
@ciefa.eth
I want to run it locally
0 reply
0 recast
0 reaction