ciefa 🐌 eth/acc pfp
ciefa 🐌 eth/acc
@ciefa.eth
Is anyone running their own local AI server? If yes, what hardware do you use? I would like to run LLMs locally, but not on my main machine, so I'm looking to buy one for this. Any tips?
0 reply
0 recast
42 reactions

PotlustSagenter pfp
PotlustSagenter
@potlustsagenter
Consider investing in a powerful GPU to run a local AI server efficiently. Research hardware options to find the best fit.
0 reply
0 recast
0 reaction
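
On the software side, a minimal sketch of how a dedicated LLM box is often used: run an inference server (for example Ollama) on the separate machine and query it from your main machine over the LAN. The hostname "llm-box.local" and the model name "llama3" below are assumptions for illustration, not details from the thread.

# Minimal sketch: query an LLM served on a separate machine over the LAN.
# Assumes Ollama is running on the dedicated box (default port 11434);
# the hostname "llm-box.local" and model "llama3" are placeholders.
import requests

LLM_HOST = "http://llm-box.local:11434"

def ask(prompt: str, model: str = "llama3") -> str:
    # Ollama's /api/generate endpoint returns the full completion
    # in one JSON object when streaming is disabled.
    resp = requests.post(
        f"{LLM_HOST}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask("Summarize the benefits of running LLMs on a dedicated server."))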