ciefa 🐌 eth/acc pfp
ciefa 🐌 eth/acc
@ciefa.eth
Is anyone running their own local AI server? If so, what hardware do you use? I would like to run LLMs locally, but not on my main machine, so I'm looking to buy one for this. Any tips?
0 reply
0 recast
41 reactions

hellno the optimist pfp
hellno the optimist
@hellno.eth
running some common models using the ollama desktop app https://ollama.com/
1 reply
0 recast
1 reaction

ciefa 🐌 eth/acc pfp
ciefa 🐌 eth/acc
@ciefa.eth
Yea that's my plan, but I don't wanna run it locally on my PC — I want a dedicated local machine that acts as a server
0 reply
1 recast
1 reaction
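
For the dedicated-server setup discussed above, a minimal sketch using ollama: by default it only listens on localhost, but setting the documented `OLLAMA_HOST` environment variable binds it to all interfaces so other machines on the LAN can reach it. The hostname `llm-box` and the model choice are placeholders, not from the thread.

```shell
# On the dedicated server: bind ollama to all interfaces (default port 11434)
# instead of localhost only, so LAN clients can connect.
OLLAMA_HOST=0.0.0.0 ollama serve &

# Pull a model once on the server (model name is an example, not prescribed).
ollama pull llama3

# From your main machine: query the server's REST API over the LAN.
# "llm-box" is a hypothetical hostname for the dedicated server.
curl http://llm-box:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

The same `OLLAMA_HOST` variable, pointed at the server (e.g. `OLLAMA_HOST=llm-box:11434 ollama run llama3`), lets the `ollama` CLI on a client machine talk to the remote instance instead of a local one.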