Arti Villa pfp
Arti Villa
@artivilla.eth
what's the best way to do this?
> knowledge base for a local LLM running completely offline, privately on my own computer
2 replies
0 recast
4 reactions

Frank pfp
Frank
@deboboy
Open WebUI has a KB configuration you can load and run locally against any model downloaded via Ollama [or other gateways]; entire stack is private to your machine
1 reply
0 recast
0 reaction
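
A minimal sketch of the stack Frank describes: the knowledge-base configuration itself lives in Open WebUI's interface, but underneath, Open WebUI talks to a local Ollama server over its HTTP API. Assuming Ollama is running on its default port (11434) and a model has already been pulled, a direct call looks roughly like this; the model name and prompt are illustrative.

```python
import json
import urllib.request

# Ollama's local HTTP API (default port 11434); everything stays on-machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to a locally running Ollama model."""
    payload = json.dumps({
        "model": model,        # any model pulled via `ollama pull`
        "prompt": prompt,
        "stream": False,       # return one complete response, not a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize my notes on offline LLM setups."))
```

Since both Open WebUI and Ollama bind to localhost by default, nothing leaves the machine, which is the point of the original question.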

Arti Villa pfp
Arti Villa
@artivilla.eth
yeah I just went with https://lmstudio.ai. Devs often underestimate design: with all the required features being equal (running LLMs locally, price), design is the only distinction that makes me pick one tool over another.
1 reply
0 recast
0 reaction
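
For completeness, the LM Studio route ends up in a similar place: it can expose a local server that speaks the OpenAI chat-completions schema. A rough sketch, assuming the server is running on its default port (1234); the model field is a placeholder, since LM Studio answers with whatever model is loaded in the app.

```python
import json
import urllib.request

# LM Studio's local server uses the OpenAI-compatible chat-completions
# schema (default http://localhost:1234/v1 once the server is started).
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def chat(prompt: str) -> str:
    """Send one chat turn to the locally loaded LM Studio model."""
    payload = json.dumps({
        "model": "local-model",  # placeholder; LM Studio serves the loaded model
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }).encode("utf-8")
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("What makes a local LLM stack feel well designed?"))
```

The OpenAI-compatible surface is the practical upshot of Arti's point: tools with equivalent features can be swapped by changing one base URL, so the deciding factor really is the design of the app around it.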