Arti Villa
@artivilla.eth
what's the best way to do this?
> knowledge base for a local LLM running completely offline, privately on my own computer

Frank
@deboboy
Open WebUI has a knowledge-base configuration you can load and run locally against any model downloaded via Ollama [or other gateways]; the entire stack is private to your machine
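For context, Open WebUI's knowledge-base feature is retrieval-augmented generation over your own documents, served against a local Ollama instance. A minimal sketch of that loop, assuming Ollama is running on its default localhost:11434 port with an embedding model and a chat model already pulled (the model names nomic-embed-text and llama3 are examples, not something named in this thread):

```python
import json
import urllib.request

OLLAMA = "http://localhost:11434"  # Ollama's default local port; nothing leaves the machine

def post(path, payload):
    req = urllib.request.Request(
        OLLAMA + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def embed(text):
    # any locally pulled embedding model works; nomic-embed-text is an example choice
    return post("/api/embeddings", {"model": "nomic-embed-text", "prompt": text})["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / ((sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5))

# your "knowledge base": any private notes or documents
docs = [
    "Meeting notes: we shipped v2 on May 3rd.",
    "Recipe: sourdough needs 75% hydration.",
]
index = [(d, embed(d)) for d in docs]

def ask(question, model="llama3"):
    # retrieve the closest document, then answer with it as context
    q = embed(question)
    best = max(index, key=lambda pair: cosine(q, pair[1]))[0]
    prompt = f"Answer using this context:\n{best}\n\nQuestion: {question}"
    return post("/api/generate", {"model": model, "prompt": prompt, "stream": False})["response"]

print(ask("When did v2 ship?"))
```

Open WebUI wraps the same idea in a UI with document upload, chunking, and a proper vector store; the whole loop only ever talks to localhost.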

Zach
@zd
if you use obsidian, there's a great plugin called "smart composer" that adds a cursor composer-style chat window to your obsidian vault. you can choose which model you use (looks like ollama is available, which runs locally)
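Under the hood, the vault-plus-local-model pattern Zach describes comes down to: read notes out of the vault folder, put the relevant ones in the prompt, and call the local model's chat endpoint. A rough sketch against Ollama's API, with a placeholder vault path and a naive keyword match standing in for the plugin's actual retrieval:

```python
import json
import pathlib
import urllib.request

VAULT = pathlib.Path("~/Documents/MyVault").expanduser()  # placeholder vault path

def chat(messages, model="llama3"):
    # Ollama's local chat endpoint; runs entirely on-device
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps({"model": model, "messages": messages, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

def ask_vault(question):
    # naive keyword overlap to pick the three most relevant markdown notes
    words = set(question.lower().split())
    notes = sorted(
        VAULT.rglob("*.md"),
        key=lambda p: -len(words & set(p.read_text(errors="ignore").lower().split())),
    )[:3]
    context = "\n\n".join(p.read_text(errors="ignore") for p in notes)
    return chat([{"role": "user", "content": f"Notes:\n{context}\n\nQuestion: {question}"}])

print(ask_vault("what did I write about project deadlines?"))
```

The plugin presumably does smarter chunking and retrieval than this, but the privacy property is the same either way: the vault and the model never leave your machine.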