https://opensea.io/collection/dev-21
0 replies
0 recasts
2 reactions
Arti Villa
@artivilla.eth
what's the best way to do this? > knowledge base for a local LLM running completely offline, privately on my own computer
2 replies
0 recasts
4 reactions
Zach
@zd
if you use obsidian, there's a great plugin called "smart composer" that adds a cursor composer style chat window to your obsidian vault. you can choose which model you use (looks like ollama is available, which runs models locally)
0 replies
0 recasts
2 reactions
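for anyone who wants to skip the plugin and wire this up directly: a minimal sketch of the idea in the thread, assuming Ollama is installed and serving on its default port (11434) with a model such as "llama3" already pulled — the model name, the `vault/` folder path, and the helper names here are just illustrative, not part of any real setup.

```python
# minimal sketch: ask a local Ollama model about your own notes, fully offline.
# assumes Ollama is running on localhost:11434 and a model (e.g. "llama3")
# has been pulled; nothing in this script talks to the internet.
import json
import urllib.request
from pathlib import Path


def build_prompt(notes: list[str], question: str) -> str:
    """Stuff the notes into the prompt as context — a tiny stand-in for a
    real knowledge base (no embeddings or retrieval, just concatenation)."""
    context = "\n\n".join(notes)
    return f"Answer using only these notes:\n\n{context}\n\nQuestion: {question}"


def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """POST to Ollama's local /api/generate endpoint and return the reply."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# usage (with Ollama running; "vault" is a hypothetical notes folder,
# e.g. an Obsidian vault of markdown files):
#   notes = [p.read_text() for p in Path("vault").glob("*.md")]
#   print(ask_ollama(build_prompt(notes, "what did i write about local LLMs?")))
```

dumping every note into the prompt only works for small vaults; past the model's context window you'd want to retrieve just the relevant notes first, which is roughly what plugins like smart composer handle for you.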