JB Rubinovitz ⌐◨-◨
@rubinovitz
Anyone else having issues keeping up with the RAM requirements for OSS AI on their personal machines? I'm giving up on keeping up on my Mac and using my home-built PC until something like NVIDIA DIGITS becomes the clear winner.
1 reply
0 recast
3 reactions
HH
@hamud
I only use small models, 14B and below. Small models are still very good, and if you pair them with a knowledge graph database, they can be just as good as the super large models.
1 reply
0 recast
3 reactions
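The pairing described above, retrieving facts from a knowledge graph and prepending them to a small model's prompt, can be sketched minimally. The graph, entities, and relations below are toy placeholders, not a real system; production setups would use a framework like LightRAG instead of a hand-built dict:

```python
# Toy sketch of "small model + knowledge graph": collect facts linked to
# entities mentioned in a question and prepend them to the prompt, so a
# 14B-class model can answer with grounding it wouldn't have on its own.
# The graph contents and entity matching here are illustrative assumptions.

KNOWLEDGE_GRAPH = {
    # entity -> list of (relation, object) edges
    "LightRAG": [("is_a", "retrieval framework"), ("builds", "knowledge graph")],
    "Ollama": [("is_a", "local LLM runner"), ("runs", "quantized models")],
}

def retrieve_facts(question: str) -> list[str]:
    """Collect facts for every graph entity the question mentions."""
    facts = []
    for entity, edges in KNOWLEDGE_GRAPH.items():
        if entity.lower() in question.lower():
            facts += [f"{entity} {rel} {obj}." for rel, obj in edges]
    return facts

def build_prompt(question: str) -> str:
    """Prepend retrieved facts so a small model can answer from them."""
    context = "\n".join(retrieve_facts(question))
    return f"Facts:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("What does Ollama run?")
# The assembled prompt now carries the graph's Ollama facts; in a real
# pipeline it would be sent to a small local model via an Ollama chat call.
```

Real knowledge-graph retrieval does entity linking and multi-hop traversal rather than substring matching, but the augment-then-ask shape is the same.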
JB Rubinovitz ⌐◨-◨
@rubinovitz
Ooh interesting. Do you have a system for creating personal knowledge graphs that you like?
1 reply
0 recast
0 reaction
JB Rubinovitz ⌐◨-◨
@rubinovitz
I specifically want to load in my Readwise Reader annotated articles and my Craft.do markdown notes.
1 reply
0 recast
0 reaction
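Getting those notes into any ingestion pipeline starts with reading the markdown exports off disk. A minimal sketch, assuming the notes sit as `.md` files under one export directory (the demo folder and file here are stand-ins):

```python
from pathlib import Path
import tempfile

def load_markdown_notes(notes_dir: str) -> list[str]:
    """Read every .md file (e.g. a Craft.do or Readwise Reader export)
    under notes_dir into a list of document strings for ingestion."""
    return [p.read_text(encoding="utf-8")
            for p in sorted(Path(notes_dir).rglob("*.md"))]

# Demo with a temporary folder standing in for a real export directory.
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "note1.md").write_text("# Highlight\nSaved from an article.")
    docs = load_markdown_notes(d)
# docs is now a list of note bodies ready to hand to an indexer.
```

Each string in `docs` would then be passed to whatever builds the knowledge graph.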
HH
@hamud
Depends on whether you have coding experience. I just use LightRAG + DeepSeek V3 to create the knowledge database, since using local LLMs for that is too slow, and then query with either DeepSeek or Ollama depending on whether I need privacy.
1 reply
0 recast
1 reaction
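The query routing described above, local Ollama when privacy matters and a hosted DeepSeek endpoint otherwise, reduces to a small dispatch function. The hosted URL and the caller-supplied `private` flag are illustrative assumptions; Ollama's default local chat endpoint is `http://localhost:11434/api/chat`:

```python
# Sketch of privacy-based query routing: sensitive queries go to a local
# Ollama server, everything else to a hosted DeepSeek endpoint.
# The hosted URL is an assumption for illustration.

LOCAL_OLLAMA = "http://localhost:11434/api/chat"              # private, slower
HOSTED_DEEPSEEK = "https://api.deepseek.com/chat/completions" # remote, faster

def pick_endpoint(query: str, private: bool) -> str:
    """Route to the local model when privacy matters, remote otherwise."""
    return LOCAL_OLLAMA if private else HOSTED_DEEPSEEK

endpoint = pick_endpoint("summarize my journal", private=True)
# endpoint is the local Ollama URL; a real client would follow with an
# HTTP POST carrying the model name and chat messages.
```

The `private` decision could come from a per-query flag, as here, or from classifying the query's content before dispatch.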