christin
@christin
@bleu.eth ’s recommendation is super legit. I can run an LLM (a mini version of Llama) on my MacBook Air offline with lmstudio.ai (there are also heavier-duty models that are a one-click in-app download if you have a better computer than me). Great for “simple” tasks like summarizing my voice memo rambles
4 replies
8 recasts
31 reactions
Cool Beans 🌞
@coolbeans1r.eth
I run local using Ollama. 👍 I like QwQ. Check it out! ✌️❤️
0 reply
0 recast
1 reaction
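For anyone following along: once Ollama is installed and a model is pulled (e.g. `ollama pull qwq`), the simplest way to chat is `ollama run qwq` in a terminal. Ollama also serves a local HTTP API on port 11434, so summarizing those "voice memo rambles" can be scripted. A minimal sketch, assuming a local Ollama install with the `qwq` model pulled — the endpoint and field names follow Ollama's `/api/generate` API, but check them against your installed version:

```python
import json
import urllib.request

# Build a request for Ollama's local generate endpoint.
# "stream": False asks for one complete JSON response instead of chunks.
payload = {
    "model": "qwq",
    "prompt": "Summarize this voice memo transcript in three bullet points: ...",
    "stream": False,
}
body = json.dumps(payload).encode()

def summarize_locally(body: bytes) -> str:
    """Send the payload to a running Ollama server and return its response text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Requires the Ollama server to be running locally:
# print(summarize_locally(body))
```

Everything stays on your machine, which is the whole appeal of the local-model setup being recommended in this thread.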