christin
@christin
@bleu.eth 's recommendation is super legit. I can run an LLM (a mini version of Llama) on my MacBook Air, offline, with lmstudio.ai (there are also heavier-duty models that are one-click in-app downloads if you have a better computer than me). Great for "simple" tasks like summarizing my voice memo rambles.
4 replies
7 recasts
31 reactions
Mikado
@mikadoe.eth
Hmmm 2 days and no cast from Christin.... Are you okay?
1 reply
0 recast
1 reaction
BBB // M⬆️
@jianbing.eth
agreed++ another tool folks can do this with: https://ollama.com/search. I have llama3.2 and mistral available offline and in the CLI; very useful!
0 reply
0 recast
2 reactions
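For anyone who wants to script against a local model rather than chat in the terminal, ollama also exposes a local HTTP API (by default at port 11434, endpoint `/api/generate`). A minimal sketch of the "summarize my rambles" use case mentioned above; the `summarize` helper and the prompt wording are illustrative, not from the thread:

```python
import json
import urllib.request

# ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # Payload shape for ollama's /api/generate endpoint;
    # stream=False asks for a single JSON object instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def summarize(text: str, model: str = "llama3.2") -> str:
    # Hypothetical helper: send a summarization prompt to a locally
    # running ollama server and return the generated text.
    payload = build_request(model, f"Summarize in one paragraph:\n\n{text}")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `summarize("...")` only works with `ollama serve` running and the model already pulled (e.g. `ollama pull llama3.2`).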
Cool Beans 🌞
@coolbeans1r.eth
I run locally using Ollama. 👍 I like QwQ. Check it out! ✌️❤️
0 reply
0 recast
1 reaction
TheModestThief🎩
@thief
what are your MBA's specs?
0 reply
0 recast
1 reaction