Justin Hunter
@polluterofminds
Until your hard drive has filled up with AI models, you're not trying hard enough.
1 reply
0 recasts
5 reactions
Wayne Worth
@alongcamewayne
bought an external ssd specifically to store models 😅
1 reply
0 recasts
1 reaction
Justin Hunter
@polluterofminds
Smart! I’m always scrambling to remember how to remove unused models from my HD using Ollama
1 reply
0 recasts
0 reactions
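For anyone else who forgets: the Ollama CLI covers this with ollama list (show what's installed) and ollama rm <model> (delete one). Below is a rough Python sketch that just wraps those two commands, assuming the ollama binary is on your PATH; the model name at the end is only a placeholder.

# Rough sketch: list installed Ollama models and delete ones you no longer want.
# Assumes the ollama CLI is installed and on PATH; the model name below is a placeholder.
import subprocess

def list_models() -> str:
    # "ollama list" prints a table of installed models and their sizes on disk.
    result = subprocess.run(["ollama", "list"], capture_output=True, text=True, check=True)
    return result.stdout

def remove_model(name: str) -> None:
    # "ollama rm <model>" deletes that model's weights from the drive.
    subprocess.run(["ollama", "rm", name], check=True)

if __name__ == "__main__":
    print(list_models())
    remove_model("llama2:13b")  # hypothetical model name; swap in one you actually have
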
Wayne Worth
@alongcamewayne
you should check out la studio if you haven’t already! great gui for the models
2 replies
0 recasts
1 reaction
Wayne Worth
@alongcamewayne
lm studio*
0 replies
0 recasts
1 reaction
Justin Hunter
@polluterofminds
Oh wow, this looks great. Love the chat interface. I usually use Ollama to run local LLMs via its API, but I often end up in the command-line chat interface and hate it
0 replies
0 recasts
1 reaction
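On that last point: Ollama's local server also exposes an HTTP API (by default at http://localhost:11434), so chats can be scripted rather than run through the CLI. A minimal sketch of a single chat turn, assuming the server is running and the model named below has already been pulled; both the model and the prompt are placeholders.

# Minimal sketch: one chat turn against a local Ollama server.
# Assumes Ollama is running on its default port; model and prompt are placeholders.
import json
import urllib.request

def chat(model: str, prompt: str) -> str:
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request a single JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The assistant's reply is under message.content in the response body.
    return body["message"]["content"]

if __name__ == "__main__":
    print(chat("llama3", "Write one line about a hard drive full of models."))  # example model/prompt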