https://warpcast.com/~/channel/aichannel
Kasra Rahjerdi
@jc4p
my new default for at-home local LLM: https://huggingface.co/google/gemma-3-12b-it-qat-q4_0-gguf -- Google-provided quantization/GGUF
3 replies
3 recasts
52 reactions
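For anyone who wants to try the setup from the post: a minimal sketch using llama.cpp, assuming a recent build where `llama-cli`/`llama-server` support the `-hf` flag for pulling a GGUF straight from a Hugging Face repo. The repo name comes from the post; the prompt and port are placeholders.

```shell
# Sketch: run Google's QAT Q4_0 GGUF locally with llama.cpp.
# Assumes llama.cpp is installed and -hf is available in your build;
# the first run downloads the model weights from Hugging Face.
llama-cli -hf google/gemma-3-12b-it-qat-q4_0-gguf \
  -p "Explain quantization-aware training in one paragraph."

# Or serve an OpenAI-compatible API locally instead:
llama-server -hf google/gemma-3-12b-it-qat-q4_0-gguf --port 8080
```

Ollama and LM Studio can load the same GGUF if you prefer a GUI-style workflow.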
christopher
@christopher
What kind of machine specs?
1 reply
0 recast
8 reactions
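On the specs question: a rough back-of-the-envelope for why this quant fits on common consumer hardware. A minimal sketch, assuming Q4_0 lands around 4.5 bits per weight (4-bit values plus per-block scales); KV cache and runtime overhead add a few more GB on top of the weights, so ~12-16 GB of RAM/VRAM is a comfortable target. The function name and the bits-per-weight figure are illustrative assumptions, not exact file sizes.

```python
# Rough memory estimate for a 12B-parameter model at Q4_0 quantization.
# Assumption: Q4_0 is ~4.5 effective bits/weight (4-bit values + block scales).

def q4_0_size_gb(params_billion: float, bits_per_weight: float = 4.5) -> float:
    """Approximate weight size in GB (decimal), weights only."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

print(f"~{q4_0_size_gb(12):.1f} GB of weights")  # roughly 6.8 GB before KV cache
```

The actual GGUF on the Hub will differ somewhat from this estimate (tokenizer, embeddings, metadata), but it shows why a 12B Q4_0 model runs on a machine that a 24 GB fp16 checkpoint would not.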
AzFlin
@azflin
why use local llm? just to save on credits?
1 reply
0 recast
1 reaction
Kapaskie
@kapaskie
Gemma’s been super smooth for local runs lately. Google really made it easy with the quantized versions too👍
0 reply
0 recast
0 reaction