Ben
@benersing
Who is locally running an AI model? - Which model(s)? - What's your tech stack? - Will you keep doing it?
3 replies
0 recast
10 reactions
christin
@christin
I use LM Studio per @bleu.eth's recommendation. My MacBook Air can only run llama mini, which is better than nothing when I'm off the grid
2 replies
1 recast
5 reactions
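For anyone curious what the LM Studio setup above looks like in practice: LM Studio serves loaded models through an OpenAI-compatible local HTTP API (default `http://localhost:1234/v1`). A minimal sketch, assuming a server is running locally; the model name `llama-3.2-1b-instruct` is an assumption, substitute whatever small model you have loaded:

```python
# Minimal sketch: querying a model served by LM Studio's local
# OpenAI-compatible API. Assumes LM Studio's server is running on its
# default port (1234); the model name below is a placeholder.
import json
import urllib.request

LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt, model="llama-3.2-1b-instruct"):
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(prompt):
    """POST the prompt to the local server and return the reply text."""
    body = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Usage (with a local server running):
#   print(ask("Summarize this note in one sentence."))
```

Because the endpoint mirrors the OpenAI API shape, the official `openai` client also works against it by pointing `base_url` at localhost.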
Ben
@benersing
Do you use cloud based as well?
1 reply
0 recast
2 reactions
christin
@christin
I overpay for all the trendy ones. Highest usages are ChatGPT for ideas / copy, Perplexity for search, and Gemini Pro for weird multimodal experiments (h/t @keccers.eth for the last one)
1 reply
0 recast
3 reactions
keccers
@keccers.eth
From work Slack yesterday
1 reply
0 recast
3 reactions
christin
@christin
yeah, builder friends are pretty excited about Flash because it has a good cost-to-performance ratio (even when accounting for DeepSeek)
0 reply
0 recast
2 reactions