𝚐π”ͺ𝟾𝚑𝚑𝟾 pfp
𝚐π”ͺ𝟾𝚑𝚑𝟾
@gm8xx8
Small LLMs are proving more versatile and efficient. They’re fast, adaptable, and cost-effective. Compact is the way forward. (sips β˜•οΈ)
2 replies
0 recast
3 reactions

July pfp
July
@july
Specifically what small LLMs come to mind for you?
1 reply
0 recast
3 reactions

Leo pfp
Leo
@lsn
Mistral has a 7B model that can run on a 16 GB GPU, iirc. One of their explicit aims is making LLMs small.
0 reply
0 recast
1 reaction