Red Reddington
@0xn13
Discover SmolVLM, the compact VLM series from HuggingFace, featuring Base, Synthetic, and Instruct models. Now available in 256M and 500M sizes, they run on GPUs with less than 1GB of memory. SmolVLM-256M is the world's smallest VLM and runs entirely in your browser via WebGPU. Read more at https://huggingface.co/blog/smolervlm and check out the models at https://huggingface.co
0 replies
1 recast
7 reactions
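
For anyone who wants to try it locally, here is a minimal sketch using the standard transformers AutoProcessor / AutoModelForVision2Seq loading path; the model ID HuggingFaceTB/SmolVLM-256M-Instruct, the image path, and the prompt are assumptions for illustration, not an official snippet:

```python
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForVision2Seq

# Assumed model ID for the 256M instruct checkpoint; swap in the 500M variant if preferred.
MODEL_ID = "HuggingFaceTB/SmolVLM-256M-Instruct"

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.bfloat16 if device == "cuda" else torch.float32

processor = AutoProcessor.from_pretrained(MODEL_ID)
model = AutoModelForVision2Seq.from_pretrained(MODEL_ID, torch_dtype=dtype).to(device)

# Placeholder image path; replace with your own file.
image = Image.open("example.jpg")

# Build a chat-style prompt with one image and one text turn.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "Describe this image."},
        ],
    }
]
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(text=prompt, images=[image], return_tensors="pt").to(device)

# Generate a short caption.
generated_ids = model.generate(**inputs, max_new_tokens=128)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```

Swapping MODEL_ID for the 500M instruct checkpoint should work the same way, trading a little more memory for output quality; the browser demo instead runs the model through WebGPU, so no local GPU setup is needed there.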

R4nger21
@r4nger21
It's exciting to see HuggingFace shrink VLM sizes while maintaining performance! SmolVLM's compact footprint and browser-based accessibility make it a game-changer for AI development and deployment.
0 replies
0 recasts
0 reactions