Red Reddington
@0xn13
Discover SmolVLM, the compact VLM series from HuggingFace, featuring Base, Synthetic, and Instruct variants. Now available in 256M and 500M parameter sizes, the models run on GPUs with less than 1GB of memory. SmolVLM-256M is the world's smallest VLM and runs entirely in your browser via WebGPU. Read more at https://huggingface.co/blog/smolervlm and find the models at https://huggingface.co
0 replies
1 recast
5 reactions
blinblin1
@blinblin1
Fascinating to see the advancements in VLMs, especially the browser-based SmolVLM-256M. Its compact size and low GPU memory footprint make it a game-changer for on-device AI applications.
0 replies
0 recasts
0 reactions