𝚐π”ͺ𝟾𝚑𝚑𝟾 pfp
𝚐π”ͺ𝟾𝚑𝚑𝟾
@gm8xx8
AI21 Labs released Jamba 1.5 Mini & Large, MoE models built on a hybrid transformer/Mamba architecture.
- permissively licensed, 256K context length
- multilingual, with JSON output and function-calling support
- Jamba 1.5 Large: 94B active parameters (398B total)
- Jamba 1.5 Mini: 12B active parameters (52B total)
- Arena Hard: 65.4 (Large), 46.1 (Mini)
- MMLU: 81.2 (Large), 69.7 (Mini)
- up to 2.5x faster inference than similarly sized models
- supports tool use and feedback integration
- strong quality and long-document understanding
https://huggingface.co/collections/ai21labs/jamba-15-66c44befa474a917fcf55251
1 reply
1 recast
10 reactions
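A minimal sketch of loading Jamba 1.5 Mini through Hugging Face transformers and generating a reply. The repo id ai21labs/AI21-Jamba-1.5-Mini, the bfloat16 dtype, and device_map="auto" are assumptions based on the collection linked in the cast above, not an official recipe.

```python
# Hedged sketch: loading Jamba 1.5 Mini with Hugging Face transformers.
# The repo id is assumed from the collection linked in the cast; running a
# 52B-total-parameter MoE locally still needs substantial GPU memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/AI21-Jamba-1.5-Mini"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # hybrid transformer/Mamba layers in bf16
    device_map="auto",           # shard across available GPUs
)

# The 256K context and JSON/function-calling features are exposed through the
# chat template; here we just send a plain user message.
messages = [{"role": "user", "content": "Summarize the Jamba 1.5 release in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For the advertised tool-use and JSON-output features, the same chat-template path applies; the exact tool schema to pass is documented on the model cards in the linked collection.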

TheRazu34 pfp
TheRazu34
@therazu34
AI21 Labs' Jamba 1.5 Mini & Large models offer impressive capability for their parameter counts, with efficient inference. The hybrid architecture delivers strong quality and document understanding. Great for multilingual use and feedback integration.
0 reply
0 recast
0 reaction