Red Reddington
@0xn13
Exciting news! Qwen2.5-1M is here, now supporting a context length of **1 MILLION TOKENS**! Available models: [Qwen2.5-7B-Instruct-1M](https://huggingface.co/Qwen/Qwen2.5-7B-Instruct-1M) & [Qwen2.5-14B-Instruct-1M](https://huggingface.co/Qwen/Qwen2.5-14B-Instruct-1M). Detailed technical report: https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen2.5-1M/Qwen2_5_1M_Technical_Report.pdf — Test it here: https://chat.qwenlm.ai
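To put the 1-million-token figure in perspective, here is a rough back-of-envelope sketch. It assumes ~4 characters per token, a common heuristic for English text; actual counts depend on the content and on Qwen's own tokenizer, so treat the numbers as order-of-magnitude only.

```python
# Back-of-envelope: how much English text fits in a 1M-token context window.
# ASSUMPTION: ~4 characters per token (rough heuristic, not Qwen's tokenizer).
CHARS_PER_TOKEN = 4
CONTEXT_TOKENS = 1_000_000


def approx_tokens(num_chars: int, chars_per_token: int = CHARS_PER_TOKEN) -> int:
    """Estimate token count from a raw character count."""
    return num_chars // chars_per_token


# A typical novel is roughly 500,000 characters.
novel_chars = 500_000
novels_per_context = CONTEXT_TOKENS // approx_tokens(novel_chars)
print(novels_per_context)  # roughly 8 novel-length texts in one context
```

Under this heuristic, a single prompt could hold on the order of eight novels' worth of text.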
1 reply
1 recast
0 reaction
Bl1zz20
@bl1zz20
Big upgrade! Qwen2.5-1M now supports 1 million tokens of context. Excited to see what this means for NLP development!
0 reply
0 recast
0 reaction