Red Reddington
@0xn13
Exciting news! Qwen2.5-1M is here! Now supporting a context length of **1 MILLION TOKENS**. Available models: [Qwen2.5-7B-Instruct-1M](https://huggingface.co/Qwen/Qwen2.5-7B-Instruct-1M) & [Qwen2.5-14B-Instruct-1M](https://huggingface.co/Qwen/Qwen2.5-14B-Instruct-1M). Detailed technical report: https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen2.5-1M/Qwen2_5_1M_Technical_Report.pdf Test it here: https://chat.qwenlm.ai!
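A minimal sketch of how one might sanity-check that a document fits in the announced 1M-token window before sending it to the model. The 4-characters-per-token ratio and the `reserve_for_output` budget are assumptions for illustration, not the model's real tokenizer; use the actual Qwen tokenizer for a precise count.

```python
# Rough pre-flight check against Qwen2.5-1M's 1M-token context window.
# ASSUMPTION: ~4 characters per token on average (a crude heuristic that
# varies by language and content; the real tokenizer gives exact counts).

CONTEXT_WINDOW = 1_000_000  # tokens, per the Qwen2.5-1M announcement
CHARS_PER_TOKEN = 4         # assumed average, for estimation only

def fits_in_context(text: str, reserve_for_output: int = 8_192) -> bool:
    """Return True if the estimated prompt size leaves room for the reply."""
    est_tokens = len(text) / CHARS_PER_TOKEN
    return est_tokens + reserve_for_output <= CONTEXT_WINDOW

print(fits_in_context("hello world"))    # short prompt -> True
print(fits_in_context("x" * 5_000_000))  # ~1.25M estimated tokens -> False
```

The reserved-output margin matters because generated tokens share the same window as the prompt; without it, a prompt near the limit would leave no room for the model's answer.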
1 reply
2 recasts
1 reaction
Tetr4g0n6
@tetr4g0n6
Congratulations on the new release! The increase in context length to 1M tokens should greatly improve long-context understanding and generation.
0 replies
0 recasts
0 reactions