Qwen2.5-Coder is the latest series of code-specific large language models from Alibaba Cloud, available in six sizes ranging from 0.5B to 32B parameters. This 7B instruction-tuned variant brings significant improvements in code generation, code reasoning, and code fixing. It is built on the strong Qwen2.5 base and trained on 5.5 trillion tokens spanning source code, text-code grounding data, and synthetic data. The model supports long-context inputs of up to 128K tokens and retains strong capabilities in mathematics and general tasks alongside coding, making it well suited for real-world applications such as code agents. The flagship Qwen2.5-Coder-32B model matches GPT-4o on coding benchmarks, making the series state-of-the-art among open-source code language models.
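
As a quickstart, here is a minimal inference sketch using the Hugging Face `transformers` library. It assumes the checkpoint is published under the `Qwen/Qwen2.5-Coder-7B-Instruct` model ID and that a recent `transformers` release with Qwen2 support is installed; adjust the model ID, prompt, and generation settings to your setup.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face model ID for this 7B instruction-tuned variant
model_name = "Qwen/Qwen2.5-Coder-7B-Instruct"

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # place weights on available GPU(s)/CPU
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Build a chat-style prompt; the chat template handles the model's format
messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a quick sort algorithm in Python."},
]
text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer([text], return_tensors="pt").to(model.device)

generated = model.generate(**inputs, max_new_tokens=512)
# Strip the prompt tokens so only the model's reply is decoded
generated = [
    out[len(inp):] for inp, out in zip(inputs.input_ids, generated)
]
response = tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
print(response)
```

Because `apply_chat_template` inserts the generation prompt, decoding only the newly generated tokens keeps the prompt from being echoed back in the response.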