Codestral-22B-v0.1 is a 22-billion-parameter code generation model trained on a dataset of more than 80 programming languages, including Python, Java, C, C++, JavaScript, and Bash. The model can be used in two modes. As an instruct model, it answers questions about code snippets, writes documentation, explains code, and generates code following specific instructions. As a Fill-in-the-Middle (FIM) model, it predicts the tokens between a given prefix and suffix, which makes it well suited to software development tools such as VS Code extensions. The model includes no moderation mechanisms and is released under the MNPL-0.1 license.
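To illustrate the FIM mode, here is a minimal sketch of how a prompt with a prefix and suffix might be assembled. The `[SUFFIX]`/`[PREFIX]` control-token layout shown is an assumption based on the mistral-common FIM request format; in practice the prompt should be built with the official Mistral tokenizer rather than raw strings.

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Compose a raw Fill-in-the-Middle prompt.

    The model is asked to predict the tokens that belong between
    `prefix` and `suffix`. NOTE: the [SUFFIX]/[PREFIX] token order
    is an assumption; verify against the official tokenizer.
    """
    return f"[SUFFIX]{suffix}[PREFIX]{prefix}"


# Example: ask the model to fill in the body of a function.
prefix = "def fibonacci(n):\n    "
suffix = "\n    return result"
prompt = build_fim_prompt(prefix, suffix)
```

In an editor integration, `prefix` would be the code before the cursor and `suffix` the code after it; the model's completion is then inserted between the two.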