Kimi K2 0905
Kimi K2 0905 is the September update to Kimi K2 0711. It is a large-scale Mixture-of-Experts (MoE) language model developed by Moonshot AI, with 1 trillion total parameters, of which 32 billion are active per forward pass. It supports long-context inference up to 256K tokens, extended from the previous 128K. The update improves agentic coding, with higher accuracy and better generalization across different frameworks, and strengthens frontend development, producing more polished and functional outputs for web, 3D, and related tasks. The model was trained on a new stack built around the MuonClip optimizer, which stabilizes large-scale MoE training.
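To make the MuonClip mention concrete: based on Moonshot AI's public description of the technique rather than anything on this page, MuonClip pairs the Muon optimizer with a per-head "QK-clip" step that rescales a head's query and key projections whenever its maximum attention logit exceeds a threshold, pulling the logit back under the cap. The sketch below illustrates only that clipping step; the threshold value tau and the in-place PyTorch update are assumptions for illustration, not details taken from this page.

```python
import torch

def qk_clip(W_q: torch.Tensor, W_k: torch.Tensor,
            max_logit: float, tau: float = 100.0) -> None:
    """Sketch of one QK-clip step for a single attention head.

    If the head's largest observed attention logit exceeds tau, rescale
    its query and key projection matrices in place. Since the logits are
    proportional to W_q @ W_k.T, applying sqrt(gamma) to each matrix
    scales the logits by exactly gamma, capping the maximum near tau.
    tau=100.0 is an assumed threshold, not confirmed by this page.
    """
    if max_logit > tau:
        gamma = tau / max_logit
        W_q.mul_(gamma ** 0.5)  # in-place rescale of query projection
        W_k.mul_(gamma ** 0.5)  # in-place rescale of key projection
```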
Key Specifications
Timeline
Technical Specifications
Pricing & Availability
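As a hedged sketch of how the model can be accessed: Moonshot AI serves its Kimi models through an OpenAI-compatible API, so the standard openai client can be pointed at their endpoint. The base URL and the model identifier kimi-k2-0905-preview below are assumptions; confirm both against Moonshot's platform documentation.

```python
from openai import OpenAI

# Minimal sketch of calling Kimi K2 0905 through an OpenAI-compatible
# endpoint. Base URL and model id are assumptions; verify against the
# Moonshot AI platform docs before use.
client = OpenAI(
    api_key="YOUR_MOONSHOT_API_KEY",
    base_url="https://api.moonshot.ai/v1",
)

response = client.chat.completions.create(
    model="kimi-k2-0905-preview",
    messages=[{"role": "user",
               "content": "Explain MoE routing in two sentences."}],
    temperature=0.6,
)
print(response.choices[0].message.content)
```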
Benchmark Results
Model performance metrics across various tests and benchmarks
General Knowledge
Programming
Mathematics
Reasoning
Other Tests
License & Metadata
Similar Models
Kimi K2 Instruct (Moonshot AI)
Kimi K2-Instruct-0905 (Moonshot AI)
Kimi K2-Thinking-0905 (Moonshot AI)
Kimi K2 Base (Moonshot AI)
Kimi K2.5 (Moonshot AI)
GLM-4.5 (Zhipu AI)
Llama 3.1 405B Instruct (Meta)
Qwen3 235B A22B (Alibaba)
Recommendations are based on similarity of characteristics: developer organization, multimodality, parameter size, and benchmark performance.