Kimi K2 Base
Kimi K2 Base is a large language model built on a mixture-of-experts (MoE) architecture, with 1 trillion total parameters of which 32 billion are activated per token. Trained on 15.5 trillion tokens with the MuonClip optimizer, it is the base model in the Kimi K2 family, released prior to instruction tuning. It performs strongly on knowledge, reasoning, and coding benchmarks and is designed with agentic capabilities in mind.
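The gap between activated (32B) and total (1T) parameters comes from MoE routing: for each token, a gating network selects only a few experts, so most of the model's weights sit idle on any given forward pass. The sketch below illustrates top-k expert routing in miniature; the dimensions, expert count, and k are illustrative placeholders, not Kimi K2's actual configuration.

```python
import numpy as np

def top_k_routing(x, gate_w, expert_ws, k=2):
    """Route one token vector x to its top-k experts by gate score.

    Only k expert weight matrices are applied per token, which is why
    an MoE model's activated parameter count is far smaller than its
    total parameter count. All shapes here are toy values.
    """
    logits = gate_w @ x                    # one gate score per expert
    top = np.argsort(logits)[-k:]          # indices of the k best experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()               # softmax over the selected experts
    # Weighted sum of the chosen experts' outputs; other experts stay idle.
    return sum(w * (expert_ws[i] @ x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 4
x = rng.standard_normal(d)
gate_w = rng.standard_normal((n_experts, d))
expert_ws = [rng.standard_normal((d, d)) for _ in range(n_experts)]
y = top_k_routing(x, gate_w, expert_ws, k=2)
print(y.shape)  # (8,)
```

With k=2 of 4 experts, half the expert parameters are touched per token; at Kimi K2's scale the same mechanism keeps the per-token compute near the 32B activated figure while total capacity is 1T.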
Key Specifications
Timeline
Technical Specifications
Benchmark Results
Model performance metrics across various tests and benchmarks
General Knowledge
Mathematics
Reasoning
Other Tests
License & Metadata
Similar Models
Kimi K2 0905 (Moonshot AI)
Kimi K2 Instruct (Moonshot AI)
Kimi K2-Instruct-0905 (Moonshot AI)
Kimi K2-Thinking-0905 (Moonshot AI)
Command R+ (Cohere)
Jamba 1.5 Large (AI21 Labs)
DeepSeek-V3 0324 (DeepSeek)
DeepSeek-V3.1 (DeepSeek)
Recommendations are based on similarity of characteristics: developer organization, multimodality, parameter size, and benchmark performance.