Mistral Large 3 (675B Instruct 2512)
Mistral Large 3 is a multimodal Mixture-of-Experts model with 41B active parameters (675B total), trained from scratch on 3,000 H200 GPUs. This FP8 release is instruction-tuned, making it well suited to chat, agentic, and other instruction-following scenarios. Designed for reliability and long-context understanding, it targets production-grade assistants, RAG systems, scientific workloads, and complex enterprise workflows.
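As a rough illustration of what these figures imply for deployment, the sketch below estimates the weight-storage footprint at FP8 and the share of parameters exercised per token. The 675B/41B/FP8 numbers come from the spec above; the 1-byte-per-parameter FP8 assumption (and the BF16 comparison) is ours and ignores KV cache and activation memory.

```python
# Back-of-the-envelope sizing for Mistral Large 3, using the figures
# quoted above: 675B total parameters, 41B active per token, FP8 weights.
# Bytes-per-parameter values are simplifying assumptions, not vendor numbers.

TOTAL_PARAMS = 675e9   # total parameters across all experts
ACTIVE_PARAMS = 41e9   # parameters routed per token (MoE)
FP8_BYTES = 1          # assumed bytes per parameter at FP8
BF16_BYTES = 2         # assumed bytes per parameter at BF16, for comparison

def weight_memory_gb(n_params: float, bytes_per_param: int) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

fp8_gb = weight_memory_gb(TOTAL_PARAMS, FP8_BYTES)    # 675 GB
bf16_gb = weight_memory_gb(TOTAL_PARAMS, BF16_BYTES)  # 1350 GB
active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS        # ~6.1% of weights per token

print(f"FP8 weights:  {fp8_gb:.0f} GB")
print(f"BF16 weights: {bf16_gb:.0f} GB")
print(f"Active per token: {active_fraction:.1%}")
```

The active fraction is what makes the MoE design attractive: per-token compute scales with the 41B active parameters, while memory must still hold all 675B.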
Key Specifications
Timeline
Technical Specifications
Pricing & Availability
Benchmark Results
Performance across reasoning and other benchmark suites
Reasoning
Other Tests
License & Metadata
Similar Models
Pixtral Large (Mistral AI)
Mistral Small 3.1 24B Instruct (Mistral AI)
Mistral Small 3.1 24B Base (Mistral AI)
Magistral Medium (Mistral AI)
Mistral Small 3.2 24B Instruct (Mistral AI)
Mistral Small 3 24B Base (Mistral AI)
Pixtral-12B (Mistral AI)
Mistral Large 2 (Mistral AI)
Recommendations are based on similarity of characteristics: developer organization, multimodality, parameter size, and benchmark performance.