
Qwen3 235B A22B

Alibaba

Qwen3 235B A22B is a large language model from Alibaba built on a Mixture-of-Experts (MoE) architecture, with 235 billion total parameters of which 22 billion are active per token. It achieves competitive results on coding, mathematics, and general-capability benchmarks compared with other leading models.
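The 235B-total / 22B-active split comes from MoE routing: each token is processed by only a small subset of experts, so only a fraction of the total parameters participates in any forward pass. A minimal, illustrative sketch of top-k expert routing (the 128-expert / 8-active figures are commonly reported for Qwen3's MoE layers, but this is not Alibaba's actual implementation):

```python
# Illustrative sketch of top-k Mixture-of-Experts routing (not Qwen3's code).
# A router scores all experts per token and keeps only the k highest-scoring
# ones, which is why active parameters (22B) are far fewer than total (235B).
import math
import random

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def route(router_logits, k=8):
    """Return (expert_index, renormalized_weight) pairs for the top-k experts."""
    probs = softmax(router_logits)
    topk = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in topk)
    return [(i, probs[i] / norm) for i in topk]

random.seed(0)
logits = [random.gauss(0.0, 1.0) for _ in range(128)]  # assumed 128 experts
selected = route(logits, k=8)                          # 8 experts active per token
```

Only the selected experts' feed-forward weights are evaluated for that token; the routing weights are then used to combine their outputs.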

Key Specifications

Parameters
235.0B
Context
128.0K
Release Date
April 28, 2025
Average Score
90.8%

Timeline

Key dates in the model's history
Announcement / Last Update
April 28, 2025

Technical Specifications

Parameters
235.0B
Training Tokens
36.0T tokens
Knowledge Cutoff
-
Family
-
Capabilities
Multimodal, ZeroEval

Pricing & Availability

Input (per 1M tokens)
$0.20
Output (per 1M tokens)
$0.60
Max Input Tokens
128.0K
Max Output Tokens
128.0K
Supported Features
Function Calling, Structured Output, Code Execution, Web Search, Batch Inference, Fine-tuning
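The listed rates ($0.20 per 1M input tokens, $0.60 per 1M output tokens) make per-request cost a simple linear calculation; a small sketch using those source figures:

```python
# Cost estimate from the listed rates: $0.20 / 1M input tokens,
# $0.60 / 1M output tokens.
INPUT_USD_PER_M = 0.20
OUTPUT_USD_PER_M = 0.60

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated cost in USD for a single request."""
    return (input_tokens / 1_000_000) * INPUT_USD_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_USD_PER_M

# e.g. a 50K-token prompt with a 2K-token completion:
# 0.05 * 0.20 + 0.002 * 0.60 = $0.0112
cost = request_cost(50_000, 2_000)
```

Note that output tokens cost 3x input tokens here, so long completions dominate the bill even for large prompts.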

Benchmark Results

Model performance metrics across various tests and benchmarks

General Knowledge

Tests on general knowledge and understanding
MMLU
Accuracy (self-reported)
88.0%

Mathematics

Mathematical problems and computations
GSM8k
Accuracy (self-reported)
94.0%

Other Tests

Specialized benchmarks
Arena Hard
Accuracy (self-reported)
96.0%
BBH
Accuracy (self-reported)
89.0%
MMMLU
Accuracy (self-reported)
87.0%

License & Metadata

License
apache-2.0
Announcement Date
April 28, 2025
Last Updated
April 28, 2025
