
K-EXAONE-236B-A23B

LG AI Research

K-EXAONE-236B-A23B is a large Mixture-of-Experts language model by LG AI Research with 236 billion total parameters and 23 billion active parameters. It delivers strong performance on reasoning, knowledge, and multilingual tasks, excelling particularly in Korean and English language understanding.

Key Specifications

Parameters
236.0B
Context
32.8K
Release Date
December 30, 2025
Average Score
80.7%

Timeline

Key dates in the model's history
Announcement
December 30, 2025
Last Update
January 22, 2026
Today
March 25, 2026

Technical Specifications

Parameters
236.0B
Training Tokens
-
Knowledge Cutoff
September 1, 2025
Family
-
Capabilities
Multimodal, ZeroEval

Pricing & Availability

Input (per 1M tokens)
$0.60
Output (per 1M tokens)
$1.00
Max Input Tokens
32.8K
Max Output Tokens
32.8K
Supported Features
Function Calling, Structured Output, Code Execution, Web Search, Batch Inference, Fine-tuning
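Given the listed rates, the cost of a request can be estimated from its input and output token counts. A minimal sketch, assuming the per-1M-token prices from the table above (the helper function is illustrative, not an official billing API):

```python
# Estimated request cost from the listed K-EXAONE-236B-A23B rates.
INPUT_PRICE_PER_M = 0.60   # USD per 1M input tokens (from the table above)
OUTPUT_PRICE_PER_M = 1.00  # USD per 1M output tokens (from the table above)

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a 10K-token prompt with a 2K-token completion
print(round(estimate_cost(10_000, 2_000), 4))  # 0.008
```

Note that both input and output are capped at 32.8K tokens per request, so a single call cannot exceed roughly $0.05 at these rates.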

Benchmark Results

Model performance metrics across various tests and benchmarks

Other Tests

Specialized benchmarks
AIME 2025
Self-reported
93.0%
MMMLU
Self-reported
86.0%
MMLU-Pro
Self-reported
84.0%
LiveCodeBench v6
Self-reported
81.0%
τ²-Bench
Self-reported
73.0%
IFBench
Self-reported
67.0%
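The 80.7% average score listed under Key Specifications appears to be the arithmetic mean of the six self-reported benchmark results above. A quick check (scores taken from the table; the computation itself is just an illustrative verification):

```python
# Verify that the listed average score (80.7%) matches the mean
# of the six self-reported benchmark results shown above.
scores = {
    "AIME 2025": 93.0,
    "MMMLU": 86.0,
    "MMLU-Pro": 84.0,
    "LiveCodeBench v6": 81.0,
    "τ²-Bench": 73.0,
    "IFBench": 67.0,
}
average = sum(scores.values()) / len(scores)
print(round(average, 1))  # 80.7
```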

License & Metadata

License
Proprietary
Announcement Date
December 30, 2025
Last Updated
January 22, 2026
