MiniMax M2.7 Will Go Open Weights Within Two Weeks
MiniMax's head of engineering confirms M2.7, the self-evolving AI model, will be released as open weights. Here's what that means for developers.
Five days after launching M2.7 as a proprietary model, MiniMax has reversed course. Skyler Miao, the company's head of engineering, confirmed on social media that M2.7 will be released as open weights "in about two weeks."
From Closed to Open
When M2.7 launched on March 18, it was a departure from MiniMax's open-weights track record. The company had previously released M2.5 with open weights, earning goodwill in the developer community. Going proprietary raised eyebrows — VentureBeat even framed it as evidence that Chinese AI startups were drifting toward the closed-model playbook pioneered by OpenAI and Anthropic.
That narrative lasted less than a week. The open-weights confirmation drew over 600 upvotes on r/LocalLLaMA, with developers already discussing fine-tuning strategies and local deployment plans.
M2.7 is not just another language model. It's the first major model to participate meaningfully in its own training process, handling 30–50% of its reinforcement learning workflow autonomously. It scored 56.22% on SWE-Pro (matching GPT-5.3-Codex), achieved a 66.6% medal rate on MLE Bench Lite, and costs just $0.30 per million input tokens — making it one of the cheapest frontier models available.
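The pricing claim is easy to put in concrete terms. A minimal sketch of the input-token cost math, using only the $0.30-per-million rate reported above (the corpus sizes in the example are illustrative, and no output-token rate is assumed):

```python
# Estimate input-token cost at M2.7's reported rate of $0.30 per million tokens.
# The rate comes from the article; the token counts below are illustrative only.

INPUT_RATE_PER_MILLION = 0.30  # USD per 1M input tokens (reported)

def input_cost(tokens: int, rate_per_million: float = INPUT_RATE_PER_MILLION) -> float:
    """Return the USD cost of processing `tokens` input tokens at the given rate."""
    return tokens / 1_000_000 * rate_per_million

# Example: feeding a 10M-token corpus costs $3.00 in input tokens.
print(f"${input_cost(10_000_000):.2f}")  # → $3.00
```

At that rate, even repository-scale contexts stay in the single-digit-dollar range, which is a large part of why the model reads as unusually cheap for its benchmark tier.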
The model also delivered a striking reduction in hallucination: its score on the AA-Omniscience Index rose to +1, up from -40 for its predecessor. In practical terms, M2.7's hallucination rate of 34% is lower than that of both Claude Sonnet 4.6 (46%) and Gemini 3.1 Pro (50%).
Why This Matters
Releasing open weights for a self-evolving model is genuinely new territory. Developers will be able to inspect, fine-tune, and build on a model that was partly trained by itself — an approach that could accelerate research in ways proprietary access simply can't. With OpenClaw hitting 250K stars and the broader agentic AI ecosystem booming, M2.7's weights could become a foundation for a new wave of custom agent development.
The timing also puts pressure on competitors. Alibaba just recommitted to open-sourcing Qwen, and the Chinese open-weights race is very much alive.