Introducing Qwen3.6-35B-A3B: A Sparse MoE Model with Exceptional Coding Power
AI Summary
I'm thrilled to announce the open-sourcing of Qwen3.6-35B-A3B, a sparse mixture-of-experts (MoE) model that boasts 35 billion total parameters, with only 3 billion active at any time. Despite its efficiency, it outperforms its predecessor, Qwen3.5-35B-A3B, and competes with larger dense models like Qwen3.5-27B and Gemma4-31B. This model supports both multimodal thinking and non-thinking modes, making it one of the most versatile open-source models available today. You can access it via Qwen Studio, our API, or download the open weights from platforms like Hugging Face and ModelScope.
## Performance
Qwen3.6-35B-A3B excels across coding benchmarks, surpassing the dense 27B-parameter Qwen3.5-27B and dramatically outperforming its predecessor on agentic coding and reasoning tasks. For example, it scores 73.4 on SWE-bench Verified and 51.5 on Terminal-Bench 2.0, and it performs strongly across coding, general agent, and knowledge-based evaluations.
In vision-language tasks, Qwen3.6-35B-A3B showcases exceptional multimodal reasoning capabilities, often matching or surpassing models like Claude Sonnet 4.5. It achieves high scores in spatial intelligence tasks, such as 92.0 on RefCOCO and 50.8 on ODInW13, demonstrating its strengths in perception and reasoning.
## Build with Qwen3.6-35B-A3B
Qwen3.6-35B-A3B will soon be available on Alibaba Cloud Model Studio and can already be accessed through various platforms. It integrates seamlessly with popular coding assistants such as OpenClaw, Claude Code, and Qwen Code, enhancing development workflows with efficient, context-aware coding assistance.
### API Usage
The model supports a `preserve_thinking` option, which is recommended for agentic tasks. The API is compatible with industry-standard protocols, including OpenAI's and Anthropic's, allowing easy integration into existing systems.
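As a rough illustration, a request might look like the following. This is a minimal sketch assuming an OpenAI-compatible chat-completions payload: the `preserve_thinking` field name comes from the text above, but its exact placement, the model identifier string, and the endpoint details are assumptions, not confirmed API specifics.

```python
import json

# Hypothetical payload for an OpenAI-compatible chat endpoint.
# The model ID string and the top-level placement of
# "preserve_thinking" are illustrative assumptions.
payload = {
    "model": "qwen3.6-35b-a3b",
    "messages": [
        {"role": "user", "content": "Refactor this function to remove duplication."}
    ],
    # Recommended for agentic tasks per the announcement
    "preserve_thinking": True,
}

body = json.dumps(payload)  # serialize for the HTTP request
print(body)
```

The serialized body can then be sent with any OpenAI-compatible client or a plain HTTP library; consult the official Model Studio documentation for the actual endpoint and parameter names.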
### Coding & Agents
Qwen3.6-35B-A3B's agentic coding capabilities make it compatible with OpenClaw, a self-hosted AI coding agent, and Qwen Code, an open-source AI agent optimized for the Qwen Series. These integrations provide a robust coding experience, leveraging the model's strengths in reasoning and multimodal tasks.
## Summary
Qwen3.6-35B-A3B sets a new standard for sparse MoE models, achieving remarkable performance with only 3 billion active parameters. It rivals much larger dense models and excels across multimodal benchmarks. As a fully open-source model, it invites the community to explore its capabilities and contribute to its development. We look forward to seeing the innovative applications built with Qwen3.6-35B-A3B and will continue to expand the Qwen3.6 open-source family.
Key Concepts
- **Mixture of experts (MoE):** A neural network architecture that uses a collection of expert sub-networks, where only a subset is activated for any given input. This allows for efficient computation and scalability.
- **Agentic coding:** A coding approach in which the model makes autonomous decisions and reasons through multi-step tasks with minimal human intervention.
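The sparse mixture-of-experts idea described above can be sketched as top-k gated routing. This is a toy illustration in plain Python, not the model's actual architecture: the expert functions, gate weights, and dimensions are all made up to show why only a few experts run per input.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(x, experts, gate_weights, top_k=2):
    """Route input x to the top_k highest-scoring experts.

    experts: list of callables, one per expert
    gate_weights: one weight vector per expert; the gate score is a dot product
    Returns (output vector, indices of the experts actually run).
    """
    # Gate: score every expert for this input (cheap)
    scores = [sum(w_i * x_i for w_i, x_i in zip(w, x)) for w in gate_weights]
    probs = softmax(scores)
    # Sparsity: only the top_k experts are ever executed
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:top_k]
    norm = sum(probs[i] for i in top)
    # Output is the gate-weighted sum of the selected experts' outputs
    out = [0.0] * len(x)
    for i in top:
        y = experts[i](x)
        out = [o + (probs[i] / norm) * y_i for o, y_i in zip(out, y)]
    return out, top

# Toy usage: three scaling "experts", a 2-dim input, top-2 routing
experts = [lambda x, s=s: [s * v for v in x] for s in (1.0, 2.0, 3.0)]
gate_weights = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out, chosen = moe_forward([1.0, 2.0], experts, gate_weights, top_k=2)
```

The point of the sketch is the compute profile: the gate scores all experts with a cheap dot product, but only `top_k` expert forward passes actually run, which is how a 35B-parameter model can activate only about 3B parameters per token.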
Category: AI
Original source: https://qwen.ai/blog?id=qwen3.6-35b-a3b
Summarized by Mente