TokenCalculator.com

Alibaba Qwen3 235B MoE (128k)

Alibaba 128K context Released: 2025-04

Alibaba's flagship Qwen3 Mixture-of-Experts model with 235B total parameters (22B active). Features hybrid reasoning and supports 119 languages. (Note: Not publicly available at release).


Pricing Information

Input Pricing

Standard: $0.0000
Per 1,000 tokens

Output Pricing

Standard: $0.0000
Per 1,000 tokens

Example Costs

Short Conversation
1K input + 500 output tokens
$0.0000
Book Analysis
50K input + 2K output tokens
$0.0000
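The example costs above follow directly from the per-1,000-token rates. A minimal sketch of that arithmetic, using placeholder rates (the page currently lists $0.0000 for both input and output):

```python
def estimate_cost(input_tokens, output_tokens,
                  input_price_per_1k=0.0, output_price_per_1k=0.0):
    """Return the total request cost in USD, given per-1,000-token rates.

    The default rates of 0.0 mirror the listed pricing above; pass real
    rates to model a paid endpoint.
    """
    return (input_tokens / 1000) * input_price_per_1k \
         + (output_tokens / 1000) * output_price_per_1k

# "Short Conversation" example from above: 1K input + 500 output tokens
print(f"${estimate_cost(1_000, 500):.4f}")  # $0.0000 at the listed rates
```

Swapping in non-zero rates (e.g. `input_price_per_1k=0.002`) shows how input and output tokens are billed separately, which is why the calculator asks for both counts.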


Key Features

  • 235 billion total parameters (22B active)
  • Mixture-of-Experts (MoE) architecture
  • Hybrid reasoning (thinking/non-thinking modes)
  • 128K context window
  • Supports 119 languages and dialects
  • Trained on 36 trillion tokens
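The first two features are related: in a Mixture-of-Experts model, only a subset of parameters is used for each token. A quick back-of-the-envelope check on the figures listed above (235B total, 22B active):

```python
# MoE parameter figures from the feature list above.
TOTAL_PARAMS_B = 235   # billions of total parameters
ACTIVE_PARAMS_B = 22   # billions of parameters active per token

# Fraction of the model that participates in each forward pass.
active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
print(f"Active per token: {active_fraction:.1%}")  # roughly 9.4%
```

So each token activates under a tenth of the total parameters, which is why MoE models can offer large-model quality at a much lower per-token compute cost than a dense model of the same total size.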

Common Use Cases

  • Complex problem solving
  • Advanced coding and math
  • Multilingual applications
  • Tool use and agent capabilities

Frequently Asked Questions

  • What is a token in the context of Large Language Models (LLMs)?
  • Why is understanding token count important for using LLMs?
  • What factors affect LLM pricing?
  • What is a context window in LLMs?
  • How can I optimize my prompts to use fewer tokens?