A large catalog of online LLMs, compatible with the OpenAI API

Qwen: Qwen3 235B A22B 2507 (free)

qwen/qwen3-235b-a22b-07-25:free
Context length: 262,144 · text->text · Qwen3 · Updated 2025-07-22
Qwen3-235B-A22B-Instruct-2507 is a multilingual, instruction-tuned mixture-of-experts language model based on the Qwen3-235B architecture, with 22B active parameters per forward pass. It is optimized for general-purpose text generation, including instruction following, logical reasoning, math, code, and tool usage. The model supports a native 262K context length and does not implement "thinking mode" (<think> blocks). Compared to its base variant, this version delivers significant gains in knowledge coverage, long-context reasoning, coding benchmarks, and alignment with open-ended tasks. It is particularly strong on multilingual understanding, math reasoning (e.g., AIME, HMMT), and alignment evaluations like Arena-Hard and WritingBench.
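
Because the service exposes an OpenAI-compatible API, the model can be called with any OpenAI-style client by passing the slug above as the model name. Below is a minimal sketch in Python, assuming the standard `openai` client; the base URL and the API-key environment variable are placeholders, not values confirmed by this page.

```python
# Minimal sketch: calling this model through an OpenAI-compatible endpoint.
# The base_url and the OPENAI_COMPAT_API_KEY environment variable are
# assumptions; replace them with the values your provider documents.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://example.com/v1",          # placeholder OpenAI-compatible endpoint
    api_key=os.environ["OPENAI_COMPAT_API_KEY"],  # hypothetical env var name
)

response = client.chat.completions.create(
    model="qwen/qwen3-235b-a22b-07-25:free",  # slug from this page
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the strengths of Qwen3-235B-A22B-Instruct-2507."},
    ],
)

print(response.choices[0].message.content)
```

The same request works with any client that speaks the OpenAI chat-completions protocol; only the base URL, API key, and model slug need to change.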

Model parameters

Architecture

Modality: text->text
Tokenizer: Qwen3

Limits

Context length: 262,144