Massive selection of online large models, compatible with the OpenAI API

Baidu: ERNIE 4.5 21B A3B

Input: $0.0003 / 1K tokens
Output: $0.0011 / 1K tokens
baidu/ernie-4.5-21b-a3b
Context length: 120,000 · text->text · Tokenizer: Other · Updated 2025-08-13
A text-based Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It inherits the ERNIE 4.5 family's heterogeneous MoE architecture with modality-isolated routing, originally designed for joint multimodal understanding and generation. The model supports a 131K-token context length and achieves efficient inference through multi-expert parallel collaboration and quantization. Post-training combines SFT, DPO, and UPO, with specialized routing and balancing losses for stable expert utilization across diverse tasks.
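
Since the service advertises OpenAI API compatibility, a standard OpenAI client should work against it. A minimal sketch, assuming a hypothetical base URL (substitute the provider's actual endpoint and your own API key):

```python
from openai import OpenAI

# Hypothetical base URL; replace with the provider's real OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://api.example.com/v1",
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="baidu/ernie-4.5-21b-a3b",  # model slug from this listing
    messages=[
        {"role": "user", "content": "Explain mixture-of-experts routing in two sentences."}
    ],
    max_tokens=1024,  # must stay within the listed 8,000-token max response length
)
print(response.choices[0].message.content)
```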

Model Parameters

Architecture

Modality: text->text
Tokenizer: Other

Limits

Context length: 120,000 tokens
Max response length: 8,000 tokens
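
For budgeting, the listed per-1K-token prices make per-request cost a simple weighted sum. A minimal sketch, assuming the two listed figures are the input and output prices respectively:

```python
# Assumed mapping of the listed prices (USD per 1K tokens).
INPUT_PRICE_PER_1K = 0.0003
OUTPUT_PRICE_PER_1K = 0.0011

def estimate_cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request from its token counts."""
    return (input_tokens / 1000) * INPUT_PRICE_PER_1K \
         + (output_tokens / 1000) * OUTPUT_PRICE_PER_1K

# Example: a 1,000-token prompt with a 1,000-token reply costs
# 0.0003 + 0.0011 = $0.0014.
print(f"${estimate_cost_usd(1000, 1000):.4f}")
```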