A large catalog of online large models, compatible with the OpenAI API

Mistral: Mistral Small 3

Input: $0.0003/1k tokens
Output: $0.0006/1k tokens
mistralai/mistral-small-24b-instruct-2501
Context length: 32,768 · text->text · Mistral · Updated 2025-01-31
Mistral Small 3 is a 24B-parameter language model optimized for low-latency performance across common AI tasks. Released under the Apache 2.0 license, it ships in both pre-trained and instruction-tuned versions designed for efficient local deployment. The model achieves 81% accuracy on the MMLU benchmark and performs competitively with larger models such as Llama 3.3 70B and Qwen 32B, while running at three times their speed on equivalent hardware. See Mistral's blog post for more details.
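Since the service advertises OpenAI API compatibility, a request to this model can be sketched as a standard `/chat/completions` call carrying the model slug above. This is a minimal illustration using only the Python standard library; the base URL and API key are placeholders you would substitute with your provider's actual values.

```python
# Minimal sketch: calling an OpenAI-compatible endpoint with this model's slug.
# The base_url and api_key arguments are placeholders, not real credentials.
import json
from urllib import request

MODEL = "mistralai/mistral-small-24b-instruct-2501"

def build_chat_request(prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat-completions payload for Mistral Small 3."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def send_chat(base_url: str, api_key: str, payload: dict) -> dict:
    """POST the payload to {base_url}/chat/completions and return parsed JSON."""
    req = request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Any OpenAI-compatible client library would work the same way; only the model string and the endpoint base URL differ from a call to OpenAI itself.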

Model parameters

Architecture

Modality: text->text
Tokenizer: Mistral

Limits

Context length: 32,768
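Given the two per-1k-token prices listed above, the cost of a request is simple arithmetic. A hedged sketch follows, under the assumption (common on such listings, but not stated here) that the first price applies to input tokens and the second to output tokens.

```python
# Hedged sketch: estimating request cost from the listed per-1k-token prices.
# Assumption: $0.0003/1k applies to input (prompt) tokens and
# $0.0006/1k to output (completion) tokens.
INPUT_PRICE_PER_1K = 0.0003
OUTPUT_PRICE_PER_1K = 0.0006

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request."""
    return (input_tokens / 1000) * INPUT_PRICE_PER_1K \
         + (output_tokens / 1000) * OUTPUT_PRICE_PER_1K

# Example: filling the 32,768-token context as 30,000 tokens in and
# 2,768 tokens out costs roughly 30 * 0.0003 + 2.768 * 0.0006 ≈ $0.0107.
```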