A large catalog of hosted large language models, compatible with the OpenAI API

All models

349 models · Updated 2025-12-17
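Every model below is served through an OpenAI-compatible endpoint, so a standard chat-completion call works unchanged. The sketch below illustrates this with the openai Python SDK; the base URL and API-key environment variable are placeholders, not values taken from this page, and any model ID from the list can be substituted.

```python
# Minimal sketch of calling a listed model through an OpenAI-compatible endpoint.
# The base_url and environment-variable name are assumed placeholders; replace
# them with the gateway's real values.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://example-gateway/v1",   # placeholder endpoint, not from this page
    api_key=os.environ["GATEWAY_API_KEY"],   # placeholder variable name
)

resp = client.chat.completions.create(
    model="mistralai/mistral-small-3.2-24b-instruct",  # any model ID from the list
    messages=[{"role": "user", "content": "Summarize Mixtral 8x22B in one sentence."}],
)
print(resp.choices[0].message.content)
```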
SorcererLM 8x22B
Input: $0.018/1k tokens
Output: $0.018/1k tokens
raifle/sorcererlm-8x22b
SorcererLM is an advanced RP and storytelling model, built as a low-rank 16-bit LoRA fine-tune of WizardLM-2 8x22B. Highlights:
- Advanced reasoning and emotional intelligence for engaging, immersive interactions
- Vivid writing enriched with spatial and contextual awareness
- Enhanced narrative depth for creative and dynamic storytelling
2024-11-09 · 16,000-token context · text->text · Mistral
Mistral: Voxtral Small 24B 2507
Input: $0.0004/1k tokens
Output: $0.0012/1k tokens
mistralai/voxtral-small-24b-2507
Voxtral Small is an enhancement of Mistral Small 3 that adds state-of-the-art audio input capabilities while retaining best-in-class text performance. It excels at speech transcription, translation, and audio understanding. Input audio is priced at $100 per million seconds (equivalent to $0.0001 per second).
2025-10-30 · 32,000-token context · text+audio->text · Mistral
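To show how the per-1k-token rates and the per-second audio rate above combine, here is a back-of-the-envelope cost estimate in Python. The helper function and the request sizes are purely illustrative, not part of the gateway's API.

```python
# Rough cost estimate for a Voxtral Small request, using the rates quoted above:
# $0.0004 per 1k input tokens, $0.0012 per 1k output tokens, and
# $100 per million seconds of input audio (i.e. $0.0001 per second).
def voxtral_cost(input_tokens: int, output_tokens: int, audio_seconds: float) -> float:
    text_in = input_tokens / 1_000 * 0.0004
    text_out = output_tokens / 1_000 * 0.0012
    audio_in = audio_seconds * (100 / 1_000_000)
    return text_in + text_out + audio_in

# Example: a 3-minute clip with ~500 prompt tokens and ~300 generated tokens.
print(f"${voxtral_cost(500, 300, 180):.4f}")  # ≈ $0.0186
```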
Mistral: Saba
Input: $0.0008/1k tokens
Output: $0.0024/1k tokens
mistralai/mistral-saba
Mistral Saba is a 24B-parameter language model designed specifically for the Middle East and South Asia, delivering accurate and contextually relevant responses while maintaining efficient performance. Trained on curated regional datasets, it supports multiple Indian-origin languages, including Tamil and Malayalam, alongside Arabic, making it a versatile option for a range of regional and multilingual applications. Read more in Mistral's announcement blog post.
2025-02-17 · 32,768-token context · text->text · Mistral
Mistral: Pixtral Large 2411
Input: $0.0080/1k tokens
Output: $0.024/1k tokens
mistralai/pixtral-large-2411
Pixtral Large is a 124B-parameter, open-weight, multimodal model built on top of Mistral Large 2. It can understand documents, charts, and natural images. The model is available under the Mistral Research License (MRL) for research and educational use, and under the Mistral Commercial License for experimentation, testing, and production in commercial settings.
2024-11-19 · 131,072-token context · text+image->text · Mistral
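For text+image->text models such as Pixtral Large, an image can be passed as an image_url content part, assuming the gateway forwards the standard OpenAI vision message format. The endpoint, key variable, and image URL below are placeholders.

```python
# Sketch of image input to a text+image->text model via the chat completions API.
# base_url, the environment-variable name, and the image URL are placeholders.
import os
from openai import OpenAI

client = OpenAI(base_url="https://example-gateway/v1", api_key=os.environ["GATEWAY_API_KEY"])

resp = client.chat.completions.create(
    model="mistralai/pixtral-large-2411",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "What does this chart show?"},
            {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
        ],
    }],
)
print(resp.choices[0].message.content)
```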
Mistral: Pixtral 12B
Input: $0.0004/1k tokens
Output: $0.0004/1k tokens
mistralai/pixtral-12b
Mistral AI's first multimodal text+image-to-text model. Its weights were released via torrent: https://x.com/mistralai/status/1833758285167722836.
2024-09-10 · 32,768-token context · text+image->text · Mistral
Mistral: Mixtral 8x7B Instruct
Input: $0.0022/1k tokens
Output: $0.0022/1k tokens
mistralai/mixtral-8x7b-instruct
Mixtral 8x7B Instruct is a pretrained generative sparse Mixture-of-Experts model from Mistral AI, fine-tuned for chat and instruction use. It combines 8 experts (feed-forward networks) for a total of 47 billion parameters. #moe
2023-12-10 · 32,768-token context · text->text · Mistral
Mistral: Mixtral 8x22B Instruct
mistralai/mixtral-8x22b-instruct
Mistral's official instruct fine-tuned version of Mixtral 8x22B. It uses 39B active parameters out of 141B total, offering unparalleled cost efficiency for its size. Its strengths include:
- strong math, coding, and reasoning
- a large context length (64k)
- fluency in English, French, Italian, German, and Spanish
See benchmarks in the launch announcement. #moe
2024-04-17 · 65,536-token context · text->text · Mistral
Mistral: Mistral Small Creative
Input: $0.0004/1k tokens
Output: $0.0012/1k tokens
mistralai/mistral-small-creative
Mistral Small Creative is an experimental small model designed for creative writing, narrative generation, roleplay and character-driven dialogue, general-purpose instruction following, and conversational agents.
2025-12-17 · 32,768-token context · text->text · Mistral
Mistral: Mistral Small 3.2 24B
Input: $0.0002/1k tokens
Output: $0.0007/1k tokens
mistralai/mistral-small-3.2-24b-instruct
Mistral-Small-3.2-24B-Instruct-2506 is an updated 24B parameter model from Mistral optimized for instruction following, repetition reduction, and improved function calling. Compared to the 3.1 release, version 3.2 significantly improves accuracy on WildBench and Arena Hard, reduces infinite generations, and delivers gains in tool use and structured output tasks. It supports image and text inputs with structured outputs, function/tool calling, and strong performance across coding (HumanEval+, MBPP), STEM (MMLU, MATH, GPQA), and vision benchmarks (ChartQA, DocVQA).
2025-06-21 · 131,072-token context · text+image->text · Mistral
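Since this entry highlights function/tool calling and structured outputs, here is a hedged sketch of that flow, assuming the gateway passes the standard OpenAI tools parameter through. The get_weather tool, endpoint, and key variable are hypothetical.

```python
# Sketch of a tool-calling request; the model replies with a structured tool call
# instead of free text. The tool definition and endpoint details are placeholders.
import json
import os
from openai import OpenAI

client = OpenAI(base_url="https://example-gateway/v1", api_key=os.environ["GATEWAY_API_KEY"])

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool, for illustration only
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

resp = client.chat.completions.create(
    model="mistralai/mistral-small-3.2-24b-instruct",
    messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
    tools=tools,
)

message = resp.choices[0].message
if message.tool_calls:  # the model may answer in plain text instead
    call = message.tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
```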
Mistral: Mistral Small 3.1 24B (free)
Input: $0/1k tokens
Output: $0/1k tokens
mistralai/mistral-small-3.1-24b-instruct:free
Mistral Small 3.1 24B Instruct is an upgraded variant of Mistral Small 3 (2501), featuring 24 billion parameters with advanced multimodal capabilities. It provides state-of-the-art performance in text-based reasoning and vision tasks, including image analysis, programming, mathematical reasoning, and multilingual support across dozens of languages. Equipped with an extensive 128k-token context window and optimized for efficient local inference, it supports use cases such as conversational agents, function calling, long-document comprehension, and privacy-sensitive deployments. The updated version is Mistral Small 3.2.
2025-03-18 · 128,000-token context · text+image->text · Mistral
Mistral: Mistral Small 3.1 24B
Input: $0.0001/1k tokens
Output: $0.0004/1k tokens
mistralai/mistral-small-3.1-24b-instruct
Mistral Small 3.1 24B Instruct is an upgraded variant of Mistral Small 3 (2501), featuring 24 billion parameters with advanced multimodal capabilities. It provides state-of-the-art performance in text-based reasoning and vision tasks, including image analysis, programming, mathematical reasoning, and multilingual support across dozens of languages. Equipped with an extensive 128k-token context window and optimized for efficient local inference, it supports use cases such as conversational agents, function calling, long-document comprehension, and privacy-sensitive deployments. The updated version is Mistral Small 3.2.
2025-03-18 · 131,072-token context · text+image->text · Mistral
Mistral: Mistral Small 3
Input: $0.0001/1k tokens
Output: $0.0004/1k tokens
mistralai/mistral-small-24b-instruct-2501
Mistral Small 3 is a 24B-parameter language model optimized for low-latency performance across common AI tasks. Released under the Apache 2.0 license, it features both pre-trained and instruction-tuned versions designed for efficient local deployment. The model achieves 81% accuracy on the MMLU benchmark and performs competitively with larger models like Llama 3.3 70B and Qwen 32B while operating at three times the speed on equivalent hardware. Read more in Mistral's announcement blog post.
2025-01-31 · 32,768-token context · text->text · Mistral