A huge catalog of online large models, compatible with the OpenAI API

ModelsHub Model Repository API Documentation


Quick Start

Just replace the OpenAI API base URL in your existing code with ours:

- https://api.openai.com/v1
+ https://www.models-hub.net/v1
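The same swap also works through configuration rather than code: the official Python SDK reads `OPENAI_BASE_URL` and `OPENAI_API_KEY` from the environment, so setting them before the client is constructed is enough. A minimal sketch (the placeholder key is illustrative):

```python
import os

# Set these before constructing the OpenAI client; the SDK picks them
# up automatically, so no source change is needed.
os.environ["OPENAI_BASE_URL"] = "https://www.models-hub.net/v1"
os.environ["OPENAI_API_KEY"] = "YOUR-API-KEY"  # replace with your real key
```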

Python Example

from openai import OpenAI

client = OpenAI(
    base_url="https://www.models-hub.net/v1",  # replace with our API base URL
    api_key="YOUR-API-KEY",  # replace with your API key
)

# Standard chat
completion = client.chat.completions.create(
    model="openai/gpt-3.5-turbo", messages=[{"role": "user", "content": "你好"}]
)

print(completion.choices[0].message.content)

# Streaming chat
stream = client.chat.completions.create(
    model="openai/gpt-3.5-turbo",
    messages=[{"role": "user", "content": "你好"}],
    stream=True,  # enable streaming output
)

for chunk in stream:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
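If you need the complete reply in addition to the incremental output, you can accumulate the streamed deltas as they arrive. A minimal sketch; the stub dataclasses below only mimic the shape of the real stream chunks so the helper can be shown without a network call:

```python
from dataclasses import dataclass
from typing import Iterable, List, Optional

# Stubs mirroring the shape of streamed chat-completion chunks.
@dataclass
class _Delta:
    content: Optional[str]

@dataclass
class _Choice:
    delta: _Delta

@dataclass
class _Chunk:
    choices: List[_Choice]

def collect_stream(stream: Iterable) -> str:
    """Join the non-empty delta fragments of a streamed response."""
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta is not None:
            parts.append(delta)
    return "".join(parts)

# The final chunk of a real stream carries no content, hence the None.
fake_stream = [
    _Chunk([_Choice(_Delta("你"))]),
    _Chunk([_Choice(_Delta("好"))]),
    _Chunk([_Choice(_Delta(None))]),
]
print(collect_stream(fake_stream))  # → 你好
```

The same `collect_stream` helper works unchanged on the real `stream` object returned by `client.chat.completions.create(..., stream=True)`, since it only relies on the `choices[0].delta.content` shape shown above.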

TypeScript Example

import OpenAI from 'openai';

const openai = new OpenAI({
    baseURL: 'https://www.models-hub.net/v1',  // replace with our API base URL
    apiKey: 'YOUR-API-KEY'  // replace with your API key
});

// Standard chat
async function chat() {
    const completion = await openai.chat.completions.create({
        model: "openai/gpt-3.5-turbo",
        messages: [
            { role: "user", content: "你好" }
        ]
    });
    console.log(completion.choices[0].message.content);
}

// Streaming chat
async function streamChat() {
    const stream = await openai.chat.completions.create({
        model: "openai/gpt-3.5-turbo",
        messages: [
            { role: "user", content: "你好" }
        ],
        stream: true  // enable streaming output
    });
    for await (const chunk of stream) {
        if (chunk.choices[0]?.delta?.content) {
            process.stdout.write(chunk.choices[0].delta.content);
        }
    }
}

Full Compatibility Notes

Request Parameters

| Parameter  | Type    | Required | Description                                      |
| ---------- | ------- | -------- | ------------------------------------------------ |
| messages   | array   | Yes      | List of conversation messages                    |
| model      | string  | Yes      | Model ID                                         |
| stream     | boolean | No       | Whether to stream the response; defaults to false |
| max_tokens | integer | No       | Maximum number of tokens to return               |
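Putting the parameters above together, a complete request body might look like the dictionary below (the specific values are illustrative, not defaults):

```python
# Illustrative request body combining the documented parameters.
payload = {
    "model": "openai/gpt-3.5-turbo",            # model ID (required)
    "messages": [                                # conversation messages (required)
        {"role": "user", "content": "你好"},
    ],
    "stream": False,                             # optional; defaults to False
    "max_tokens": 256,                           # optional cap on returned tokens
}
```

The same keys are accepted as keyword arguments by `client.chat.completions.create(**payload)` in the Python SDK.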