
OpenAI Chat Completions API

The Gateway exposes an HTTP endpoint compatible with the OpenAI Chat Completions API. Any client library that supports the OpenAI API can connect to the Gateway directly.

Endpoint

```text
POST /v1/chat/completions
```

Full URL:

```text
http://127.0.0.1:18789/v1/chat/completions      (local)
https://gateway.example.com/v1/chat/completions (remote)
```

Authentication

Pass the Gateway Token in the Authorization header:

```bash
curl -X POST http://127.0.0.1:18789/v1/chat/completions \
  -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o","messages":[{"role":"user","content":"Hello"}]}'
```

Compatibility

Use a Gateway Token in place of an OpenAI API Key. The Gateway automatically routes each request to the correct Channel and model.

Request Format

```json
{
  "model": "gpt-4o",
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant."
    },
    {
      "role": "user",
      "content": "What is the capital of France?"
    }
  ],
  "temperature": 0.7,
  "max_tokens": 1024,
  "stream": false,
  "tools": []
}
```

Request Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| model | string | Yes | Model identifier (routed by the Gateway to the matching Channel) |
| messages | array | Yes | Array of conversation messages |
| temperature | number | No | Sampling temperature, 0–2, default 1 |
| max_tokens | number | No | Maximum number of tokens to generate |
| stream | boolean | No | Enable streaming responses, default false |
| tools | array | No | Available tool definitions |
| tool_choice | string/object | No | Tool selection strategy |
| top_p | number | No | Nucleus sampling parameter |
| frequency_penalty | number | No | Frequency penalty |
| presence_penalty | number | No | Presence penalty |
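As a quick illustration of how these parameters fit together, a request body can be assembled like this (a minimal sketch; `build_chat_request` is a hypothetical helper, not part of the Gateway API):

```python
def build_chat_request(model, messages, **options):
    """Assemble a Chat Completions request body.

    `model` and `messages` are required; all other parameters from the
    table above (temperature, max_tokens, stream, tools, ...) are
    optional and passed through unchanged.
    """
    body = {"model": model, "messages": messages}
    body.update(options)
    return body
```

The resulting dict is what gets serialized as the JSON request body shown above.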

Response Format

Non-streaming Response

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1705312200,
  "model": "gpt-4o",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The capital of France is Paris."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 25,
    "completion_tokens": 8,
    "total_tokens": 33
  }
}
```
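For instance, the assistant text and token usage can be pulled out of the parsed response body with a small helper (a sketch for illustration, not part of any client library):

```python
def extract_reply(response):
    """Return (assistant text, total tokens) from a parsed
    chat.completion response body (a JSON-decoded dict)."""
    choice = response["choices"][0]
    return choice["message"]["content"], response["usage"]["total_tokens"]
```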

Streaming Response

Set "stream": true to enable Server-Sent Events (SSE) streaming responses:

```bash
curl -X POST http://127.0.0.1:18789/v1/chat/completions \
  -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o","messages":[{"role":"user","content":"Hello"}],"stream":true}'
```

Response stream:

```text
data: {"id":"chatcmpl-abc","choices":[{"delta":{"role":"assistant"},"index":0}]}

data: {"id":"chatcmpl-abc","choices":[{"delta":{"content":"The"},"index":0}]}

data: {"id":"chatcmpl-abc","choices":[{"delta":{"content":" capital"},"index":0}]}

data: {"id":"chatcmpl-abc","choices":[{"delta":{"content":" is"},"index":0}]}

data: [DONE]
```
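Consuming this stream amounts to reading `data:` lines, JSON-decoding each chunk, and concatenating the `delta.content` fragments until the `[DONE]` sentinel arrives. A minimal parser sketch, assuming the raw SSE lines are already available as strings:

```python
import json

def accumulate_sse(lines):
    """Accumulate assistant text from an SSE chat-completion stream.

    `lines` yields raw SSE lines such as 'data: {...}' and 'data: [DONE]'.
    """
    text = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            text.append(delta["content"])
    return "".join(text)
```

A production client would additionally handle multi-line `data:` events and reconnects, which this sketch omits.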

Tool Calls

Defining Tools

```json
{
  "model": "gpt-4o",
  "messages": [
    {"role": "user", "content": "What's the weather in Beijing?"}
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get current weather for a location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "City name"
            }
          },
          "required": ["location"]
        }
      }
    }
  ]
}
```

Tool Call Response

```json
{
  "choices": [
    {
      "message": {
        "role": "assistant",
        "tool_calls": [
          {
            "id": "call_abc123",
            "type": "function",
            "function": {
              "name": "get_weather",
              "arguments": "{\"location\": \"Beijing\"}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ]
}
```
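When finish_reason is "tool_calls", the client executes the requested functions locally and sends the results back as role "tool" messages. A minimal dispatch sketch (`get_weather` here is a hypothetical local implementation, not something the Gateway provides):

```python
import json

# Hypothetical local implementation of the get_weather tool defined above.
def get_weather(location):
    return f"Sunny in {location}"

TOOLS = {"get_weather": get_weather}

def run_tool_calls(message):
    """Execute each tool call in an assistant message and return the
    follow-up 'tool' messages to append to the conversation."""
    results = []
    for call in message.get("tool_calls", []):
        fn = TOOLS[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])  # arguments arrive as a JSON string
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": fn(**args),
        })
    return results
```

The returned messages are appended to the conversation and the request is re-sent, letting the model produce its final answer from the tool output.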

Multi-model Routing

The Gateway routes each request to the matching Channel based on the model field:

```bash
# Routes to the OpenAI Channel
curl -d '{"model":"gpt-4o","messages":[...]}'

# Routes to the Anthropic Channel
curl -d '{"model":"claude-sonnet-4-20250514","messages":[...]}'

# Routes to a local Ollama instance
curl -d '{"model":"llama3","messages":[...]}'
```

Unified Entry Point

Models from different providers are accessed through the same endpoint. The Gateway handles provider-specific API format differences automatically.

Client Library Integration

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:18789/v1",
    api_key="your-gateway-token"
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}]
)
print(response.choices[0].message.content)
```

```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'http://127.0.0.1:18789/v1',
  apiKey: 'your-gateway-token',
});

const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello' }],
});
console.log(response.choices[0].message.content);
```


Open source under the MIT license | Content translated from the official documentation and kept in sync with it.