
OpenClaw Doubao Model Access & Telegram Bot Configuration Notes

1. Environment and Prerequisites

  • OpenClaw runtime: Docker (gateway container)
  • LLM platform: Volcengine Ark (Doubao)
  • Access method: OpenAI Chat Completions compatible interface
  • Communication channel: Telegram Bot (private chat)

2. Background and Root Causes

1️⃣ Initial symptom

  • The Telegram Bot receives messages but replies with an error: 404 The model or endpoint ... does not exist or you do not have access to it
  • Calling with the model ID doubao-lite-4k-240328 fails outright

2️⃣ Root causes (two key points)

(1) A Doubao Model ID is not a callable inference entry point

  • /models returns Model IDs such as doubao-lite-4k-240328
  • Actual inference must use an Endpoint ID (ep-xxxx)
  • If no Endpoint has been created/used, the call always returns 404

(2) The OpenClaw API type was misconfigured

  • The early config used: "api": "openai-completions"
  • But Doubao is a chat model, so requests must go to: POST /chat/completions
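A minimal sketch of the mismatch. The chat path matches the working curl call in the next section; the legacy /completions path is an assumption about what the misconfigured setting would hit.

```shell
BASE_URL="https://ark.cn-beijing.volces.com/api/v3"

# Map the OpenClaw api type to the request path it would produce.
path_for_api_type() {
  case "$1" in
    openai-chat-completions) echo "$BASE_URL/chat/completions" ;;  # chat models (Doubao)
    openai-completions)      echo "$BASE_URL/completions" ;;       # legacy text completions
  esac
}

path_for_api_type openai-chat-completions   # the path Doubao requires
```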

3. Correct Verification Path (Key)

1️⃣ Verify directly with curl using the Endpoint ID (succeeds)

curl -sS https://ark.cn-beijing.volces.com/api/v3/chat/completions \
  -H "Authorization: Bearer <ARK_API_KEY>" \
  -H "Content-Type: application/json" \
  -d '{
    "model":"ep-20251031171004-t2qcf",
    "messages":[{"role":"user","content":"ping"}]
  }'

Normal content returned ⇒ endpoint + region + key are correct
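The curl output can be piped through jq to print only the reply text. A canned sample response stands in for the live API call here.

```shell
# Hand-written stand-in for a real Ark chat-completions response.
RESPONSE='{"choices":[{"message":{"role":"assistant","content":"pong"}}]}'

# Pull out just the assistant reply.
REPLY=$(printf '%s' "$RESPONSE" | jq -r '.choices[0].message.content')
echo "$REPLY"
```

With a live call, append `| jq -r '.choices[0].message.content'` to the curl command above.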

4. OpenClaw Configuration Changes (Core)

File Path (Host):

~/.openclaw/openclaw.json

There are 3 places that need to change, and they must be consistent:

① models.providers.doubao.models[].id

Change the Model ID → the Endpoint ID:

"models": [
  {
    "id": "ep-20251031171004-t2qcf",
    "name": "Doubao",
    "reasoning": false,
    "input": ["text"],
    "contextWindow": 128000,
    "maxTokens": 8192
  }
]

And confirm:

"api": "openai-chat-completions"

② agents.defaults.model.primary

"primary": "doubao/ep-20251031171004-t2qcf"

③ The key under agents.defaults.models

"models": {
  "doubao/ep-20251031171004-t2qcf": {
    "alias": "Doubao"
  }
}

✅ Overall logic after the changes

  • OpenClaw internally uses the unified reference: doubao/ep-20251031171004-t2qcf
  • Actual request path: /chat/completions
  • Ark backend: endpoint → underlying model (e.g. doubao-seed-1-6-lite-251015)
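The unified reference is a "provider/endpoint" pair; how OpenClaw splits it internally is an assumption, but the shape can be sketched as:

```shell
MODEL_REF="doubao/ep-20251031171004-t2qcf"

PROVIDER="${MODEL_REF%%/*}"   # everything before the first "/" -> the provider key
ENDPOINT="${MODEL_REF#*/}"    # everything after it -> the Ark Endpoint ID

echo "provider=$PROVIDER endpoint=$ENDPOINT"
```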

5. Telegram Bot Configuration (Confirmed Working)

"channels": {
  "telegram": {
    "enabled": true,
    "botToken": "<YOUR_BOT_TOKEN>",
    "dmPolicy": "pairing",
    "allowFrom": [8434826952],
    "groupPolicy": "disabled",
    "streamMode": "partial"
  }
}

Key field descriptions

  • enabled: true: must be explicitly enabled
  • dmPolicy: pairing: private chats require pairing (secure)
  • allowFrom: whitelist of allowed user IDs
  • groupPolicy: disabled: disable groups to avoid accidental triggering
  • streamMode: partial: streaming output (recommended)
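The whitelist semantics above can be sketched as follows; the exact enforcement inside OpenClaw is an assumption, only the allowFrom semantics come from the config.

```shell
# IDs from the allowFrom whitelist, space-separated.
ALLOW_FROM="8434826952"

# Return 0 (success) only if the given user ID is whitelisted.
is_allowed() {
  for uid in $ALLOW_FROM; do
    [ "$1" = "$uid" ] && return 0
  done
  return 1
}

is_allowed 8434826952 && echo "allowed"
is_allowed 12345 || echo "denied"
```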

6. Restarting the Container and Applying the Changes

1️⃣ Validate the JSON

jq . ~/.openclaw/openclaw.json
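Beyond syntax, the three places from section 4 can be cross-checked with jq. The inline CFG is a trimmed stand-in for ~/.openclaw/openclaw.json; on a real host, replace `printf '%s' "$CFG"` with `cat ~/.openclaw/openclaw.json`.

```shell
# Trimmed sample config mirroring the three places that must agree.
CFG='{"models":{"providers":{"doubao":{"models":[{"id":"ep-20251031171004-t2qcf"}]}}},"agents":{"defaults":{"model":{"primary":"doubao/ep-20251031171004-t2qcf"},"models":{"doubao/ep-20251031171004-t2qcf":{"alias":"Doubao"}}}}}'

ID=$(printf '%s' "$CFG" | jq -r '.models.providers.doubao.models[0].id')
PRIMARY=$(printf '%s' "$CFG" | jq -r '.agents.defaults.model.primary')
ALIASED=$(printf '%s' "$CFG" | jq -r ".agents.defaults.models[\"$PRIMARY\"].alias")

# All three must line up, otherwise you get the implicit 404 from section 7.
if [ "$PRIMARY" = "doubao/$ID" ] && [ "$ALIASED" != "null" ]; then
  echo "consistent"
else
  echo "MISMATCH"
fi
```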

2️⃣ Restart the gateway

docker restart openclaw-openclaw-gateway-1

3️⃣ Test the bot in a Telegram private chat

  • If the bot replies normally ⇒ the whole chain works

7. Key Lessons (Pitfalls to Avoid)

  1. Doubao must be called via an endpoint (ep-xxxx); the Model ID cannot be used directly
  2. OpenClaw must use openai-chat-completions
  3. Appearing in /models ≠ callable for inference
  4. The region must match (the endpoint is in Beijing, so use ark.cn-beijing)
  5. agents.defaults.model.primary and models.providers must be consistent, otherwise you get an implicit 404

8. Current status

  • ✅ The large model reasoning is normal
  • ✅ Telegram Bot conversations are working fine
  • ✅ OpenClaw configuration is stable and reusable