

OpenClaw is an open-source, self-hosted personal AI assistant platform that connects messaging apps to AI agents running on your own hardware. It gives you an autonomous AI assistant without giving up control of your data.

Core features

  • Supports Telegram, Discord, WhatsApp, iMessage, Feishu, and more via plugins. All channels are managed through a single Gateway process. Includes voice support for macOS, iOS, and Android, and can render interactive Canvas interfaces.
  • Runs entirely on your own machine or server under an MIT open-source license. Context and skills are stored locally, not in the cloud.
  • Supports persistent background operation with long-term memory, cron-based scheduled tasks, session isolation by agent/workspace/sender, multi-agent routing, and native tool calling with code execution.

Prerequisites

Before integrating with Newapi, ensure you have:
  • Node.js 22 or higher
  • A Newapi address (usually ending with /v1)
  • A Newapi API key
Get the OpenClaw Gateway and Control UI running before configuring Newapi. This makes it easier to tell OpenClaw startup issues apart from model provider configuration errors during troubleshooting.
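You can also confirm the Newapi address and key work on their own before touching OpenClaw. A minimal sketch using only the Python standard library; the helper names are illustrative, and it assumes the standard OpenAI-compatible GET /v1/models endpoint:

```python
import json
import urllib.request

def models_url(base_url: str) -> str:
    """Build the model-listing URL from a Newapi base address (should end in /v1)."""
    return base_url.rstrip("/") + "/models"

def list_models(base_url: str, api_key: str) -> list:
    """Ask the Newapi gateway which model IDs it exposes."""
    req = urllib.request.Request(
        models_url(base_url),
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        payload = json.load(resp)
    # OpenAI-compatible gateways return {"data": [{"id": ...}, ...]}
    return [m["id"] for m in payload.get("data", [])]

# Example (requires a reachable Newapi instance):
# print(list_models("https://your-newapi-domain/v1", "sk-your-newapi-key"))
```

If this call fails, fix the address or key first; no OpenClaw configuration will work around a broken endpoint.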

Install and start OpenClaw

Step 1: Install OpenClaw (macOS/Linux)

curl -fsSL https://openclaw.ai/install.sh | bash
For other installation methods, refer to Getting Started in the OpenClaw documentation.
Step 2: Run the onboarding wizard

openclaw onboard --install-daemon
This wizard handles basic authentication, Gateway setup, and optional channel initialization. The goal at this stage is to get OpenClaw running. You will switch to Newapi as the model provider in the next section.
Step 3: Verify Gateway and Control UI

Check the Gateway status:
openclaw gateway status
Open the Control UI in your browser:
openclaw dashboard
If the Control UI loads, OpenClaw is running correctly. You do not need to configure messaging channels (Telegram, Discord, Feishu, etc.) yet.
Step 4: Locate the configuration file

OpenClaw’s configuration file is located at ~/.openclaw/openclaw.json. You will modify this file to add Newapi as a model provider.
If you run OpenClaw under a dedicated service account or want to customize the configuration directory, use the environment variables OPENCLAW_HOME, OPENCLAW_STATE_DIR, or OPENCLAW_CONFIG_PATH. See Environment Variables in the OpenClaw documentation for details.

Configure Newapi as a model provider

OpenClaw supports OpenAI-compatible model gateways via models.providers. Add Newapi as a custom provider and point your default agent model at it.

Store your API key in an environment variable

export NEWAPI_API_KEY="sk-your-newapi-key"
For a background service, ensure the service process can also read this variable (e.g., by adding it to a .env file that OpenClaw reads, or to the service’s environment).
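To see what the ${NEWAPI_API_KEY} placeholder used below resolves to, here is a rough sketch of environment-variable substitution in a config value. This is illustrative only; OpenClaw's actual resolution logic may differ:

```python
import os
import re

def expand_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the process environment."""
    return re.sub(
        r"\$\{(\w+)\}",
        lambda m: os.environ.get(m.group(1), ""),
        value,
    )

os.environ["NEWAPI_API_KEY"] = "sk-your-newapi-key"
# The Gateway process must see this variable too, not just your shell.
print(expand_env("${NEWAPI_API_KEY}"))
```

The key point: substitution happens in the Gateway's process environment, so an export in your interactive shell alone is not enough for a daemonized service.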

Add Newapi to openclaw.json

Add or merge the following into your ~/.openclaw/openclaw.json:
{
  "models": {
    "mode": "merge",
    "providers": {
      "newapi": {
        "baseUrl": "https://<your-newapi-domain>/v1",
        "apiKey": "${NEWAPI_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "gemini-2.5-flash", "name": "Gemini 2.5 Flash" },
          { "id": "kimi-k2.5", "name": "Kimi K2.5" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "newapi/gemini-2.5-flash",
        "fallbacks": ["newapi/kimi-k2.5"]
      },
      "models": {
        "newapi/gemini-2.5-flash": { "alias": "flash" },
        "newapi/kimi-k2.5": { "alias": "kimi" }
      }
    }
  }
}
Replace the model IDs with the actual model names exposed by your Newapi instance.
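If your openclaw.json already contains other settings, a small script can fold the provider block in without clobbering existing keys. A simplified sketch (the deep-merge is naive, and the stand-in config is hypothetical; back up your real file first):

```python
import json

def deep_merge(base: dict, extra: dict) -> dict:
    """Recursively merge extra into base, preferring values from extra."""
    for key, value in extra.items():
        if isinstance(value, dict) and isinstance(base.get(key), dict):
            deep_merge(base[key], value)
        else:
            base[key] = value
    return base

# The provider block from this guide, trimmed to one model for brevity.
newapi_snippet = {
    "models": {
        "mode": "merge",
        "providers": {
            "newapi": {
                "baseUrl": "https://<your-newapi-domain>/v1",
                "apiKey": "${NEWAPI_API_KEY}",
                "api": "openai-completions",
                "models": [{"id": "gemini-2.5-flash", "name": "Gemini 2.5 Flash"}],
            }
        },
    }
}

# Stand-in for your current openclaw.json; print first, then write the result
# to ~/.openclaw/openclaw.json once it looks right.
existing = {"agents": {"defaults": {}}}
print(json.dumps(deep_merge(existing, newapi_snippet), indent=2))
```

Printing before writing lets you diff the result against your current file instead of discovering a clobbered key after the Gateway restarts.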

Key configuration fields

  • models.mode: Set to merge to add newapi while keeping OpenClaw’s built-in providers.
  • models.providers.newapi.baseUrl: Your Newapi address. Must include /v1.
  • models.providers.newapi.apiKey: Your Newapi key. Use ${NEWAPI_API_KEY} to inject it from the environment.
  • models.providers.newapi.api: Use openai-completions for OpenAI-compatible gateways like Newapi.
  • models.providers.newapi.models: Model IDs must match the actual model names in your Newapi instance.
  • agents.defaults.model.primary: Default primary model, in provider/model-id format.
  • agents.defaults.model.fallbacks: Fallback models used automatically if the primary model fails.
  • agents.defaults.models: Optional aliases for models, convenient for referencing them in the UI or conversations.

Verify the integration

Reopen the Control UI:
openclaw dashboard
Confirm that the list of available models includes entries with the newapi/ prefix:
openclaw models list
If you can start conversations in OpenClaw and the default model shows as newapi/..., the integration is working correctly.
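If conversations fail, it helps to send one completion directly to Newapi, bypassing OpenClaw entirely, to see which side is at fault. A sketch against the standard OpenAI-compatible /chat/completions endpoint (helper names are illustrative):

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str,
                       model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the Newapi gateway."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

def chat_once(base_url: str, api_key: str, model: str, prompt: str) -> str:
    """Send one completion directly to Newapi, skipping OpenClaw."""
    req = build_chat_request(base_url, api_key, model, prompt)
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Example (requires a reachable Newapi instance):
# print(chat_once("https://your-newapi-domain/v1", "sk-...", "gemini-2.5-flash", "ping"))
```

If this direct call succeeds but OpenClaw conversations fail, the problem is in the OpenClaw configuration; if it fails too, fix the Newapi side first.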

Common issues

  • baseUrl missing /v1 — This is the most common integration error. Always include /v1 at the end of your Newapi address.
  • Incorrect model ID — The primary and fallbacks values must exactly match the id fields in models.providers.newapi.models.
  • API key not available to the Gateway service — If the Gateway runs as a background daemon, ensure it can read NEWAPI_API_KEY from its environment, not just your current terminal session.
  • Foreground debugging — To observe logs and errors directly, run the Gateway in the foreground: openclaw gateway --port 18789.
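The first two issues can be caught before restarting the Gateway with a quick lint of the parsed config. A minimal sketch, checking only the fields this guide touches:

```python
def lint_config(config: dict) -> list:
    """Flag the two most common Newapi mistakes: a baseUrl missing /v1,
    and primary/fallback model IDs that match no configured model."""
    problems = []
    providers = config.get("models", {}).get("providers", {})
    known = set()
    for name, provider in providers.items():
        if not provider.get("baseUrl", "").rstrip("/").endswith("/v1"):
            problems.append(f"provider '{name}': baseUrl should end with /v1")
        for model in provider.get("models", []):
            known.add(f"{name}/{model['id']}")
    model_cfg = config.get("agents", {}).get("defaults", {}).get("model", {})
    for ref in [model_cfg.get("primary"), *model_cfg.get("fallbacks", [])]:
        if ref is not None and ref not in known:
            problems.append(f"unknown model reference: {ref}")
    return problems

# Usage: lint_config(json.loads(Path("~/.openclaw/openclaw.json").expanduser().read_text()))
```

An empty list means the two checks pass; anything else tells you which field to fix before restarting the Gateway.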