OpenClaw is an open-source, self-hosted personal AI assistant platform that connects messaging apps to AI agents running on your own hardware. It gives you an autonomous AI assistant without giving up control of your data.
- Official homepage: https://openclaw.ai
- Documentation: https://docs.openclaw.ai
- GitHub: https://github.com/openclaw/openclaw
## Core features
### Multi-channel integration

Supports Telegram, Discord, WhatsApp, iMessage, Feishu, and more via plugins. All channels are managed through a single Gateway process. Includes voice support for macOS, iOS, and Android, and can render interactive Canvas interfaces.
### Self-hosting and data security

Runs entirely on your own machine or server under an MIT open-source license. Context and skills are stored locally — not in the cloud.
### Smart agent capabilities

Supports persistent background operation with long-term memory, cron-based scheduled tasks, session isolation by agent/workspace/sender, multi-agent routing, and native tool calling with code execution.
## Prerequisites

Before integrating with Newapi, ensure you have:

- Node.js 22 or higher
- A Newapi address (usually ending with `/v1`)
- A Newapi API key
## Install and start OpenClaw
### Install OpenClaw (macOS/Linux)
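Since OpenClaw requires Node.js 22+, a typical install goes through npm. The exact package name and command may vary between releases, so treat this as a sketch and check the official installation docs if it fails:

```shell
# Install the OpenClaw CLI globally (assumes Node.js 22+ is on PATH)
npm install -g openclaw@latest

# Confirm the CLI is available
openclaw --version
```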
### Run the onboarding wizard
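The wizard walks you through initial setup interactively. The subcommand name below is an assumption; consult `openclaw --help` for the exact name in your version:

```shell
# Launch the interactive onboarding wizard (sets up the Gateway and a workspace)
openclaw onboard
```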
### Verify Gateway and Control UI
Check the Gateway status, then open the Control UI in your browser. If the Control UI loads, OpenClaw is running correctly. You do not need to configure messaging channels (Telegram, Discord, Feishu, etc.) yet.
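A minimal check, assuming the Gateway listens on port 18789 (the same port used by the foreground command in the troubleshooting section below); the `status` subcommand name is an assumption:

```shell
# Ask the CLI whether the Gateway daemon is running
openclaw gateway status

# Open the Control UI; use `open` on macOS, `xdg-open` on Linux
open http://127.0.0.1:18789/
```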
## Locate the configuration file
OpenClaw’s configuration file is located at `~/.openclaw/openclaw.json`. You will modify this file to add Newapi as a model provider.

If you run OpenClaw under a dedicated service account or want to customize the configuration directory, use the environment variables `OPENCLAW_HOME`, `OPENCLAW_STATE_DIR`, or `OPENCLAW_CONFIG_PATH`. See Environment Variables in the OpenClaw documentation for details.

## Configure Newapi as a model provider
OpenClaw supports OpenAI-compatible model gateways via `models.providers`. Add Newapi as a custom provider and point your default agent model at it.
### Store your API key in an environment variable

Avoid hardcoding the key in the config file. Instead, set `NEWAPI_API_KEY` in the environment the Gateway runs in (for example, in your shell profile, in a `.env` file that OpenClaw reads, or in the service’s environment).
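For example, in a shell profile (the key value is a placeholder):

```shell
# Make the Newapi key visible to the Gateway process
export NEWAPI_API_KEY="your-newapi-key"
```

If the Gateway runs as a background daemon, set the variable in the service’s environment instead, so it is visible outside your interactive shell.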
### Add Newapi to `openclaw.json`
Add or merge the following into your `~/.openclaw/openclaw.json`:
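A sketch of the relevant fragment, built from the fields described in the table below. The `baseUrl` and the model IDs (`gpt-4o`, `gpt-4o-mini`) are placeholders; replace them with the address and model names from your own Newapi instance:

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "newapi": {
        "baseUrl": "https://your-newapi-host/v1",
        "apiKey": "${NEWAPI_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "gpt-4o" },
          { "id": "gpt-4o-mini" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "newapi/gpt-4o",
        "fallbacks": ["newapi/gpt-4o-mini"]
      }
    }
  }
}
```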
### Key configuration fields

| Field | Description |
|---|---|
| `models.mode` | Set to `merge` to add `newapi` while keeping OpenClaw’s built-in providers. |
| `models.providers.newapi.baseUrl` | Your Newapi address. Must include `/v1`. |
| `models.providers.newapi.apiKey` | Your Newapi key. Use `${NEWAPI_API_KEY}` to inject it from the environment. |
| `models.providers.newapi.api` | Use `openai-completions` for OpenAI-compatible gateways like Newapi. |
| `models.providers.newapi.models` | Model IDs must match the actual model names in your Newapi instance. |
| `agents.defaults.model.primary` | Default primary model. Format: `provider/model-id`. |
| `agents.defaults.model.fallbacks` | Fallback models used automatically if the primary model fails. |
| `agents.defaults.models` | Optional model aliases, convenient for referencing in the UI or conversations. |
## Verify the integration

Reopen the Control UI and check the model list for entries with the `newapi/` prefix. If the agent responds using a model ID that starts with `newapi/...`, the integration is working correctly.
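Independently of the UI, you can confirm the address and key are valid by querying the model-list endpoint that OpenAI-compatible gateways like Newapi expose (the host is a placeholder; assumes `NEWAPI_API_KEY` is set):

```shell
# Should return a JSON list of available models.
# A 401 suggests a bad key; a 404 usually means /v1 is missing from the address.
curl -s https://your-newapi-host/v1/models \
  -H "Authorization: Bearer $NEWAPI_API_KEY"
```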
## Common issues

- `baseUrl` missing `/v1` — This is the most common integration error. Always include `/v1` at the end of your Newapi address.
- Incorrect model ID — The `primary` and `fallbacks` values must exactly match the `id` fields in `models.providers.newapi.models`.
- API key not available to the Gateway service — If the Gateway runs as a background daemon, ensure it can read `NEWAPI_API_KEY` from its environment, not just your current terminal session.
- Foreground debugging — To observe logs and errors directly, run the Gateway in the foreground: `openclaw gateway --port 18789`.