Introduction
In many network environments, direct access to AI service APIs such as OpenAI's and Anthropic's (Claude) may be restricted. With a properly configured proxy, OpenClaw can work reliably even on such restricted networks. This article covers HTTP proxies, SOCKS5 proxies, and per-model proxy configuration in detail.
Configuring Proxies via Environment Variables
The simplest approach is to set a global proxy through environment variables. OpenClaw automatically reads the following standard environment variables:
| Variable | Description | Example |
|---|---|---|
| HTTP_PROXY | Proxy for HTTP requests | http://127.0.0.1:7890 |
| HTTPS_PROXY | Proxy for HTTPS requests | http://127.0.0.1:7890 |
| ALL_PROXY | Proxy for all requests (lowest priority) | socks5://127.0.0.1:1080 |
| NO_PROXY | Addresses that should bypass the proxy | localhost,127.0.0.1,.local |
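To see which of these variables are already set in your current shell, a quick check with standard tools (nothing OpenClaw-specific) is:

```shell
# List any proxy-related variables exported in this shell;
# grep -i also catches the lowercase forms (http_proxy etc.) that many tools read
env | grep -i _proxy
```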
Linux / macOS Setup
Export the environment variables before starting OpenClaw in the terminal:
export HTTP_PROXY="http://127.0.0.1:7890"
export HTTPS_PROXY="http://127.0.0.1:7890"
export NO_PROXY="localhost,127.0.0.1"
openclaw up
To make these settings persistent, add the variables to ~/.bashrc or ~/.zshrc:
echo 'export HTTP_PROXY="http://127.0.0.1:7890"' >> ~/.bashrc
echo 'export HTTPS_PROXY="http://127.0.0.1:7890"' >> ~/.bashrc
source ~/.bashrc
Windows Setup
In PowerShell:
$env:HTTP_PROXY = "http://127.0.0.1:7890"
$env:HTTPS_PROXY = "http://127.0.0.1:7890"
openclaw up
For persistent configuration, add these to your user or system environment variables, for example with setx HTTP_PROXY "http://127.0.0.1:7890" (setx affects new terminal sessions only).
Configuring Proxies via Configuration File
In addition to environment variables, you can set the proxy directly in the OpenClaw configuration file. Edit ~/.config/openclaw/openclaw.json5:
{
// Global proxy settings
proxy: {
// HTTP/HTTPS proxy address
url: "http://127.0.0.1:7890",
// Hostnames that should bypass the proxy
bypass: ["localhost", "127.0.0.1", "192.168.*"]
}
}
SOCKS5 Proxy Configuration
If you are using a SOCKS5 proxy (e.g., an SSH tunnel or Shadowsocks), configure it as follows:
{
proxy: {
url: "socks5://127.0.0.1:1080",
// If the proxy requires authentication
username: "your-username",
password: "your-password"
}
}
You can also use SOCKS5 via environment variables:
export ALL_PROXY="socks5://127.0.0.1:1080"
openclaw up
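If you do not already have a SOCKS5 endpoint, OpenSSH can provide one: ssh -D opens a local SOCKS5 listener and tunnels traffic through the remote machine. A minimal sketch (the user and host names are placeholders):

```shell
# Open a SOCKS5 listener on 127.0.0.1:1080, tunneled through the remote host;
# -N skips running a remote command, -f backgrounds the connection after auth
ssh -D 1080 -N -f user@remote-host.example.com

# Point OpenClaw at the tunnel
export ALL_PROXY="socks5://127.0.0.1:1080"
openclaw up
```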
Per-Model Proxy Configuration
Different AI model APIs may require different proxy strategies. For example, you might want OpenAI traffic to go through an overseas proxy while keeping Ollama local traffic proxy-free. This can be configured as follows:
{
// Global default proxy
proxy: {
url: "http://127.0.0.1:7890"
},
models: {
openai: {
apiKey: "sk-xxxx",
// OpenAI uses the global proxy (inherits from above)
},
claude: {
apiKey: "sk-ant-xxxx",
// Claude uses a separate proxy
proxy: {
url: "http://127.0.0.1:8080"
}
},
ollama: {
baseUrl: "http://localhost:11434",
// Ollama is a local service; no proxy needed
proxy: {
url: null // Explicitly disable proxy
}
}
}
}
Proxy Priority
When multiple proxy configurations coexist, OpenClaw applies them in the following priority order:
- Model-level proxy — Highest priority; applies only to the specified model
- Config file global proxy — Second priority
- Environment variable proxy — Lowest priority
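As a concrete illustration of this ordering: if HTTPS_PROXY=http://127.0.0.1:9999 is exported in the environment but the config file contains the fragment below, requests go through 127.0.0.1:7890, because the config-file proxy outranks the environment variable (port numbers here are examples):

{
  proxy: {
    // Wins over HTTPS_PROXY from the environment
    url: "http://127.0.0.1:7890"
  }
}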
Proxy Authentication
If your proxy server requires username and password authentication, there are two ways to configure it:
Option 1: Credentials in the URL
{
proxy: {
url: "http://username:[email protected]:8080"
}
}
Option 2: Separate fields
{
proxy: {
url: "http://proxy.example.com:8080",
username: "your-username",
password: "your-password"
}
}
Option 2 is recommended to avoid URL parsing issues caused by special characters in passwords.
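If you do need Option 1, percent-encode any reserved characters in the credentials first. One quick way, using Python's standard library (the password shown is just an example):

```shell
# Percent-encode a password so it can be embedded safely in a proxy URL
python3 -c 'import urllib.parse; print(urllib.parse.quote("p@ss/word!", safe=""))'
# prints: p%40ss%2Fword%21
```

The encoded value can then be placed in the URL, e.g. http://username:p%40ss%2Fword%[email protected]:8080.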
Testing Proxy Connectivity
After configuring the proxy, use the openclaw doctor command to verify network connectivity:
openclaw doctor
This command checks the following in sequence:
- Whether the proxy server is reachable
- Whether each AI model's API endpoint is accessible through the proxy
- Whether DNS resolution is working properly
- Whether TLS certificates are valid
Sample output:
[✓] Proxy server http://127.0.0.1:7890 is reachable
[✓] OpenAI API (api.openai.com) connected successfully (latency: 320ms)
[✓] Claude API (api.anthropic.com) connected successfully (latency: 280ms)
[✓] Ollama (localhost:11434) connected successfully (latency: 2ms)
[✓] All checks passed
Troubleshooting Common Connection Issues
Issue 1: Proxy Timeout
If you encounter ETIMEDOUT or ECONNREFUSED errors:
# First, verify the proxy service is running
curl -x http://127.0.0.1:7890 https://httpbin.org/ip
# Check whether the proxy port is listening (use ss -tlnp if netstat is unavailable)
netstat -tlnp | grep 7890
Issue 2: SSL Certificate Errors
Some proxies replace TLS certificates, causing certificate validation failures:
{
proxy: {
url: "http://127.0.0.1:7890",
// Use only in development; not recommended for production
rejectUnauthorized: false
}
}
A more secure approach is to import the proxy's CA certificate:
export NODE_EXTRA_CA_CERTS="/path/to/proxy-ca.crt"
openclaw up
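Before trusting the file, it is worth confirming that it actually parses as a certificate and has not expired. openssl can check this (the path is a placeholder):

```shell
# Path where your proxy tool exports its root CA (placeholder)
CA_FILE="/path/to/proxy-ca.crt"

# Print the certificate's subject and expiry date; a parse error here means
# the file is not a valid PEM certificate and should not be trusted
if [ -f "$CA_FILE" ]; then
  openssl x509 -in "$CA_FILE" -noout -subject -enddate
else
  echo "CA file not found: $CA_FILE"
fi
```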
Issue 3: Corporate Networks Requiring a PAC File
OpenClaw does not directly support PAC files, but you can manually parse the PAC file rules and configure the corresponding proxy addresses. Alternatively, use a local proxy tool (such as Privoxy) to convert the PAC into a standard HTTP proxy.
Issue 4: Proxy Configuration in WSL
In WSL, you need to point the proxy address to the Windows host IP:
# Get the Windows host IP
WIN_HOST=$(awk '/nameserver/ {print $2}' /etc/resolv.conf)
export HTTP_PROXY="http://${WIN_HOST}:7890"
export HTTPS_PROXY="http://${WIN_HOST}:7890"
openclaw up
Proxy Configuration in Docker
If you run OpenClaw in Docker, you can pass proxy variables through docker-compose.yml:
services:
openclaw:
image: openclaw/openclaw:latest
environment:
- HTTP_PROXY=http://host.docker.internal:7890
- HTTPS_PROXY=http://host.docker.internal:7890
- NO_PROXY=localhost,127.0.0.1,ollama
ports:
- "18789:18789"
Note that within Docker, you should use host.docker.internal to access the host machine's proxy service.
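On a Linux Docker engine, host.docker.internal is not defined by default. Docker 20.10 and later let you map it to the host's gateway yourself; a sketch of the extra lines for the same docker-compose.yml:

services:
  openclaw:
    # Make host.docker.internal resolve to the host machine on Linux (Docker 20.10+)
    extra_hosts:
      - "host.docker.internal:host-gateway"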
Summary
Proxy configuration is a critical step for using OpenClaw in restricted network environments. Recommended practices:
- Personal use: Setting environment variables is usually sufficient
- Multi-model setups: Configure per-model proxies in the configuration file
- Enterprise deployments: Combine Docker environment variables with config files for centralized management
- Troubleshooting: Use openclaw doctor to quickly identify connectivity issues
After completing the configuration, run openclaw restart so the service picks up the changes.