ai-agent-proxy
Mailbox-backed AI agent proxy server
ai-agent-proxy is a small FastAPI service that accepts OpenAI-style requests,
optionally relays them to a backend LLM, and writes inbox JSON for a local
agent runtime to handle.
This package exposes one CLI:
- ai-agent-proxy: runs the HTTP server
It is designed for a local workspace flow:
- the API receives a request
- the request JSON is written into the agent inbox
- your local agent runtime handles it
Features
- enqueue-only quick-release HTTP flow
- optional backend LLM relay with backend reply capture
- local inbox JSON output for agent runtimes
- context_id extraction and compact follow-up inbox payloads
- better OpenClaw responsiveness while long agent work is still running
- batch-friendly message flow for ongoing conversations
- pure local runtime design, so you can use the agent CLI you prefer
Yes / No
Yes:
- this package is for the OpenClaw OpenAI-compatible backend API flow
- it accepts OpenAI-style chat requests and turns them into local inbox work for the agent, which helps OpenClaw handle incoming messages more responsively
- when new messages arrive while the agent is already working, the agent can learn about the later messages before replying to the original one, if time permits
- it can forward requests to a backend LLM and still write the enriched request into the inbox
- the agent is expected to use tools and skills to send the real reply outward
No:
- this is not a general OpenAI endpoint replacement
- this endpoint does not return the final assistant result in the HTTP response
- it is not intended for clients that expect normal synchronous OpenAI chat completion behavior
Install
pip install ai-agent-proxy
After install, the main command is:
ai-agent-proxy
Config
Proxy default config file: ./.ai-agent-proxy.conf
HOST=0.0.0.0
PORT=7011
KEY=aibot_<your-key-something>
WORKER=2
INBOX=~/.openclaw/workspace/inbox
URL=https://backend-llm.example.com/v1
API_KEY=sk_<your-backend-api-key>
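The config file uses simple KEY=VALUE lines. As an illustration only (the package's actual config loader is not shown here), a minimal parser for this format could look like:

```python
from pathlib import Path


def load_proxy_config(path: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, as used by .ai-agent-proxy.conf.

    Blank lines, comment lines starting with "#", and lines without "="
    are skipped; values are kept as raw strings.
    """
    config: dict[str, str] = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config
```

Note that values such as PORT stay strings after parsing; convert them where needed.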
API
This is an enqueue-only, quick-release API design intended to improve OpenClaw message handling throughput and give the agent room for batch processing.
- POST /v1/chat/completions
- GET /v1/models
- GET /v1/models/{model_id}
For request endpoints, the server writes the request body into the local inbox and returns an immediate reply to release the HTTP connection.
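A request to the quick-release endpoint can be sketched as follows. The host, port, model name, and Bearer-style Authorization header are assumptions based on the config above, not confirmed API details:

```python
import json
import urllib.request

# Hypothetical values; match these to HOST/PORT/KEY in your config.
PROXY_URL = "http://127.0.0.1:7011/v1/chat/completions"
API_KEY = "aibot_example"  # placeholder, not a real key


def build_chat_request(content: str, model: str = "gpt-4") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at the proxy.

    The model name is a placeholder; the proxy enqueues the request either way.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }
    return urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

# To actually send it (requires a running proxy):
#   with urllib.request.urlopen(build_chat_request("hello")) as resp:
#       print(resp.status)
```

Because the flow is enqueue-only, the HTTP response here releases quickly and does not carry the final assistant result.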
How It Works
Important local workspace paths:
- inbox:
<workspace>/inbox
Inbox files are raw JSON only. There is no extra mailbox wrapper.
On chat requests:
- if no backend LLM is configured, the proxy writes the request JSON into the inbox and returns an empty success response
- if a backend LLM is configured, the proxy forwards the request, returns the backend response to the client, and also writes the request into the inbox
- when a backend reply is available, the inbox JSON can include backend-llm-reply
- when a context_id is found, repeated messages may be compacted for lighter inbox traffic
A separate local agent runtime can watch the inbox and handle the work.
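The agent-runtime side can be sketched as a small poller. The inbox path mirrors the INBOX config value; the ".json" filename suffix is an assumption, since the docs only say the files are raw JSON:

```python
import json
from pathlib import Path

# Assumed to match the INBOX setting in .ai-agent-proxy.conf.
INBOX = Path("~/.openclaw/workspace/inbox").expanduser()


def drain_inbox(inbox: Path) -> list[dict]:
    """Read and consume every raw-JSON request file currently in the inbox."""
    handled = []
    for path in sorted(inbox.glob("*.json")):
        try:
            request = json.loads(path.read_text())
        except json.JSONDecodeError:
            continue  # leave malformed files in place for inspection
        handled.append(request)
        path.unlink()  # remove the file once it has been read
    return handled

# A naive polling loop (a real runtime might prefer inotify/watchdog):
#   while True:
#       for request in drain_inbox(INBOX):
#           ...  # hand off to your agent; check the optional
#                # "backend-llm-reply" and "context_id" fields
#       time.sleep(1.0)
```

Deleting each file after a successful read keeps the inbox as a simple one-shot queue; a more careful runtime might move files to a processed directory instead.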
Skills
Project skills live in ai_agent_proxy/skills/.
They describe:
- the proxy workflow
- outbound message delivery
- context handling
Logs
Useful server log lines:
- chat_completion request inbox file_size=... inbox_path=...
- backend_forward_failed ...
- invalid_json ...
Any agent-side logging depends on the runtime you choose.