Metadata-Version: 2.4
Name: ai-agent-proxy
Version: 0.1.0
Summary: Turn your agent CLI into an OpenAI-like service with chat and responses endpoints
Author-email: Leo <leoustc@icloud.com>
Requires-Python: >=3.8
Description-Content-Type: text/markdown
Requires-Dist: fastapi>=0.110
Requires-Dist: uvicorn>=0.23

# AI Agent Proxy

`ai-agent-proxy` is a Python package that turns your agent CLI into an OpenAI-like HTTP service.

It exposes both:

- the traditional chat endpoint: `POST /v1/chat/completions`
- the Responses API endpoint: `POST /v1/responses`

It also includes:

- the `ai-agent-proxy` CLI
- a FastAPI server with a simple browser chat at `GET /web`
- persistent local agent workspaces
- automatic startup of the `manager` agent

## What It Does

`ai-agent-proxy` runs your local agent backend behind an HTTP API that OpenAI-style clients can talk to.

You can use it to:

- point chatbot clients at an OpenAI-compatible URL (see the sketch after this list)
- expose a manager-backed assistant to Mattermost, Slack, or browser clients
- keep agent state in persistent local workspaces under `~/.ai_agent_proxy`
- run a simple manager-first service without changing your core CLI workflow
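
For example, the official `openai` Python SDK can be pointed at the proxy by overriding its base URL. The following is a minimal sketch, assuming the proxy listens on the default `127.0.0.1:7011` used elsewhere in this README; `YOUR_KEY` stands in for your `AI_AGENT_PROXY_API_KEY` value (the SDK requires some string even when authentication is disabled).

```python
# Minimal sketch: use the official openai SDK against the proxy.
# Assumes the default address 127.0.0.1:7011 and the built-in "manager" model.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:7011/v1",
    api_key="YOUR_KEY",  # your AI_AGENT_PROXY_API_KEY, or any placeholder if auth is off
)

response = client.chat.completions.create(
    model="manager",
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)
```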

## Endpoints

The service supports:

- `GET /`
- `GET /web`
- `GET /health`
- `POST /v1/chat/completions`
- `POST /v1/responses`
- `POST /agent-reply`

`GET /` redirects to `/web`.
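
A quick liveness check against `GET /health`, as a minimal sketch using `requests` and the default address used elsewhere in this README:

```python
# Minimal liveness check; assumes the proxy runs on the default 127.0.0.1:7011.
import requests

resp = requests.get("http://127.0.0.1:7011/health", timeout=5)
print(resp.status_code, resp.text)
```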

## Installation

From source:

```bash
pip install .
```

For local development:

```bash
make install
```

## CLI

Main commands:

- `ai-agent-proxy enable`
- `ai-agent-proxy restart`
- `ai-agent-proxy init <role>`
- `ai-agent-proxy list`
- `ai-agent-proxy status`
- `ai-agent-proxy connect <agent> [working_dir]`
- `ai-agent-proxy stop <role>`

The service command is:

```bash
ai-agent-proxy daemon
```

## Start The Service

Run from source:

```bash
make debug
```

Install and enable as a system service:

```bash
ai-agent-proxy enable --ip 0.0.0.0 --port 7011
ai-agent-proxy restart
```

By default the service starts the `manager` agent automatically.

## Authentication

If `AI_AGENT_PROXY_API_KEY` is set, API requests must send:

```http
Authorization: Bearer <your-key>
```
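
In Python client code this means attaching the key as a Bearer token on every API call. A minimal sketch with `requests`, assuming the client environment exports `AI_AGENT_PROXY_API_KEY` with the same value the server was configured with:

```python
# Assumes AI_AGENT_PROXY_API_KEY is exported in the client environment too.
import os

import requests

headers = {
    "Authorization": f"Bearer {os.environ['AI_AGENT_PROXY_API_KEY']}",
    "Content-Type": "application/json",
}

resp = requests.post(
    "http://127.0.0.1:7011/v1/chat/completions",
    headers=headers,
    json={"model": "manager", "messages": [{"role": "user", "content": "hello"}]},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```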

## Traditional Endpoint

OpenAI-compatible chat completions:

```bash
curl http://127.0.0.1:7011/v1/chat/completions \
  -H 'Authorization: Bearer YOUR_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "manager",
    "messages": [
      {"role": "user", "content": "hello"}
    ]
  }'
```

## Responses Endpoint

A request in the style of the OpenAI Responses API:

```bash
curl http://127.0.0.1:7011/v1/responses \
  -H 'Authorization: Bearer YOUR_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "manager",
    "input": "hello",
    "stream": false
  }'
```
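
Clients that already speak the Responses API can be pointed at the proxy in the same way as the chat endpoint. A minimal sketch with a recent `openai` Python SDK release (one that includes Responses support), assuming the proxy's payload stays close enough to the OpenAI Responses schema for the SDK to parse it:

```python
# Minimal Responses API sketch; assumes an openai SDK version with Responses
# support and a response shape compatible with the OpenAI Responses schema.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:7011/v1", api_key="YOUR_KEY")

response = client.responses.create(model="manager", input="hello")
print(response.output_text)
```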

## Browser UI

Open:

```text
http://127.0.0.1:7011/web
```

The page serves a simple chat interface whose messages are routed to the `manager` agent.

## Workspace Layout

Agent state is stored under:

```text
~/.ai_agent_proxy/
```

Example:

```text
~/.ai_agent_proxy/manager/
~/.ai_agent_proxy/engineer_alice/
```

Each workspace can include role instructions, memory, notes, control files, and logs.
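
To see which workspaces exist on a machine, you can simply list the directories under the root. A minimal sketch, assuming the default layout shown above:

```python
# List local agent workspaces; assumes the default ~/.ai_agent_proxy layout.
from pathlib import Path

workspace_root = Path.home() / ".ai_agent_proxy"
if workspace_root.exists():
    for workspace in sorted(p for p in workspace_root.iterdir() if p.is_dir()):
        print(workspace.name)
```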

## Notes

- the Python import package is `ai_agent_proxy`
- the CLI command is `ai-agent-proxy`
- the default systemd unit name is `ai-agent-proxy.service`
- the API is manager-first: inbound requests are routed through the `manager` agent
