Metadata-Version: 2.3
Name: augint-opencodex
Version: 0.6.1
Summary: Render local Codex and OpenCode repo config from .ai-codex.json
Author: Augmenting Integrations
Requires-Dist: click>=8.1.0
Requires-Dist: pydantic>=2.11.0
Requires-Python: >=3.12
Description-Content-Type: text/markdown

# augint-opencodex

`augint-opencodex` is a Python tool that renders local Codex and OpenCode project
configuration from a single tracked manifest, `.ai-codex.json`.

The first working slice is implemented here. It ships a functional `ai-codex` CLI with:

- `ai-codex sync` to read `.ai-codex.json` and render `.ai-opencodex.md`,
  `.codex/config.toml`, `opencode.json`, and shared skills
- `ai-codex doctor` to inspect manifest resolution, generated files, local ignore setup,
  and staged generated artifacts
- a first-pass profile model with `augint` and `gov`
- local-only ignore handling through `.git/info/exclude`

## Installation

For local dev:

```bash
uv sync --group dev
```

Once the package is published, the intended install flows are:

```bash
uvx --from augint-opencodex ai-codex sync
uv tool install augint-opencodex
ai-codex sync
```

## Manifest

This tool expects a tracked `.ai-codex.json` file in the target repository.
If it is missing when you run `ai-codex sync`, the command will prompt for a
profile and create a minimal manifest before rendering. Pass `--profile augint`
or `--profile gov` to skip the prompt (useful for scripts). `--check` still
fails when the manifest is missing so CI does not silently initialize repos.

```json
{
  "version": 1,
  "profile": "augint",
  "references": ["./ai-lls-lib"],
  "blocked_paths": [
    "**/secrets/**",
    "**/*.pem",
    "**/terraform.tfstate*"
  ],
  "content_policy": {
    "no_emojis": true,
    "no_ai_mentions": true
  },
  "shell_guardrails": {
    "ask": ["aws *", "terraform *", "kubectl *", "git push *"],
    "deny": ["aws iam create*", "aws iam put*"]
  },
  "patterns": {
    "org_python_library": true
  },
  "opencode": {
    "enabled": true,
    "default_model": "qwen3-coder",
    "local_provider": {
      "kind": "openai-compatible",
      "name": "ollama",
      "base_url": "http://host.docker.internal:11434/v1"
    },
    "models": [
      { "id": "qwen3-coder", "name": "Qwen3 Coder (local)" }
    ],
    "bedrock": {
      "enabled": false,
      "models": []
    }
  },
  "codex": {
    "provider": "openai",
    "model": null,
    "approval_policy": null,
    "sandbox_mode": null,
    "web_search": null
  }
}
```

The current schema lives in [`schemas/ai-codex.schema.json`](schemas/ai-codex.schema.json).
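As an illustration of what manifest validation checks, a minimal stdlib-only sketch might look like the following. The real tool validates with pydantic models against the published schema; the function name, the required-key set, and the error strings here are assumptions for illustration only:

```python
import json

# Assumed minimal requirements; the real schema is richer.
REQUIRED_KEYS = {"version", "profile"}
KNOWN_PROFILES = {"augint", "gov"}

def check_manifest(text: str) -> list[str]:
    """Return a list of problems found in a .ai-codex.json document."""
    problems: list[str] = []
    data = json.loads(text)
    missing = REQUIRED_KEYS - data.keys()
    if missing:
        problems.append(f"missing keys: {sorted(missing)}")
    if data.get("version") != 1:
        problems.append("unsupported manifest version")
    if data.get("profile") not in KNOWN_PROFILES:
        problems.append(f"unknown profile: {data.get('profile')!r}")
    return problems

# A well-formed minimal manifest produces no problems:
print(check_manifest('{"version": 1, "profile": "augint"}'))  # []
```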

### OpenCode and Codex manifest fields

The `opencode` and `codex` sections let the manifest fully express non-secret
tool configuration instead of deferring those details to a separate launcher:

- `opencode.local_provider` — OpenAI-compatible local endpoint (e.g. Ollama)
  rendered as `provider.<name>.options.baseURL` in `opencode.json`.
- `opencode.default_model` — top-level `model` in `opencode.json`.
- `opencode.models` — curated catalog attached to the local provider.
- `opencode.bedrock.enabled` / `opencode.bedrock.models` — toggles the
  `amazon-bedrock` provider entry with the listed model IDs.
- `codex.provider` — `openai` or `aws`. Emitted as a comment in
  `.codex/config.toml` so the runtime launcher knows which auth path to
  select. Raw secrets are never written to generated files.
- `codex.model`, `codex.approval_policy`, `codex.sandbox_mode`,
  `codex.web_search` — override the defaults carried by the active profile.
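The mapping from manifest fields to rendered output can be sketched as a plain dict transformation. Only the `provider.<name>.options.baseURL` and top-level `model` placements are documented above; the catalog shape under `models` and the `amazon-bedrock` entry layout are assumptions, and the real tool's templates define the authoritative output:

```python
def render_opencode(manifest: dict) -> dict:
    """Sketch: map manifest `opencode` fields onto an opencode.json dict."""
    oc = manifest["opencode"]
    provider_name = oc["local_provider"]["name"]
    rendered = {
        "model": oc["default_model"],  # top-level `model` in opencode.json
        "provider": {
            provider_name: {
                "options": {"baseURL": oc["local_provider"]["base_url"]},
                # Assumed catalog shape for the curated model list.
                "models": {m["id"]: {"name": m["name"]} for m in oc["models"]},
            }
        },
    }
    if oc.get("bedrock", {}).get("enabled"):
        rendered["provider"]["amazon-bedrock"] = {
            "models": {mid: {} for mid in oc["bedrock"]["models"]}
        }
    return rendered

manifest = {
    "opencode": {
        "default_model": "qwen3-coder",
        "local_provider": {"name": "ollama", "base_url": "http://host.docker.internal:11434/v1"},
        "models": [{"id": "qwen3-coder", "name": "Qwen3 Coder (local)"}],
        "bedrock": {"enabled": False, "models": []},
    }
}
print(render_opencode(manifest)["model"])  # qwen3-coder
```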

## Commands

Render files into the current repository:

```bash
uv run ai-codex sync
```

Preview pending changes without writing:

```bash
uv run ai-codex sync --dry-run
```

Fail if the repo is out of sync:

```bash
uv run ai-codex sync --check
```

Inspect the current repo state:

```bash
uv run ai-codex doctor
```
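Because `--check` exits non-zero when rendered files are stale, it slots into CI as a drift gate. A sketch for GitHub Actions (the job name and runner setup are assumptions; this package does not ship a workflow):

```yaml
jobs:
  codex-sync-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
      - run: uv sync --group dev
      - run: uv run ai-codex sync --check
```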

## Generated Files

The first slice writes:

- `.ai-opencodex.md`
- `.codex/config.toml`
- `opencode.json`
- `.agents/skills/README.md`
- `.agents/skills/org-python-tooling/SKILL.md` when
  `patterns.org_python_library` is enabled

Generated outputs are added to `.git/info/exclude` by default so target repositories do not need
to commit them.
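Local-only ignoring works because `.git/info/exclude` behaves like a `.gitignore` that is never committed. A minimal sketch of idempotent entry management (the tool's actual entry list and write strategy may differ; the paths below are assumptions):

```python
import tempfile
from pathlib import Path

# Assumed exclude entries for the generated artifacts.
GENERATED = [".ai-opencodex.md", ".codex/", "opencode.json", ".agents/skills/"]

def ensure_excluded(repo_root: Path, entries: list[str]) -> None:
    """Append missing entries to .git/info/exclude without duplicating them."""
    exclude = repo_root / ".git" / "info" / "exclude"
    exclude.parent.mkdir(parents=True, exist_ok=True)
    existing = exclude.read_text().splitlines() if exclude.exists() else []
    missing = [e for e in entries if e not in existing]
    if missing:
        exclude.write_text("\n".join(existing + missing) + "\n")

root = Path(tempfile.mkdtemp())
ensure_excluded(root, GENERATED)
ensure_excluded(root, GENERATED)  # second run is a no-op
print((root / ".git" / "info" / "exclude").read_text().count("opencode.json"))  # 1
```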

## Canonical Ownership

There is one source of truth for each generated file, and one tool responsible
for each runtime concern. This split keeps `.ai-shell.yaml` from drifting out
of sync with generated config.

### File ownership

| File                   | Owner              | Notes                                   |
| ---------------------- | ------------------ | --------------------------------------- |
| `.ai-codex.json`       | user (tracked)     | Source of truth for all generated files |
| `opencode.json`        | `augint-opencodex` | Repo root; generated                    |
| `.codex/config.toml`   | `augint-opencodex` | Repo root; generated                    |
| `.ai-opencodex.md`     | `augint-opencodex` | Repo root; generated                    |
| `.agents/skills/**`    | `augint-opencodex` | Generated                               |

Generated files MUST NOT be hand-edited. Re-run `ai-codex sync` after manifest
changes.
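To keep the no-hand-edits rule honest locally, a repo can also run the drift check from pre-commit (a sketch using a local hook; the hook id is an assumption and this package does not ship pre-commit config):

```yaml
repos:
  - repo: local
    hooks:
      - id: ai-codex-check
        name: ai-codex sync --check
        entry: uv run ai-codex sync --check
        language: system
        pass_filenames: false
```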

### Runtime ownership (augint-shell)

`augint-shell` is responsible only for runtime and container behavior:

- bind-mounting `.codex/` and `opencode.json` into the container
- injecting secrets (`OPENAI_API_KEY`, AWS credentials) at launch time
- selecting Bedrock vs OpenAI auth paths based on the manifest's
  `codex.provider` / `opencode.bedrock.enabled` signals
- CLI launch flags (`ai-shell opencode`, `ai-shell codex`)

`augint-shell` should not ship template copies of `opencode.json` or
`.codex/config.toml`, and `.ai-shell.yaml` should stop documenting the
`[opencode]` / `[codex]` config shapes that duplicate `.ai-codex.json`.

### Consumption inside ai-shell containers

The generated files live at repo root in the host working directory. Inside an
`ai-shell` container, they are consumed as follows:

- `opencode.json` — the `ai-shell opencode` command launches OpenCode with the
  repo's working directory as CWD, so OpenCode discovers `opencode.json`
  directly. `ai-shell opencode --provider local|aws` selects which provider
  block from the generated `opencode.json` is active at launch and injects the
  matching credentials or endpoint env vars. It does not rewrite the file.
- `.codex/config.toml` — `ai-shell` bind-mounts the host `.codex/` directory
  into the container. Codex is launched with `CODEX_HOME=$(pwd)/.codex` so the
  generated file serves as the Codex home config. `ai-shell codex --provider
  openai|aws` mirrors `codex.provider` from the manifest and sets the
  corresponding auth env vars (`OPENAI_API_KEY`, AWS credential chain). Secrets
  are never written into `.codex/config.toml` or `opencode.json`.
- `.ai-opencodex.md` — Codex discovers it through the
  `project_doc_fallback_filenames` entry in the generated `.codex/config.toml`;
  OpenCode reads it through the `instructions` block in `opencode.json`.
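For context, the discovery-relevant part of a generated `.codex/config.toml` might look like this. Only the `project_doc_fallback_filenames` entry and the provider comment are described above; the exact comment wording is an assumption:

```toml
# codex.provider = "openai"  (read by the ai-shell launcher; not a Codex setting)
project_doc_fallback_filenames = [".ai-opencodex.md"]
```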

If a repo uses `augint-opencodex` but is launched outside `ai-shell`, the same
files are still consumed by running `codex` / `opencode` directly from the
repo root with `CODEX_HOME=$(pwd)/.codex`.

### augint-shell migration

Once a repo adopts `augint-opencodex`, the following can be removed from
`augint-shell`:

- `src/ai_shell/templates/opencode/opencode.json`
- `src/ai_shell/templates/codex/config.toml`
- `[opencode]` and `[codex]` config blocks in the scaffolded
  `.ai-shell.yaml` template (beyond runtime toggles like `provider`)
- `codex_openai_api_key` handling in `config.py` moves to pure runtime env
  injection

Runtime-only fields that remain in `.ai-shell.yaml`:

- `opencode.provider` (`local|aws`) — chooses which provider block from
  `opencode.json` the launcher should activate
- Secret material passed via environment variables

## Dogfooding This Repo

This repository is set up to dogfood the generated instructions flow without a root `AGENTS.md`.

1. Keep `.ai-codex.json` tracked in the repo root.
2. Run `uv run ai-codex sync` to generate `.ai-opencodex.md`, `.codex/config.toml`, and the other
   local-only artifacts.
3. Start Codex with `CODEX_HOME=$(pwd)/.codex codex` so Codex uses the generated
   `.codex/config.toml` as its home config and discovers `.ai-opencodex.md` via
   `project_doc_fallback_filenames`.

Avoid creating a root `AGENTS.md` here. Codex checks `AGENTS.md` before fallback filenames in the
same directory, so a root `AGENTS.md` would shadow `.ai-opencodex.md` and split Codex from the
generated OpenCode instructions.

## Organizational Python Standard

This project uses `ai-lls-lib/` in the planning repo as the concrete reference for the
organization-wide Python package and tooling standard:

- `uv`-first packaging and development workflow
- `src/` layout and console scripts from `[project.scripts]`
- `ruff`, `mypy`, `pytest`, and `pre-commit`
- security and compliance checks in CI
- Conventional Commit and semantic-release-compatible versioning
- a stable Makefile task surface

## Development

```bash
make install
make test
make format
make typecheck
make build
```
