Metadata-Version: 2.4
Name: llmframe
Version: 2.0.1
Summary: OpenAI-first hexagonal Python application scaffold for llmframe.
License-File: LICENSE
Requires-Python: >=3.11
Requires-Dist: httpx>=0.27
Requires-Dist: openai>=1.0
Requires-Dist: pydantic>=2.7
Provides-Extra: dev
Requires-Dist: mypy>=1.10; extra == 'dev'
Requires-Dist: pytest-cov>=6.0; extra == 'dev'
Requires-Dist: pytest>=8.0; extra == 'dev'
Requires-Dist: python-semantic-release>=10.5.0; extra == 'dev'
Requires-Dist: ruff>=0.4; extra == 'dev'
Description-Content-Type: text/markdown

# llmframe

An OpenAI-first, hexagonal Python application scaffold for the `llmframe` repository.

## Requirements

- Python 3.11+
- [`uv`](https://docs.astral.sh/uv/) for environment and dependency management

## Setup

Install the project and development dependencies:

```bash
uv sync --all-extras
```

## Project layout

The repository follows a hexagonal architecture layout:

```text
src/llmframe/
├── domain/          # Pure business logic and invariants
├── application/     # Use cases and port interfaces
│   └── ports/
└── adapters/        # Input/output adapters at the system boundary
    ├── input/
    └── output/
tests/
├── unit/
└── integration/
```
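The layout above can be sketched in code. The names below are illustrative only (they are not part of the `llmframe` package): a use case in `application/` depends on a port interface, and an output adapter in `adapters/output/` implements that port, so the application layer never imports a concrete provider.

```python
from typing import Protocol


# application/ports: a provider-neutral port the use case depends on.
class TextGenerationPort(Protocol):
    def generate(self, prompt: str) -> str: ...


# application: a use case written against the port, not a provider.
class SummarizeUseCase:
    def __init__(self, llm: TextGenerationPort) -> None:
        self._llm = llm

    def run(self, text: str) -> str:
        return self._llm.generate(f"Summarize: {text}")


# adapters/output: a concrete adapter satisfying the port (a stub here).
class EchoAdapter:
    def generate(self, prompt: str) -> str:
        return prompt.upper()


use_case = SummarizeUseCase(EchoAdapter())
print(use_case.run("hello"))  # SUMMARIZE: HELLO
```

Swapping `EchoAdapter` for a real provider adapter requires no change to the use case, which is the property the `adapters/` boundary exists to protect.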

## LLM adapters

The repository includes reusable LLM output adapters under `llmframe.adapters.output.llm`.
Today this package is intentionally **OpenAI-first**: OpenAI is the only implemented provider integration. The surrounding structure is kept hexagonal, however, so additional providers can be added later without leaking provider-specific concerns into the shared and application layers.

Key package areas:

- `llmframe.adapters.output.llm.llm_adapter` - high-level, provider-neutral adapter for structured JSON and text generation
- `llmframe.adapters.output.llm.providers.openai` - OpenAI provider adapter, client builder, transport, DTOs, and parsing helpers
- `llmframe.adapters.output.llm.usage_tracker` - aggregated token and cost tracking utilities

The sibling `providers/anthropic` and `providers/gemini` directories are architectural placeholders only; they are not wired integrations yet.

Example imports:

```python
from llmframe.adapters.output.llm import LlmAdapter
from llmframe.adapters.output.llm.providers.openai import OpenAIClientSettings, build_provider
from llmframe.adapters.output.llm.usage_tracker import LlmUsageTrackerConfig, OpenAILlmUsageTracker
```
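The usage tracker aggregates token counts and derived costs across calls. The real `OpenAILlmUsageTracker` API is not reproduced here; the following is a minimal, hypothetical sketch of the aggregation idea only, with all class, field, and price names invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class UsageSketch:
    """Illustrative token/cost aggregator (not the real llmframe API)."""

    prompt_price_per_1k: float      # assumed USD per 1k prompt tokens
    completion_price_per_1k: float  # assumed USD per 1k completion tokens
    prompt_tokens: int = 0
    completion_tokens: int = 0

    def record(self, prompt_tokens: int, completion_tokens: int) -> None:
        # Accumulate the usage reported alongside each LLM response.
        self.prompt_tokens += prompt_tokens
        self.completion_tokens += completion_tokens

    @property
    def cost_usd(self) -> float:
        # Derive the running cost from accumulated tokens and unit prices.
        return (self.prompt_tokens / 1000 * self.prompt_price_per_1k
                + self.completion_tokens / 1000 * self.completion_price_per_1k)


tracker = UsageSketch(prompt_price_per_1k=0.005, completion_price_per_1k=0.015)
tracker.record(prompt_tokens=1200, completion_tokens=300)
tracker.record(prompt_tokens=800, completion_tokens=200)
print(round(tracker.cost_usd, 4))  # 2.0 * 0.005 + 0.5 * 0.015 = 0.0175
```

A tracker like this typically sits beside the adapter rather than inside it, so cost accounting stays a cross-cutting concern instead of being baked into each provider integration.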

## Quality checks

Run the local quality gate with `uv`:

```bash
uv run ruff format .
uv run ruff check .
uv run mypy .
uv run pytest
```

## Release publishing

GitHub Actions publishes this package to PyPI using the workflow at
`.github/workflows/ci_cd.yaml`.

- Publishing is triggered by pushes to the `master` branch or manual workflow dispatch.
- The workflow uses Python Semantic Release to determine the next version from conventional commits, create the release commit and tag, and generate changelog updates.
- `pyproject.toml` version updates are release-managed: do not bump `project.version` in regular feature or fix commits; Semantic Release writes the new version during the release commit.
- Package artifacts are built in CI and published to PyPI only.
- If no releasable commits are detected, the publish job is skipped.
- Publishing targets the `llmframe` project on `pypi.org`.

Before the workflow can publish successfully, configure a trusted publisher in
the PyPI project settings for this GitHub repository and workflow file.

Also ensure the workflow has permission to use the repository `GITHUB_TOKEN` for
creating release commits and tags.

Example release flow:

```bash
git commit -m "feat: add new llm transport option"
git push origin master
```

## Notes

- The import package name is `llmframe`.
- Keep business logic inside `domain/` and `application/`.
- Keep framework, transport, persistence, and integration concerns in `adapters/`.
