Metadata-Version: 2.4
Name: prgen-cli
Version: 0.3.0
Summary: Generate PR titles and descriptions from git diff
Author: Jean Paul Fernandez
License-Expression: GPL-3.0-only
Project-URL: Homepage, https://github.com/jpxoi/prgen
Project-URL: Repository, https://github.com/jpxoi/prgen
Project-URL: Issues, https://github.com/jpxoi/prgen/issues
Keywords: cli,git,pull-request,diff,llm,openai,gemini
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Topic :: Software Development :: Version Control
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: google-genai>=1.70.0
Requires-Dist: ollama>=0.6.1
Requires-Dist: openai>=2.30.0
Requires-Dist: rich>=14.3.3
Requires-Dist: typer>=0.24.1
Dynamic: license-file

# prgen

Generate a pull request title and description from `git diff` and commit history.

![prgen CLI help](images/help.png)

**Author:** Jean Paul Fernandez · [github.com/jpxoi/prgen](https://github.com/jpxoi/prgen)

Licensed under the [GNU General Public License v3.0](LICENSE) (GPL-3.0-only).

## Requirements

- Python 3.10+
- `git` on your `PATH`
- One of:
  - OpenAI API access
  - Google Gemini API access
  - A local or remote Ollama server

## Install

From [PyPI](https://pypi.org/project/prgen-cli/):

```bash
pip install prgen-cli
# or
uv tool install prgen-cli
```

From a clone:

```bash
uv sync
```

Run the CLI with:

```bash
prgen --help
```

## What It Does

`prgen` compares `HEAD` against a base ref and collects:

- `git diff <base>...HEAD`
- `git log <base>..HEAD`

It sends that context to an LLM and prints:

- a PR title from `<summary>...</summary>`
- a PR description from `<body>...</body>`

If the model does not return those tags, prgen prints the raw model output instead.

## Providers

`prgen` supports three backends:

- `openai`
- `gemini`
- `ollama`

`--provider auto` is the default. In `auto` mode:

- Gemini is chosen when `GOOGLE_API_KEY` is available
- otherwise OpenAI is chosen when `OPENAI_API_KEY` is available
- if neither key is configured, prgen exits with an error
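The auto-mode preference order amounts to an ordered key check. A minimal sketch of that logic (illustrative, not the actual implementation):

```python
def pick_provider(env: dict[str, str]) -> str:
    """Mirror auto mode's preference order: Gemini first, then OpenAI."""
    if env.get("GOOGLE_API_KEY"):
        return "gemini"
    if env.get("OPENAI_API_KEY"):
        return "openai"
    raise SystemExit("no API key configured: set GOOGLE_API_KEY or OPENAI_API_KEY")

print(pick_provider({"OPENAI_API_KEY": "sk-test"}))  # openai
```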

Ollama is always explicit:

- use `--provider ollama`
- also pass `--model <name>`
- `--tier` presets do not apply to Ollama

## Quick Start

OpenAI:

```bash
prgen config set OPENAI_API_KEY sk-...
prgen
```

Gemini:

```bash
prgen config set GOOGLE_API_KEY your-key
prgen --provider gemini
```

Ollama:

```bash
prgen --provider ollama --model llama3.1:8b
```

If the Ollama model is missing locally, let prgen pull it:

```bash
prgen --provider ollama --model llama3.1:8b --pull
```

## Configuration

By default, configuration lives in `~/.config/prgen/config.json`.

If `XDG_CONFIG_HOME` is set, prgen uses:

```bash
$XDG_CONFIG_HOME/prgen/config.json
```
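The resolution rule above can be sketched as a small helper (illustrative only; the real lookup lives inside prgen):

```python
from pathlib import Path

def config_path(env: dict[str, str]) -> Path:
    """Resolve the config file location, honoring XDG_CONFIG_HOME when set."""
    base = env.get("XDG_CONFIG_HOME")
    root = Path(base) if base else Path.home() / ".config"
    return root / "prgen" / "config.json"

print(config_path({}))                               # e.g. /home/you/.config/prgen/config.json
print(config_path({"XDG_CONFIG_HOME": "/tmp/cfg"}))  # /tmp/cfg/prgen/config.json
```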

You can manage the file with:

```bash
prgen config
prgen config show
prgen config path
```

Supported persisted keys:

- `OPENAI_API_KEY`
- `GOOGLE_API_KEY`
- `OLLAMA_HOST`
- `base`
- `provider`
- `tier`

Notes:

- `OPENAI_API_KEY` and `GOOGLE_API_KEY` are treated as secrets
- `OLLAMA_HOST` is not secret and is merged into the environment if set
- `base`, `provider`, and `tier` are optional CLI defaults
- `model` and `context` are not persisted in config

Examples:

```bash
prgen config
prgen config set OPENAI_API_KEY sk-...
prgen config set GOOGLE_API_KEY your-key
prgen config set OLLAMA_HOST http://127.0.0.1:11434
prgen config set base origin/main
prgen config set provider ollama
prgen config set tier pro
prgen config unset OLLAMA_HOST
prgen config show
```

To read a secret value from stdin:

```bash
prgen config set OPENAI_API_KEY - < key.txt
```

## Defaults

Built-in defaults:

| Option | Default | Notes |
| --- | --- | --- |
| `--repo`, `-C` | current directory | Uses the current git repo unless you point elsewhere |
| `--base` | `origin/main` | The ref must resolve locally |
| `--provider` | `auto` | Prefers Gemini over OpenAI when both keys exist |
| `--tier` | `default` | Used only for OpenAI and Gemini |
| `--model` | unset | Overrides tier selection; required for Ollama |
| `--context` | `none` | Extra text merged into the prompt |
| `--pull` | `false` | Only relevant for Ollama |

Config-file defaults apply only when you omit the matching flag:

- `base`
- `provider`
- `tier`

Explicit flags always win over the config file.
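That precedence (explicit flag, then config file, then built-in default) can be pictured as a three-step fallback; a hypothetical sketch:

```python
def resolve(flag_value, config_value, built_in):
    """Pick an option value: explicit flag beats config file beats built-in."""
    if flag_value is not None:
        return flag_value
    if config_value is not None:
        return config_value
    return built_in

print(resolve(None, "ollama", "auto"))      # config applies when the flag is omitted
print(resolve("gemini", "ollama", "auto"))  # an explicit flag always wins
```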

Current model presets:

- OpenAI `default`: `gpt-5-mini`
- OpenAI `pro`: `gpt-5.4`
- Gemini `default`: `gemini-3-flash-preview`
- Gemini `pro`: `gemini-3.1-pro-preview`

## Usage

Basic usage:

```bash
prgen
```

Pick a different base:

```bash
prgen --base main
```

Run against another repository:

```bash
prgen -C ~/src/my-project
```

Override the model directly:

```bash
prgen --provider openai --model gpt-5.4
prgen --provider gemini --model gemini-3.1-pro-preview
prgen --provider ollama --model mistral-small3.1
```

Add extra context:

```bash
prgen --context "Focus on customer-facing impact and rollout notes."
```

## Behavior Notes

- prgen validates that `--base` resolves before generating anything
- if there are no commits and no file changes vs the base ref, prgen exits with an error
- when `--provider ollama --pull` is used, prgen can download the model automatically
- when stderr is a TTY, loading states and Ollama downloads use Rich UI output

## Development

Install local dependencies:

```bash
uv sync
```

Format and lint:

```bash
uv run ruff format .
uv run ruff check .
```

Run from the repo without installing globally:

```bash
uv run prgen --help
```

Install the local checkout as a global tool:

```bash
uv tool install .
# or
pipx install .
```
