Metadata-Version: 2.4
Name: tyler-python-helpers
Version: 0.1.3
Summary: A collection of helper functions
Project-URL: Homepage, https://github.com/tylerjwoodfin/python-helpers
Project-URL: Bug Tracker, https://github.com/tylerjwoodfin/python-helpers/issues
Project-URL: Documentation, https://github.com/tylerjwoodfin/python-helpers#readme
Author-email: Tyler Woodfin <feedback-python-helpers@tylerwoodfin.com>
Requires-Python: >=3.10
Requires-Dist: cabinet
Requires-Dist: openai
Description-Content-Type: text/markdown

# Python Helpers

A collection of helper utilities for my personal projects.

## Installation

From PyPI:

```bash
pip install tyler-python-helpers
```

**Working from a git checkout?** To let scripts use a plain `from tyler_python_helpers import ChatGPT` with no `sys.path` hacks, install in **editable** mode once per environment:

```bash
pip install -e /path/to/python-helpers
# e.g. pip install -e ~/git/python-helpers
```

This registers the package with the interpreter the same way a normal install does. The layout under `tyler_python_helpers/` (including the `ai/` subpackage) is already a standard package layout; note that Python does not automatically import sibling folders next to your `tools/` repo.

Alternatively, set `PYTHONPATH` to the checkout root if you prefer not to install.
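For the `PYTHONPATH` route, something like this in your shell profile works (the checkout path below is an example):

```bash
# make the checkout importable without installing it
export PYTHONPATH="$HOME/git/python-helpers${PYTHONPATH:+:$PYTHONPATH}"
```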

If you see **`ImportError: cannot import name 'local_ai'`** even though the repo defines it, check for a **stale user install** that only contains the old flat package (e.g. `~/.local/lib/python3.12/site-packages/tyler_python_helpers/`). Remove it and point the interpreter at this repo, for example:

```bash
python3 -m pip uninstall tyler-python-helpers -y --break-system-packages
python3 -m pip install --user -e /path/to/python-helpers --break-system-packages
```
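To check which copy of the package the interpreter resolves first, a quick stdlib-only diagnostic:

```bash
# prints the path of the tyler_python_helpers that would be imported,
# or "not installed" if nothing is on the search path
python3 -c "import importlib.util as u; s = u.find_spec('tyler_python_helpers'); print(s.origin if s else 'not installed')"
```

If the printed path is under `~/.local/lib/...` instead of your checkout, you found the stale copy.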

## AI (`tyler_python_helpers.ai`)

All AI-related code lives under `tyler_python_helpers.ai`. The top-level package re-exports the same names for convenience:

```python
from tyler_python_helpers import ChatGPT, LocalAI, local_ai
```

### ChatGPT

```python
from tyler_python_helpers import ChatGPT

chat = ChatGPT()
val = chat.query("What is the capital of the moon?")
print(val)
```

`ChatGPT` chooses its backend from Cabinet at `ai` → `chatgpt` → `backend`:

- `"openai"`: uses the OpenAI Responses API
- `"local"`: uses `LocalAI` / Open WebUI

### LocalAI

```python
from tyler_python_helpers import LocalAI, local_ai

assert local_ai is LocalAI
print(local_ai().query("Summarize: ..."))
```

### Cabinet layout

**Cloud OpenAI** (`ChatGPT` with `ai` → `chatgpt` → `backend` = `"openai"`):

| Path | Purpose |
|------|---------|
| **`ai` → `openai` → `key`** | OpenAI API key |

In cloud mode, `ChatGPT` currently uses the fixed model `gpt-4o`.

**Local Open WebUI** follows the same usage shape as `ChatGPT`: `LocalAI().query(...)` or `local_ai().query(...)` (`local_ai` is an alias for `LocalAI`). Advanced: `local_ai_messages([...])` accepts a full message list.

| Path | Purpose |
|------|---------|
| **`ai` → `local` → `openwebui_key`** | Bearer token from Open WebUI |
| **`ai` → `local` → `model`** | Model id (e.g. `qwen3:8b`) |
| **`ai` → `local` → `host`** | Optional; default `192.168.1.102` |
| **`ai` → `local` → `port`** | Optional; default `3000` |
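The `host`/`port` defaults combine into the request URL roughly like this (a sketch; the exact endpoint path is an assumption based on Open WebUI's OpenAI-compatible API, so check your Open WebUI version's docs):

```python
def openwebui_url(host: str = "192.168.1.102", port: int = 3000) -> str:
    # defaults mirror the Cabinet defaults in the table above;
    # /api/chat/completions is assumed, not confirmed by this README
    return f"http://{host}:{port}/api/chat/completions"

print(openwebui_url())  # http://192.168.1.102:3000/api/chat/completions
```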

**`ChatGPT` backend**:

| Path | Purpose |
|------|---------|
| **`ai` → `chatgpt` → `backend`** | `"openai"` (default) or `"local"` |

Legacy API key **`keys` → `openai`** is still read in cloud mode for `ChatGPT` only.
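The lookup order can be sketched as follows (illustrative only; the real code reads these paths through Cabinet, and the nested-dict shape here is just a stand-in for Cabinet's storage):

```python
def resolve_openai_key(settings: dict) -> str:
    """Prefer ai -> openai -> key; fall back to legacy keys -> openai."""
    key = settings.get("ai", {}).get("openai", {}).get("key")
    if not key:
        key = settings.get("keys", {}).get("openai")  # legacy location
    if not key:
        raise ValueError("no OpenAI API key configured")
    return key

print(resolve_openai_key({"keys": {"openai": "sk-legacy"}}))  # sk-legacy
```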

### Library calls

```python
from tyler_python_helpers import ChatGPT, LocalAI, local_ai, local_ai_messages, LocalOpenWebUIError

text = ChatGPT().query("Hello")
text = LocalAI().query("Hello")
text = local_ai().query("Hello")
text = local_ai_messages([{"role": "user", "content": "Hello"}])
```

### Errors

Cloud and local requests may raise provider-specific errors: `OpenAIError` for cloud and `LocalOpenWebUIError` for local. Both modes may also raise `ValueError` when required Cabinet settings are missing or invalid.
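A catch-all handling pattern looks like this (a sketch; stub exception classes stand in for the real `openai.OpenAIError` and `tyler_python_helpers.LocalOpenWebUIError` so the example is self-contained):

```python
# stand-ins for the real provider exception types
class OpenAIError(Exception): ...
class LocalOpenWebUIError(Exception): ...

def safe_query(query_fn, prompt: str) -> str:
    try:
        return query_fn(prompt)
    except (OpenAIError, LocalOpenWebUIError) as exc:
        return f"provider error: {exc}"
    except ValueError as exc:
        # raised when required Cabinet settings are missing or invalid
        return f"configuration error: {exc}"

def broken(_prompt: str) -> str:
    raise ValueError("missing Cabinet setting")

print(safe_query(broken, "Hello"))  # configuration error: missing Cabinet setting
```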

## Reference

- [OpenAI API documentation](https://platform.openai.com/docs/overview)
