Metadata-Version: 2.4
Name: patchpal
Version: 0.21.4
Summary: An agentic coding and automation assistant, supporting both local and cloud LLMs
Author: PatchPal Contributors
License-Expression: Apache-2.0
Project-URL: Homepage, https://github.com/amaiya/patchpal
Project-URL: Repository, https://github.com/amaiya/patchpal
Project-URL: Issues, https://github.com/amaiya/patchpal/issues
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: litellm>=1.0.0
Requires-Dist: requests>=2.31.0
Requires-Dist: beautifulsoup4>=4.12.0
Requires-Dist: ddgs>=1.0.0
Requires-Dist: rich>=13.0.0
Requires-Dist: pyyaml>=6.0.0
Requires-Dist: prompt_toolkit>=3.0.0
Requires-Dist: tiktoken>=0.5.0
Requires-Dist: boto3
Requires-Dist: pymupdf>=1.23.0
Requires-Dist: python-docx>=1.0.0
Requires-Dist: python-pptx>=0.6.0
Requires-Dist: tree-sitter-language-pack>=0.3.0
Provides-Extra: dev
Requires-Dist: pytest>=7.0.0; extra == "dev"
Requires-Dist: pytest-cov>=4.0.0; extra == "dev"
Requires-Dist: ruff==0.14.13; extra == "dev"
Requires-Dist: pre-commit>=3.0.0; extra == "dev"
Provides-Extra: mcp
Requires-Dist: mcp>=0.9.0; extra == "mcp"
Provides-Extra: docs
Requires-Dist: mkdocs<2.0,>=1.5.0; extra == "docs"
Requires-Dist: mkdocs-material>=9.5.0; extra == "docs"
Requires-Dist: mkdocs-autorefs>=1.0.0; extra == "docs"
Requires-Dist: mkdocstrings>=0.24.0; extra == "docs"
Requires-Dist: mkdocstrings-python>=1.7.0; extra == "docs"
Dynamic: license-file

# PatchPal — An Agentic Coding and Automation Assistant

<img src="https://raw.githubusercontent.com/amaiya/patchpal/refs/heads/main/assets/patchpal_screenshot.png" alt="PatchPal Screenshot" width="650"/>

> Supporting both local and cloud LLMs, with autopilot mode and extensible tools.

**PatchPal** is an AI coding agent that helps you build software, debug issues, and automate tasks. It supports agent skills, tool use, and executable Python generation, enabling interactive workflows for tasks such as data analysis, visualization, web scraping, API interactions, and research with synthesized findings.

Most agent frameworks are [built in TypeScript](https://news.ycombinator.com/item?id=44212560). PatchPal is Python-native, designed for developers who want both interactive terminal use (`patchpal`) and programmatic API access (`agent.run("task")`) in the same tool—without switching ecosystems.

**Key Features**
- [Terminal Interface](https://amaiya.github.io/patchpal/usage/interactive/) for interactive development
- [Python SDK](https://amaiya.github.io/patchpal/usage/python-api/) for flexibility and extensibility
- [Built-In](https://amaiya.github.io/patchpal/features/tools/) and [Custom Tools](https://amaiya.github.io/patchpal/features/custom-tools/)
- [Skills System](https://amaiya.github.io/patchpal/features/skills/) and [MCP Integration](https://amaiya.github.io/patchpal/features/mcp/)
- [Autopilot Mode](https://amaiya.github.io/patchpal/usage/autopilot/) using [Ralph Wiggum loops](https://github.com/amaiya/patchpal/tree/main/examples/ralph/)
- [Project Memory](https://amaiya.github.io/patchpal/features/memory/) that automatically loads project context from `~/.patchpal/repos/<repo-name>/MEMORY.md` at startup
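The per-repo memory location mentioned above can be sketched as a small path helper (illustrative only; `memory_path` is a hypothetical name, but the directory layout comes from the feature description):

```python
from pathlib import Path

def memory_path(repo_name: str) -> Path:
    # Per-repo memory file that PatchPal loads at startup,
    # following the layout described above (hypothetical helper).
    return Path.home() / ".patchpal" / "repos" / repo_name / "MEMORY.md"
```

Dropping notes into that file between sessions is a lightweight way to carry project context forward.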

PatchPal prioritizes customizability: custom tools, custom skills, a flexible Python API, and support for any tool-calling LLM.

Full documentation is [here](https://amaiya.github.io/patchpal).

## Quick Start

```bash
$ pip install patchpal  # install
$ patchpal              # start
```

> Platform support: Linux, macOS, and Windows

**Alternative: Run with Docker/Podman (no installation required)**

```bash
# Using pre-built image with patchpal installed (default model)
docker run -it --rm \
  -v "$(pwd)":/workspace \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  ghcr.io/amaiya/patchpal-sandbox:latest \
  patchpal

# Or with Podman
podman run -it --rm \
  -v "$(pwd)":/workspace \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  ghcr.io/amaiya/patchpal-sandbox:latest \
  patchpal

# Specify a different model with --model
docker run -it --rm \
  -v "$(pwd)":/workspace \
  -e OPENAI_API_KEY=$OPENAI_API_KEY \
  ghcr.io/amaiya/patchpal-sandbox:latest \
  patchpal --model openai/gpt-5-mini
```


## Setup
0. **Install**: `pip install patchpal`
1. **Get an API key or a local LLM engine**:
   - **[Cloud]** For Anthropic models (default): Sign up at https://console.anthropic.com/
   - **[Cloud]** For OpenAI models: Get a key from https://platform.openai.com/
   - **[Local]** For vLLM (recommended for local use): Install from https://docs.vllm.ai/ (free, no API charges)
   - **[Local]** For Ollama: Install from https://ollama.com/ (⚠️ requires `OLLAMA_CONTEXT_LENGTH=32768` - see Ollama section below)
   - For other providers: Check the [LiteLLM documentation](https://docs.litellm.ai/docs/providers)

2. **Set up your API key as an environment variable**:
```bash
# For Anthropic (default)
export ANTHROPIC_API_KEY=your_api_key_here

# For OpenAI
export OPENAI_API_KEY=your_api_key_here

# For vLLM - API key required only if configured
export HOSTED_VLLM_API_BASE=http://localhost:8000 # depends on your vLLM setup
export HOSTED_VLLM_API_KEY=token-abc123           # optional depending on your vLLM setup

# For Ollama, no API key required

# For other providers, check LiteLLM docs
```
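If you drive PatchPal from scripts, a quick pre-flight check that the relevant key is actually exported can save a confusing failure later. A minimal sketch (`require_env` is a hypothetical helper; substitute the variable for your provider):

```python
import os

def require_env(name: str) -> str:
    # Fail fast if a provider key is missing, e.g.
    # require_env("ANTHROPIC_API_KEY") for the default model.
    value = os.environ.get(name)
    if not value:
        raise SystemExit(f"{name} is not set; export it before running patchpal")
    return value
```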


3. **Run PatchPal**:
```bash
# Use default model (anthropic/claude-sonnet-4-5)
patchpal

# Use a specific model via command-line argument
patchpal --model openai/gpt-5.2-codex  # or openai/gpt-5-mini, anthropic/claude-opus-4-5, etc.

# Use vLLM (local)
# Note: vLLM server must be started with --tool-call-parser and --enable-auto-tool-choice
export HOSTED_VLLM_API_BASE=http://localhost:8000
export HOSTED_VLLM_API_KEY=token-abc123
patchpal --model hosted_vllm/openai/gpt-oss-120b

# Use Ollama (local - requires OLLAMA_CONTEXT_LENGTH=32768)
export OLLAMA_CONTEXT_LENGTH=32768
patchpal --model ollama_chat/gpt-oss:120b

# Or set the model via environment variable
export PATCHPAL_MODEL=openai/gpt-5.2
patchpal
```

**Tip for Local Models:** Local models (e.g., models served by Ollama or vLLM) may work better with these settings:
- `PATCHPAL_MINIMAL_TOOLS=true` and `PATCHPAL_ENABLE_WEB=false` - For models **with** function calling: Provides only essential tools (`read_file`, `read_lines`, `write_file`, `edit_file`, `run_shell`), reducing tool confusion
- `PATCHPAL_REACT_MODE=true` - For models **without** function calling: Enables text-based tool invocation (see [ReAct mode docs](https://amaiya.github.io/patchpal/models/local-models/#react-mode-for-models-without-function-calling))
- For Ollama, additionally setting `PATCHPAL_STREAM_OUTPUT=false` [may help with tool call reliability](https://github.com/openclaw/openclaw/issues/5769)
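When launching PatchPal from a wrapper script, the settings above can be merged over the current environment and passed to `subprocess.run(..., env=env)` (a sketch; drop `PATCHPAL_STREAM_OUTPUT` if you are not using Ollama):

```python
import os

# Recommended local-model settings from the tips above, merged over
# the current environment; pass as env= to subprocess.run.
env = dict(
    os.environ,
    PATCHPAL_MINIMAL_TOOLS="true",
    PATCHPAL_ENABLE_WEB="false",
    PATCHPAL_STREAM_OUTPUT="false",  # Ollama-specific tip
)
```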


## Beyond Coding: General Problem-Solving

While originally designed for software development, PatchPal is also a general-purpose assistant. With web search, file operations, shell commands, and custom tools/skills, it can help with research, data analysis, document processing, log file analysis, and more.

<img src="https://raw.githubusercontent.com/amaiya/patchpal/refs/heads/main/assets/patchpal_assistant.png" alt="PatchPal as General Assistant" width="650"/>

## FAQ

> There are so many coding agent harnesses. Why build yet another one?

1. Most agent harnesses are in TypeScript. We wanted [something in Python](https://amaiya.github.io/patchpal/usage/python-api/) that we could easily extend for our custom workflows.
2. PatchPal includes a [unique guardrails system](https://amaiya.github.io/patchpal/configuration/#security-permissions) well suited to privacy-conscious use cases involving sensitive data.
3. We needed an agent harness that seamlessly works with [both local and cloud models](https://amaiya.github.io/patchpal/models/overview/#supported-models), including AWS GovCloud Bedrock models.


## Documentation

Full documentation is [available here](https://amaiya.github.io/patchpal/).
