Metadata-Version: 2.4
Name: npcpy
Version: 1.3.30
Summary: npcpy is the premier open-source library for integrating LLMs and Agents into python systems.
Home-page: https://github.com/NPC-Worldwide/npcpy
Author: Christopher Agostino
Author-email: info@npcworldwi.de
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: jinja2
Requires-Dist: litellm
Requires-Dist: scipy
Requires-Dist: numpy
Requires-Dist: requests
Requires-Dist: docx
Requires-Dist: exa-py
Requires-Dist: elevenlabs
Requires-Dist: matplotlib
Requires-Dist: markdown
Requires-Dist: networkx
Requires-Dist: PyYAML
Requires-Dist: PyMuPDF
Requires-Dist: pyautogui
Requires-Dist: pydantic
Requires-Dist: pygments
Requires-Dist: sqlalchemy
Requires-Dist: termcolor
Requires-Dist: rich
Requires-Dist: colorama
Requires-Dist: docstring_parser
Requires-Dist: Pillow
Requires-Dist: python-dotenv
Requires-Dist: pandas
Requires-Dist: beautifulsoup4
Requires-Dist: duckduckgo-search
Requires-Dist: flask
Requires-Dist: flask_cors
Requires-Dist: redis
Requires-Dist: psycopg2-binary
Requires-Dist: flask_sse
Requires-Dist: mcp
Provides-Extra: lite
Requires-Dist: anthropic; extra == "lite"
Requires-Dist: openai; extra == "lite"
Requires-Dist: ollama; extra == "lite"
Requires-Dist: google-generativeai; extra == "lite"
Requires-Dist: google-genai; extra == "lite"
Provides-Extra: local
Requires-Dist: sentence_transformers; extra == "local"
Requires-Dist: opencv-python; extra == "local"
Requires-Dist: ollama; extra == "local"
Requires-Dist: chromadb; extra == "local"
Requires-Dist: diffusers; extra == "local"
Requires-Dist: torch; extra == "local"
Requires-Dist: datasets; extra == "local"
Requires-Dist: airllm; extra == "local"
Provides-Extra: yap
Requires-Dist: pyaudio; extra == "yap"
Requires-Dist: gtts; extra == "yap"
Requires-Dist: playsound==1.2.2; extra == "yap"
Requires-Dist: pygame; extra == "yap"
Requires-Dist: faster_whisper; extra == "yap"
Requires-Dist: pyttsx3; extra == "yap"
Provides-Extra: all
Requires-Dist: anthropic; extra == "all"
Requires-Dist: openai; extra == "all"
Requires-Dist: ollama; extra == "all"
Requires-Dist: google-generativeai; extra == "all"
Requires-Dist: google-genai; extra == "all"
Requires-Dist: sentence_transformers; extra == "all"
Requires-Dist: opencv-python; extra == "all"
Requires-Dist: ollama; extra == "all"
Requires-Dist: chromadb; extra == "all"
Requires-Dist: diffusers; extra == "all"
Requires-Dist: torch; extra == "all"
Requires-Dist: datasets; extra == "all"
Requires-Dist: airllm; extra == "all"
Requires-Dist: pyaudio; extra == "all"
Requires-Dist: gtts; extra == "all"
Requires-Dist: playsound==1.2.2; extra == "all"
Requires-Dist: pygame; extra == "all"
Requires-Dist: faster_whisper; extra == "all"
Requires-Dist: pyttsx3; extra == "all"
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: license-file
Dynamic: provides-extra
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

<p align="center">
  <a href="https://npcpy.readthedocs.io/">
  <img src="https://raw.githubusercontent.com/cagostino/npcpy/main/npcpy/npc-python.png" alt="npc-python logo" width=250></a>
</p>

# npcpy

`npcpy` is a flexible agent framework for building AI applications and conducting research with LLMs. It supports local and cloud providers, multi-agent teams, tool calling, image/audio/video generation, knowledge graphs, fine-tuning, and more.

```bash
pip install npcpy
```

## Quick Examples

### Agent with persona

```python
from npcpy.npc_compiler import NPC

simon = NPC(
    name='Simon Bolivar',
    primary_directive='Liberate South America from the Spanish Royalists.',
    model='gemma3:4b',
    provider='ollama'
)
response = simon.get_llm_response("What is the most important territory to retain in the Andes?")
print(response['response'])
```

### Direct LLM call

```python
from npcpy.llm_funcs import get_llm_response

response = get_llm_response("Who was the celtic messenger god?", model='qwen3:4b', provider='ollama')
print(response['response'])
```

### Agent with tools

```python
import os
from npcpy.npc_compiler import NPC

def list_files(directory: str = ".") -> list:
    """List the entries (files and subdirectories) in a directory."""
    return os.listdir(directory)

def read_file(filepath: str) -> str:
    """Read and return the contents of a file."""
    with open(filepath, 'r') as f:
        return f.read()

assistant = NPC(
    name='File Assistant',
    primary_directive='You help users explore files.',
    model='llama3.2',
    provider='ollama',
    tools=[list_files, read_file],
)
response = assistant.get_llm_response("List the files in the current directory.")
print(response['response'])

# Access individual tool results
for result in response.get('tool_results', []):
    print(f"{result['tool_name']}: {result['result']}")
```
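
Tool schemas are derived from each function's signature and docstring (npcpy depends on `docstring_parser` for this). The mapping can be sketched with the standard library alone; the schema shape below is illustrative, not necessarily npcpy's exact format:

```python
import inspect
import os

def list_files(directory: str = ".") -> list:
    """List the entries (files and subdirectories) in a directory."""
    return os.listdir(directory)

def to_tool_schema(fn):
    """Build a JSON-schema-style tool description from a function's signature."""
    sig = inspect.signature(fn)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        properties[name] = {"type": "string"}  # simplified: treat every parameter as a string
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default means the model must supply it
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn),
        "parameters": {"type": "object", "properties": properties, "required": required},
    }

schema = to_tool_schema(list_files)
print(schema["name"])  # list_files
```

Because `directory` has a default, it does not appear in `required`, so the model may omit it.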

### Streaming responses

```python
from npcpy.llm_funcs import get_llm_response

response = get_llm_response(
    "Tell me about the history of the Inca Empire.",
    model='llama3.2',
    provider='ollama',
    stream=True
)

for chunk in response['response']:
    msg = chunk.get('message', {})
    print(msg.get('content', ''), end='', flush=True)
```
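
When you want the full answer after streaming finishes, accumulate the chunks into one string. A minimal sketch using mock chunks in the Ollama-style shape the loop above consumes:

```python
# Mock chunks shaped like those yielded by the streaming loop above.
chunks = [
    {"message": {"content": "The Inca Empire "}},
    {"message": {"content": "arose in the Andes."}},
    {"done": True},  # terminal chunks may carry no content
]

full_text = "".join(
    chunk.get("message", {}).get("content", "") for chunk in chunks
)
print(full_text)  # The Inca Empire arose in the Andes.
```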

### JSON output

```python
from npcpy.llm_funcs import get_llm_response

response = get_llm_response(
    "List 3 planets with their distances from the sun in AU.",
    model='llama3.2',
    provider='ollama',
    format='json'
)
print(response['response'])
```

### Multi-agent team orchestration

```python
from npcpy.npc_compiler import NPC, Team

# Create specialist agents
coordinator = NPC(
    name='coordinator',
    primary_directive='''You coordinate a team of specialists.
    Delegate tasks by mentioning @analyst for data questions or @writer for content.
    Synthesize their responses into a final answer.''',
    model='llama3.2',
    provider='ollama'
)

analyst = NPC(
    name='analyst',
    primary_directive='You analyze data and provide insights with specific numbers.',
    model='~/models/mistral-7b-instruct-v0.2.Q4_K_M.gguf',
    provider='llamacpp'
)

writer = NPC(
    name='writer',
    primary_directive='You write clear, engaging summaries and reports.',
    model='gemini-2.5-flash',
    provider='gemini'
)

# Create team - coordinator (forenpc) automatically delegates via @mentions
team = Team(npcs=[coordinator, analyst, writer], forenpc='coordinator')

# Orchestrate a request - coordinator decides who to involve
result = team.orchestrate("What are the trends in renewable energy adoption?")
print(result['output'])
```
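
Delegation hinges on plain `@mentions` in the coordinator's output: the orchestrator only needs to find which team members a response names. A rough sketch of that routing step (illustrative, not npcpy's internal implementation):

```python
import re

team_names = {"analyst", "writer"}

def mentioned_npcs(text: str, names: set) -> list:
    """Return team members referenced as @name in a response, in order of first mention."""
    found = []
    for match in re.findall(r"@(\w+)", text):
        if match in names and match not in found:
            found.append(match)
    return found

reply = "I'll ask @analyst for the adoption numbers, then @writer to summarize."
print(mentioned_npcs(reply, team_names))  # ['analyst', 'writer']
```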

### Initialize a team

Installing `npcpy` also installs two command-line tools:
- **`npc`** — CLI for project management and one-off commands
- **`npcsh`** — Interactive shell for chatting with agents and running jinxs

```bash
# Using npc CLI
npc init ./my_project

# Using npcsh (interactive)
npcsh
📁 ~/projects
🤖 npcsh | llama3.2
> /init directory=./my_project
> what files are in the current directory?
```

This creates:
```
my_project/
├── npc_team/
│   ├── forenpc.npc      # Default coordinator
│   ├── jinxs/           # Workflows
│   │   └── skills/      # Knowledge skills
│   ├── tools/           # Custom tools
│   └── triggers/        # Event triggers
├── images/
├── models/
└── mcp_servers/
```

Then add your agents:
```bash
# Add team context
cat > my_project/npc_team/team.ctx << 'EOF'
context: Research and analysis team
forenpc: lead
model: llama3.2
provider: ollama
EOF

# Add agents
cat > my_project/npc_team/lead.npc << 'EOF'
name: lead
primary_directive: |
  You lead the team. Delegate to @researcher for data
  and @writer for content. Synthesize their output.
EOF

cat > my_project/npc_team/researcher.npc << 'EOF'
name: researcher
primary_directive: You research topics and provide detailed findings.
model: gemini-2.5-flash
provider: gemini
EOF

cat > my_project/npc_team/writer.npc << 'EOF'
name: writer
primary_directive: You write clear, engaging content.
model: qwen3:8b
provider: ollama
EOF
```

### Team directory structure

```
npc_team/
├── team.ctx           # Team configuration
├── coordinator.npc    # Coordinator agent
├── analyst.npc        # Specialist agent
├── writer.npc         # Specialist agent
└── jinxs/             # Optional workflows
    └── research.jinx
```

**team.ctx** - Team configuration:
```yaml
context: |
  A research team that analyzes topics and produces reports.
  The coordinator delegates to specialists as needed.
forenpc: coordinator
model: llama3.2
provider: ollama
mcp_servers:
  - ~/.npcsh/mcp_server.py
```

**coordinator.npc** - Agent definition:
```yaml
name: coordinator
primary_directive: |
  You coordinate research tasks. Delegate to @analyst for data
  analysis and @writer for content creation. Synthesize results.
model: llama3.2
provider: ollama
```

**analyst.npc** - Specialist agent:
```yaml
name: analyst
primary_directive: |
  You analyze data and provide insights with specific numbers and trends.
model: qwen3:8b
provider: ollama
```

### Team from directory

```python
from npcpy.npc_compiler import Team

# Load team from directory with .npc files and team.ctx
team = Team(team_path='./npc_team')

# Orchestrate through the forenpc (set in team.ctx)
result = team.orchestrate("Analyze the sales data and write a summary")
print(result['output'])
```

### Agent with skills

Skills are knowledge-content jinxs that provide instructional sections to agents on demand.

**1. Create a skill file** (`npc_team/jinxs/skills/code-review/SKILL.md`):
```markdown
---
name: code-review
description: Use when reviewing code for quality, security, and best practices.
---
# Code Review Skill

## checklist
- Check for security vulnerabilities (SQL injection, XSS, etc.)
- Verify error handling and edge cases
- Review naming conventions and code clarity

## security
Focus on OWASP top 10 vulnerabilities...
```

**2. Reference it in your NPC** (`npc_team/reviewer.npc`):
```yaml
name: reviewer
primary_directive: You review code for quality and security issues.
model: llama3.2
provider: ollama
jinxs:
  - skills/code-review
```

**3. Use the NPC:**
```python
from npcpy.npc_compiler import NPC

# Load NPC from file - skills are automatically available as callable jinxs
reviewer = NPC(file='./npc_team/reviewer.npc')
response = reviewer.get_llm_response("Review this function: def login(user, pwd): ...")
print(response['response'])
```

Skills let the agent request specific knowledge sections (like `checklist` or `security`) as needed during responses.
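
The SKILL.md layout above is just YAML front matter followed by `##` sections. A stdlib-only sketch of how such a file splits into metadata and named sections (illustrative; npcpy's actual loader may differ):

```python
skill_md = """---
name: code-review
description: Use when reviewing code for quality, security, and best practices.
---
# Code Review Skill

## checklist
- Check for security vulnerabilities

## security
Focus on OWASP top 10 vulnerabilities...
"""

# Split off the front matter, then group body lines under their ## headings.
_, front, body = skill_md.split("---", 2)
meta = dict(line.split(": ", 1) for line in front.strip().splitlines())

sections, current = {}, None
for line in body.splitlines():
    if line.startswith("## "):
        current = line[3:].strip()
        sections[current] = []
    elif current is not None:
        sections[current].append(line)

print(meta["name"])      # code-review
print(sorted(sections))  # ['checklist', 'security']
```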

### Agent with MCP server

Connect any MCP server to an NPC, and its tools become available for agentic tool calling:

```python
from npcpy.npc_compiler import NPC
from npcpy.serve import MCPClientNPC

# Connect to your MCP server
mcp = MCPClientNPC()
mcp.connect_sync('./my_mcp_server.py')

# Create an NPC
assistant = NPC(
    name='Assistant',
    primary_directive='You help users with tasks using available tools.',
    model='llama3.2',
    provider='ollama'
)

# Pass MCP tools to get_llm_response - the agent handles tool calls automatically
response = assistant.get_llm_response(
    "Search the database for recent orders",
    tools=mcp.available_tools_llm,
    tool_map=mcp.tool_map
)
print(response['response'])

# Clean up when done
mcp.disconnect_sync()
```

Example MCP server (`my_mcp_server.py`):
```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My Tools")

@mcp.tool()
def search_database(query: str) -> str:
    """Search the database for records matching the query."""
    return f"Found results for: {query}"

@mcp.tool()
def send_notification(message: str, channel: str = "general") -> str:
    """Send a notification to a channel."""
    return f"Sent '{message}' to #{channel}"

if __name__ == "__main__":
    mcp.run()
```

**MCPClientNPC methods:**
- `connect_sync(server_path)` — Connect to an MCP server script
- `disconnect_sync()` — Disconnect from the server
- `available_tools_llm` — Tool schemas for LLM consumption
- `tool_map` — Dict mapping tool names to callable functions
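
The `tool_map` is the dispatch half of the loop: when the model requests a tool call, the agent resolves the name to a callable and invokes it. A hypothetical dispatch step (names and shapes here are illustrative, not MCPClientNPC internals):

```python
# A tool_map pairs tool names with callables, as MCPClientNPC exposes.
def search_database(query: str) -> str:
    """Stand-in for a real MCP-backed tool."""
    return f"Found results for: {query}"

tool_map = {"search_database": search_database}

# A model's tool call, reduced to name + keyword arguments.
call = {"name": "search_database", "arguments": {"query": "recent orders"}}

fn = tool_map[call["name"]]
result = fn(**call["arguments"])
print(result)  # Found results for: recent orders
```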

### Image generation

```python
from npcpy.llm_funcs import gen_image

images = gen_image("A sunset over the mountains", model='sdxl', provider='diffusers')
images[0].save("sunset.png")
```

## Features

- **[Agents (NPCs)](https://npcpy.readthedocs.io/en/latest/guides/agents/)** — Agents with personas, directives, and tool calling
- **[Multi-Agent Teams](https://npcpy.readthedocs.io/en/latest/guides/teams/)** — Team orchestration with a coordinator (forenpc)
- **[Jinx Workflows](https://npcpy.readthedocs.io/en/latest/guides/jinx-workflows/)** — Jinja Execution templates for multi-step prompt pipelines
- **[Skills](https://npcpy.readthedocs.io/en/latest/guides/skills/)** — Knowledge-content jinxs that serve instructional sections to agents on demand
- **[NPCArray](https://npcpy.readthedocs.io/en/latest/guides/npc-array/)** — NumPy-like vectorized operations over model populations
- **[Image, Audio & Video](https://npcpy.readthedocs.io/en/latest/guides/image-audio-video/)** — Generation via Ollama, diffusers, OpenAI, Gemini
- **[Knowledge Graphs](https://npcpy.readthedocs.io/en/latest/guides/knowledge-graphs/)** — Build and evolve knowledge graphs from text
- **[Fine-Tuning & Evolution](https://npcpy.readthedocs.io/en/latest/guides/fine-tuning/)** — SFT, RL, diffusion, genetic algorithms
- **[Serving](https://npcpy.readthedocs.io/en/latest/guides/serving/)** — Flask server for deploying teams via REST API
- **[ML Functions](https://npcpy.readthedocs.io/en/latest/guides/ml-funcs/)** — Scikit-learn grid search, ensemble prediction, PyTorch training
- **[Streaming & JSON](https://npcpy.readthedocs.io/en/latest/guides/llm-responses/)** — Streaming responses, structured JSON output, message history

## Providers

Works with all major LLM providers through LiteLLM: `ollama`, `openai`, `anthropic`, `gemini`, `deepseek`, `airllm`, `openai-like`, and more.

## Installation

```bash
pip install npcpy                # base
pip install "npcpy[lite]"        # + API provider libraries
pip install "npcpy[local]"       # + ollama, diffusers, transformers, airllm
pip install "npcpy[yap]"         # + TTS/STT
pip install "npcpy[all]"         # everything
```

<details><summary>System dependencies</summary>

**Linux:**
```bash
sudo apt-get install espeak portaudio19-dev python3-pyaudio ffmpeg libcairo2-dev libgirepository1.0-dev
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.2
```

**macOS:**
```bash
brew install portaudio ffmpeg pygobject3 ollama
brew services start ollama
ollama pull llama3.2
```

**Windows:** Install [Ollama](https://ollama.com) and [ffmpeg](https://ffmpeg.org), then `ollama pull llama3.2`.

</details>

API keys go in a `.env` file as plain `KEY=value` lines (loaded via `python-dotenv`):
```bash
OPENAI_API_KEY="your_key"
ANTHROPIC_API_KEY="your_key"
GEMINI_API_KEY="your_key"
```

## Read the Docs

Full documentation, guides, and API reference at [npcpy.readthedocs.io](https://npcpy.readthedocs.io/en/latest/).

## Links

- **[Incognide](https://github.com/cagostino/incognide)** — GUI for the NPC Toolkit ([download](https://enpisi.com/incognide))
- **[NPC Shell](https://github.com/npc-worldwide/npcsh)** — Command-line shell for interacting with NPCs
- **[Newsletter](https://forms.gle/n1NzQmwjsV4xv1B2A)** — Stay in the loop

## Research

- Quantum-like nature of natural language interpretation: [arxiv](https://arxiv.org/abs/2506.10077), accepted at [QNLP 2025](https://qnlp.ai)
- Simulating hormonal cycles for AI: [arxiv](https://arxiv.org/abs/2508.11829)

Has your research benefited from npcpy? Let us know!

## Support

[Monthly donation](https://buymeacoffee.com/npcworldwide) | [Merch](https://enpisi.com/shop) | Consulting: info@npcworldwi.de

## Contributing

Contributions welcome! Submit issues and pull requests on the [GitHub repository](https://github.com/NPC-Worldwide/npcpy).

## License

MIT License.

## Star History

[![Star History Chart](https://api.star-history.com/svg?repos=cagostino/npcpy&type=Date)](https://star-history.com/#cagostino/npcpy&Date)
