Metadata-Version: 2.4
Name: lazybridge
Version: 0.4.0
Summary: Zero-boilerplate multi-provider LLM agent framework
Project-URL: Homepage, https://github.com/selvaz/LazyBridge
Project-URL: Repository, https://github.com/selvaz/LazyBridge
Project-URL: Documentation, https://github.com/selvaz/LazyBridge/tree/main/lazy_wiki
Project-URL: Changelog, https://github.com/selvaz/LazyBridge/blob/main/CHANGELOG.md
Project-URL: Bug Tracker, https://github.com/selvaz/LazyBridge/issues
Author: Marco Selvatici
License: Apache-2.0
License-File: LICENSE
Keywords: agents,anthropic,google,llm,multi-agent,openai,pipeline,tool-use
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.11
Requires-Dist: pydantic>=2.0.0
Provides-Extra: all
Requires-Dist: anthropic>=0.49.0; extra == 'all'
Requires-Dist: google-genai>=1.10.0; extra == 'all'
Requires-Dist: openai>=1.70.0; extra == 'all'
Requires-Dist: pypdf>=3.0; extra == 'all'
Requires-Dist: python-docx>=1.0; extra == 'all'
Requires-Dist: pyyaml>=6.0; extra == 'all'
Requires-Dist: trafilatura>=1.6; extra == 'all'
Provides-Extra: anthropic
Requires-Dist: anthropic>=0.49.0; extra == 'anthropic'
Provides-Extra: deepseek
Requires-Dist: openai>=1.70.0; extra == 'deepseek'
Provides-Extra: google
Requires-Dist: google-genai>=1.10.0; extra == 'google'
Provides-Extra: openai
Requires-Dist: openai>=1.70.0; extra == 'openai'
Provides-Extra: tools
Requires-Dist: pypdf>=3.0; extra == 'tools'
Requires-Dist: python-docx>=1.0; extra == 'tools'
Requires-Dist: trafilatura>=1.6; extra == 'tools'
Provides-Extra: yaml
Requires-Dist: pyyaml>=6.0; extra == 'yaml'
Description-Content-Type: text/markdown

# LazyBridge

Zero-boilerplate multi-provider LLM agent framework. One class for every LLM interaction, automatic tool schema generation, composable context injection, and serializable multi-agent pipelines.

## Quick start

```python
from lazybridge import LazyAgent

ai = LazyAgent("anthropic")
print(ai.text("What is the capital of France?"))
```

Same code on any provider — change one string:

```python
LazyAgent("openai")
LazyAgent("google")
LazyAgent("deepseek")
```

## Tool loop

```python
from lazybridge import LazyAgent, LazyTool

def get_weather(city: str) -> str:
    """Get current weather for a city."""
    return f"{city}: 22°C, sunny"

result = LazyAgent("anthropic").loop(
    "What's the weather in Rome and Paris?",
    tools=[LazyTool.from_function(get_weather)],
)
print(result.content)
```

The tool schema is generated automatically from the function's type hints and docstring: no hand-written JSON dict, no decorator boilerplate.
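The introspection this relies on is plain Python. As a rough illustration (a stand-alone sketch, not LazyBridge's internal code), a function's hints and docstring can be turned into a JSON-schema-like dict like this:

```python
import inspect
from typing import get_type_hints

def sketch_schema(fn):
    """Derive a minimal JSON-schema-like dict from a function's
    type hints and docstring (illustrative sketch only)."""
    hints = get_type_hints(fn)
    hints.pop("return", None)  # only parameters belong in the schema
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {
            "type": "object",
            "properties": {
                name: {"type": type_map.get(tp, "string")}
                for name, tp in hints.items()
            },
            "required": list(hints),
        },
    }

def get_weather(city: str) -> str:
    """Get current weather for a city."""
    return f"{city}: 22°C, sunny"

schema = sketch_schema(get_weather)
print(schema["name"])                       # get_weather
print(schema["parameters"]["properties"])   # {'city': {'type': 'string'}}
```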

## Conversational memory

```python
from lazybridge import LazyAgent, Memory

ai  = LazyAgent("anthropic")
mem = Memory()

ai.chat("My name is Marco", memory=mem)
resp = ai.chat("What's my name?", memory=mem)
print(resp.content)   # "Marco"
```
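Conceptually, a memory object just accumulates role/content turns and replays them with each request, which is how a stateless LLM API appears to "remember" earlier messages. A minimal stand-alone sketch (not LazyBridge's actual implementation):

```python
class SketchMemory:
    """Minimal conversation memory: an append-only list of messages
    replayed to the model on every call (illustrative sketch)."""

    def __init__(self):
        self.messages = []

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})

    def as_prompt(self) -> list[dict]:
        # The full history is sent with every request; the model itself
        # holds no state between calls.
        return list(self.messages)

mem = SketchMemory()
mem.add("user", "My name is Marco")
mem.add("assistant", "Nice to meet you, Marco!")
mem.add("user", "What's my name?")
print(len(mem.as_prompt()))   # 3
```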

## Structured output

```python
from pydantic import BaseModel
from lazybridge import LazyAgent

class Article(BaseModel):
    title: str
    summary: str
    tags: list[str]

article = LazyAgent("openai").json("Summarise AI in 2025", Article)
print(article.title)
```
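Structured output of this kind typically works by asking the model for JSON matching the schema and validating the reply with Pydantic. The validation step can be reproduced on its own (the raw string below is a simulated model reply, not real LLM output):

```python
from pydantic import BaseModel

class Article(BaseModel):
    title: str
    summary: str
    tags: list[str]

# Simulated model reply; in practice this JSON comes back from the LLM.
raw = '{"title": "AI in 2025", "summary": "A year of agents.", "tags": ["ai", "agents"]}'
article = Article.model_validate_json(raw)
print(article.title)   # AI in 2025
print(article.tags)    # ['ai', 'agents']
```

If the reply does not match the schema, `model_validate_json` raises a `ValidationError`, so malformed model output fails loudly instead of propagating.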

## Multi-agent pipeline

```python
from lazybridge import LazyAgent, LazySession, LazyContext, LazyTool

sess       = LazySession()
researcher = LazyAgent("anthropic", name="researcher", session=sess)
writer     = LazyAgent("openai",    name="writer",     session=sess)

def search(query: str) -> str:
    """Search for recent papers matching a query."""
    return f"Papers about {query}"

search_tool = LazyTool.from_function(search)
researcher.loop("Find top 3 AI papers this week", tools=[search_tool])
result = writer.chat(
    "Write a blog post",
    context=LazyContext.from_agent(researcher),
)
print(result.content)
print(sess.graph.to_json())   # serializable pipeline topology for GUI
```

## Native provider tools (web search, code execution, …)

```python
from lazybridge import LazyAgent
from lazybridge.core.types import NativeTool

ai = LazyAgent("anthropic")

resp = ai.chat(
    "What happened in AI this week?",
    native_tools=[NativeTool.WEB_SEARCH],
)
for src in resp.grounding_sources:
    print(src.url, src.title)
```

## Supported providers

| Provider | String | Default model |
|---|---|---|
| Anthropic | `"anthropic"` / `"claude"` | claude-sonnet-4-6 |
| OpenAI | `"openai"` / `"gpt"` | gpt-5.4 |
| Google | `"google"` / `"gemini"` | gemini-2.5-flash |
| DeepSeek | `"deepseek"` | deepseek-chat |

## Installation

```bash
pip install lazybridge

# Provider extras (choose what you need)
pip install "lazybridge[anthropic]"   # Anthropic / Claude
pip install "lazybridge[openai]"      # OpenAI / GPT
pip install "lazybridge[google]"      # Google / Gemini
pip install "lazybridge[deepseek]"    # DeepSeek
pip install "lazybridge[all]"         # all providers
```

## Ready-made tools

Drop-in tools for common agent tasks — each in its own folder with a README and tests.

| Module | What it does |
|---|---|
| `lazybridge.tools.doc_skills` | Index local docs with BM25, query from any agent. No vector DB, no embeddings API. |
| `lazybridge.tools.read_docs` | Reads `.txt`, `.md`, `.pdf`, `.docx`, and `.html` from a folder or a single file. Requires `pip install "lazybridge[tools]"`. |

### doc_skills — example

```python
from lazybridge.tools.doc_skills import build_skill, skill_tool
from lazybridge import LazyAgent

# Index your docs once — bundle persists to disk
meta = build_skill(["./docs"], "my-project")

# Load and use — works across restarts, no re-indexing
tool = skill_tool(meta["skill_dir"])
resp = LazyAgent("anthropic").loop("How does X work?", tools=[tool])
print(resp.content)
```
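BM25 itself is a simple lexical scoring formula over term frequencies, so no vector database or embeddings API is needed. A compact sketch of the standard BM25 score (illustrative only, not `doc_skills` internals):

```python
import math
from collections import Counter

def bm25_score(query_terms, doc, corpus, k1=1.5, b=0.75):
    """Score one tokenized document against query terms using
    standard BM25 (illustrative sketch, not doc_skills internals)."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    tf = Counter(doc)
    score = 0.0
    for term in query_terms:
        # Document frequency -> inverse document frequency
        df = sum(1 for d in corpus if term in d)
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1)
        # Term frequency, saturated by k1 and length-normalised by b
        f = tf[term]
        score += idf * (f * (k1 + 1)) / (f + k1 * (1 - b + b * len(doc) / avgdl))
    return score

corpus = [
    "the agent calls the tool".split(),
    "bm25 ranks documents by term frequency".split(),
]
# The document that actually contains the query terms scores higher.
print(bm25_score(["bm25", "term"], corpus[1], corpus) >
      bm25_score(["bm25", "term"], corpus[0], corpus))   # True
```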

### read_docs — example

```python
from lazybridge.tools.read_docs import read_folder_docs
from lazybridge import LazyAgent, LazyTool

docs_tool = LazyTool.from_function(read_folder_docs)
resp = LazyAgent("anthropic").loop(
    "Summarise all PDFs in /reports",
    tools=[docs_tool],
)
print(resp.content)
```

---

## Project structure

```
LazyBridge/
├── lazybridge/      # Installable package (pip install lazybridge)
│   ├── lazy_agent.py         # LazyAgent — single entry point for LLM calls
│   ├── lazy_session.py       # LazySession — shared store, events, graph
│   ├── lazy_tool.py          # LazyTool — tool schema + execution
│   ├── lazy_context.py       # LazyContext — composable system prompt injection
│   ├── lazy_store.py         # LazyStore — flat key-value blackboard (SQLite or in-memory)
│   ├── lazy_router.py        # LazyRouter — conditional branching node
│   ├── memory.py             # Memory — stateful conversation history
│   ├── graph/                # GraphSchema — serializable pipeline topology
│   └── core/                 # Provider adapters, executor, tool schema builder
├── tools/           # Tests and READMEs for lazybridge.tools
│   ├── doc_skills/           # test_doc_skills.py + README
│   └── read_docs/            # README
└── lazy_wiki/
    ├── bot/                  # LLM-optimised reference (exhaustive, structured)
    └── human/                # Human-readable guides and SDK comparison
```

## Documentation

| Audience / topic | Entry point |
|---|---|
| Developer | [`lazy_wiki/human/quickstart.md`](lazy_wiki/human/quickstart.md) |
| SDK comparison | [`lazy_wiki/human/comparison.md`](lazy_wiki/human/comparison.md) |
| LLM / AI assistant | [`lazy_wiki/bot/INDEX.md`](lazy_wiki/bot/INDEX.md) |
| Full API reference | [`lazy_wiki/bot/00_quickref.md`](lazy_wiki/bot/00_quickref.md) |

## License

[Apache 2.0](LICENSE)
