Metadata-Version: 2.4
Name: libagent-ai
Version: 0.1.1
Summary: A simple LLM client wrapper for OpenAI-compatible APIs
Project-URL: Homepage, https://github.com/coustea/agentflow
Project-URL: Repository, https://github.com/coustea/agentflow
Author-email: coustea <liushaojie2006@gmail.com>
License: MIT
Keywords: ai,chatgpt,llm,openai
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Python: >=3.10
Requires-Dist: openai>=2.15.0
Requires-Dist: python-dotenv>=1.2.1
Description-Content-Type: text/markdown

# LLM Agent

A simple LLM client and AI Agent framework for OpenAI-compatible APIs.

## Features

- **LLM**: Basic client for OpenAI-compatible APIs with streaming support
- **ReActAgent**: Reasoning and Acting agent (coming soon)
- **PlanAndSolveAgent**: Planning and problem-solving agent (coming soon)
- **ReflectionAgent**: Self-reflective agent (coming soon)

## Installation

```bash
pip install libagent-ai
```

## Usage

### LLM Client

```python
from libagent_ai import LLM

llm = LLM(
    model="glm-4.7",
    apiKey="your-api-key",
    baseUrl="https://api.example.com/v1"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
]

response = llm.think(messages)
print(response)
```
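The `messages` list follows the standard OpenAI chat format: a list of dicts with `role` and `content` keys. For multi-turn conversations you append each reply back onto the history before the next call. A plain-Python sketch, independent of the library (the `add_turn` helper is hypothetical, not part of libagent-ai):

```python
def add_turn(messages, role, content):
    """Append one chat turn to the history and return the list.

    Valid roles in the OpenAI chat format are "system", "user",
    and "assistant".
    """
    if role not in ("system", "user", "assistant"):
        raise ValueError(f"unknown role: {role}")
    messages.append({"role": role, "content": content})
    return messages

history = [{"role": "system", "content": "You are a helpful assistant."}]
add_turn(history, "user", "Hello!")
# After llm.think(history) returns, record the reply so the next
# call sees the full conversation:
add_turn(history, "assistant", "Hi! How can I help?")
```

Keeping the whole history in one list is what lets a stateless chat-completions API behave like a conversation.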

### Using Environment Variables

Create a `.env` file:

```bash
LLM_API_KEY=your-api-key
LLM_BASE_URL=https://api.example.com/v1
```

Then use:

```python
from libagent_ai import LLM

llm = LLM(model="glm-4.7")

messages = [{"role": "user", "content": "Hello!"}]
response = llm.think(messages)
print(response)
```
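Presumably the client falls back to these environment variables whenever the key or base URL is not passed explicitly. A minimal sketch of that resolution logic, assuming the `LLM_API_KEY`/`LLM_BASE_URL` names from the `.env` example above (this is not the library's actual code):

```python
import os

def resolve_credentials(api_key=None, base_url=None):
    """Return (api_key, base_url): explicit arguments win,
    otherwise fall back to the environment."""
    api_key = api_key or os.getenv("LLM_API_KEY")
    base_url = base_url or os.getenv("LLM_BASE_URL")
    if not api_key:
        raise ValueError("no API key: pass one or set LLM_API_KEY")
    return api_key, base_url

# With the .env values loaded into the environment:
os.environ["LLM_API_KEY"] = "your-api-key"
os.environ["LLM_BASE_URL"] = "https://api.example.com/v1"
key, url = resolve_credentials()
```

In practice `python-dotenv`'s `load_dotenv()` would populate `os.environ` from the `.env` file before this lookup runs.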

## Development

```bash
# Install dependencies
uv sync

# Smoke-test by running the llm module directly
uv run python -m libagent_ai.llm
```

## License

MIT