Metadata-Version: 2.3
Name: langchain-gradientai
Version: 0.1.15
Summary: An integration package connecting DigitalOcean and LangChain
License: MIT
Requires-Python: >=3.9
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Dist: c63a5cfe-b235-4fbe-8bbb-82a9e02a482a-python (>=0.1.0a17,<0.2.0)
Requires-Dist: langchain-core (>=0.3.15,<0.4.0)
Requires-Dist: langchain-tests (>=0.3.20,<0.4.0)
Requires-Dist: python-digitalocean (>=1.17.0,<2.0.0)
Requires-Dist: python-dotenv (>=1.1.1,<2.0.0)
Requires-Dist: typing_extensions (>=4.0.0,<5.0.0)
Project-URL: Repository, https://github.com/langchain-ai/langchain
Project-URL: Release Notes, https://github.com/langchain-ai/langchain/releases?q=tag%3A%22gradientai%3D%3D0%22&expanded=true
Project-URL: Source Code, https://github.com/langchain-ai/langchain/tree/master/libs/partners/gradientai
Description-Content-Type: text/markdown

# langchain-gradientai

This package contains the LangChain integration with DigitalOcean.

## Installation

```bash
pip install -U langchain-gradientai
```

Configure your credentials by setting the `DIGITALOCEAN_INFERENCE_KEY` environment variable:

1. Log in to the DigitalOcean Cloud console.
2. Go to the **GradientAI Platform** and navigate to **Serverless Inference**.
3. Click **Create model access key**, enter a name, and create the key.
4. Use the generated key as your `DIGITALOCEAN_INFERENCE_KEY`:

```bash
# 1. Copy the .env.example file to create a .env file
cp .env.example .env

# 2. Edit .env and add your access key:
DIGITALOCEAN_INFERENCE_KEY=your_access_key_here
```
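Before making any requests, it can help to verify the key is actually visible to your process. A minimal, package-free sketch of that check (the `get_inference_key` helper is illustrative, not part of `langchain-gradientai`):

```python
import os


def get_inference_key() -> str:
    """Return the DigitalOcean inference key, failing fast if it is unset."""
    key = os.getenv("DIGITALOCEAN_INFERENCE_KEY")
    if not key:
        # Fail early with a clear message instead of erroring mid-request.
        raise RuntimeError(
            "DIGITALOCEAN_INFERENCE_KEY is not set; see the setup steps above."
        )
    return key
```

You can then pass `get_inference_key()` wherever an `api_key` is expected.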

## Chat Models

The `ChatGradientAI` class exposes chat models from `langchain-gradientai`.

### Invoke

```python
import os

from langchain_gradientai import ChatGradientAI

llm = ChatGradientAI(
    model="llama3.3-70b-instruct",
    api_key=os.getenv("DIGITALOCEAN_INFERENCE_KEY")
)

result = llm.invoke("What is the capital of France?")
print(result)
```

### Stream

```python
import os

from langchain_gradientai import ChatGradientAI

llm = ChatGradientAI(
    model="llama3.3-70b-instruct",
    api_key=os.getenv("DIGITALOCEAN_INFERENCE_KEY")
)

for chunk in llm.stream("Tell me what happened to the dinosaurs."):
    print(chunk.content, end="", flush=True)
```
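Each streamed chunk carries a `content` fragment, so joining the fragments reconstructs the full response. A package-free sketch of that accumulation pattern (the `Chunk` class and `fake_stream` generator below are illustrative stand-ins for the real chunks and `llm.stream`):

```python
from dataclasses import dataclass
from typing import Iterator


@dataclass
class Chunk:
    """Stand-in for a streamed message chunk holding a text fragment."""
    content: str


def fake_stream(text: str) -> Iterator[Chunk]:
    # Illustrative stand-in for llm.stream(): yields the reply word by word.
    for word in text.split(" "):
        yield Chunk(content=word + " ")


# Accumulate fragments exactly as the streaming loop above prints them.
full = "".join(chunk.content for chunk in fake_stream("An asteroid impact ended them"))
print(full.strip())
```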

More features coming soon.

