Metadata-Version: 2.4
Name: nineth
Version: 0.4.27
Summary: Model SDK built by the 9th District at Tooig
Project-URL: Homepage, https://github.com/districtt/rooster
Project-URL: Bug Tracker, https://github.com/districtt/rooster/issues
Author-email: "Tooig, Inc" <tooighq@gmail.com>, Oyebamijo <boy@oyebamijo.com>
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Requires-Python: >=3.10
Requires-Dist: httpx<1.0,>=0.27.2
Description-Content-Type: text/markdown

# nineth

`nineth` is the Python SDK for the 1984 model API, built by the 9th District at Tooig.

---

## Install

```bash
pip install nineth
export NINETH_API_KEY="your-api-key"
```

---

## How it works

Every request goes through `client.model.request(...)`.

- Pass a task. Get a response.
- Set `stream=True` to receive text as it arrives, word by word.
- Everything else — caching, service routing, continuity — is handled server-side.

---

## Models

| Name | Description |
|---|---|
| `1984-m3-0317` | Most capable. Best for research and complex tasks. |
| `1984-m2-preview` | Fast and powerful. Good for most tasks. |
| `1984-m2-light` | Lightweight model for quick general tasks. |
| `1984-m1-unified` | High-throughput unified model. |
| `1984-m0-brute` | Compact, efficient model. |
| `1984-m0-sm` | Smallest model, fastest responses. |

Set a default at client creation or pass `model=` per call.

---

## Cookbook

### 1 — Get a response

The simplest case. Ask something, get the answer.

```python
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request("Give me a tight BTC market brief.")
    print(response["final_response"])
```

`response` is a plain dict. The text is always in `response["final_response"]`.

---

### 2 — Stream the response live

Set `stream=True` to print text as it arrives.

```python
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    for event in client.model.request("Summarise crude oil today.", stream=True):
        if event["type"] == "model_delta":
            print(event["data"]["text"], end="", flush=True)
```

The last event in the stream is `type: result` and contains the full `final_response`
alongside `iterations`.
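If you want both the live text and the final summary, the loop above can be wrapped in a small helper that accumulates `model_delta` text and captures the closing `result` event. This is a sketch over plain event dicts as documented above; the helper itself does not depend on the SDK.

```python
def consume_stream(events):
    """Accumulate streamed text and capture the final `result` event.

    `events` is any iterable of event dicts shaped like the stream yields:
    `model_delta` chunks followed by a terminal `result` event.
    """
    chunks = []
    result = None
    for event in events:
        if event["type"] == "model_delta":
            chunks.append(event["data"]["text"])
        elif event["type"] == "result":
            result = event["data"]
    return "".join(chunks), result
```

Pass it `client.model.request(..., stream=True)` directly, or any other iterable of events.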

---

### 3 — Choose a different model per request

```python
from nineth import NinethClient

with NinethClient() as client:
    response = client.model.request(
        "What happened with Nvidia earnings?",
        model="1984-m2-light",
    )
    print(response["final_response"])
```

---

### 4 — Control reasoning depth

Use `reasoning` to hint at how deeply the model should think before answering.
Valid values: `"low"`, `"medium"`, `"high"`. Leave it out to use the model default.

```python
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Analyse the macro impact of a Fed rate pause.",
        reasoning="high",
    )
    print(response["final_response"])
```

---

### 5 — Show the model's reasoning

Set `show_reasoning=True` to include the model's internal chain-of-thought.
This is off by default.

```python
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Walk me through whether gold is trending or ranging.",
        reasoning="medium",
        show_reasoning=True,
    )
    for block in response.get("thinking", []):
        print("[thinking]", block)
    print(response["final_response"])
```

---

### 6 — Limit how many turns the model takes

`max_iterations` controls how many model turns the server runs.
The default is `10`. Most tasks finish in 1–3 turns.

```python
from nineth import NinethClient

with NinethClient(default_model="1984-m3-0317") as client:
    response = client.model.request(
        "Give me a one-paragraph ETH brief.",
        max_iterations=2,
    )
    print(response["final_response"])
```

---

### 7 — Async usage

```python
import asyncio
from nineth import AsyncNinethClient

async def main():
    async with AsyncNinethClient(default_model="1984-m3-0317") as client:
        response = await client.model.request(
            "Summarise macro risk factors this week.",
        )
        print(response["final_response"])

asyncio.run(main())
```

Async streaming works the same way:

```python
import asyncio
from nineth import AsyncNinethClient

async def main():
    async with AsyncNinethClient(default_model="1984-m3-0317") as client:
        async for event in await client.model.request(
            "Research BTC ETF flows.", stream=True
        ):
            if event["type"] == "model_delta":
                print(event["data"]["text"], end="", flush=True)

asyncio.run(main())
```

---

### 8 — Health check

No API key needed. Use this to verify the endpoint is reachable.

```python
from nineth import NinethClient

with NinethClient() as client:
    print(client.health())
# {'status': 'ok', 'timestamp': '2026-04-04T00:00:00+00:00'}
```

---

### 9 — Point the SDK at a different endpoint

```python
from nineth import NinethClient

with NinethClient(
    base_url="https://your-deployment.modal.run",
    api_key="your-key",
    default_model="1984-m3-0317",
) as client:
    response = client.model.request("Hello.")
    print(response["final_response"])
```

Or use environment variables:

```bash
export NINETH_BASE_URL="https://your-deployment.modal.run"
export NINETH_MODEL="1984-m3-0317"
```

---

## Response shape

### Buffered (`stream=False`)

```python
{
    "final_response": "Bitcoin is trading near...",
    "iterations": 2,
    "usage": {"prompt_tokens": 412, "completion_tokens": 88, "total_tokens": 500},
    "thinking": [],          # only populated when show_reasoning=True
    "service_calls": [...],
    "service_responses": [...],
    "events": [...],
}
```

Only `final_response` and `iterations` are guaranteed to be present on every response.
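Because only those two keys are guaranteed, it is safest to index them directly and read everything else with `.get()`. A minimal sketch (the formatting and the `summarize_response` name are illustrative, not part of the SDK):

```python
def summarize_response(response):
    """Index guaranteed fields directly; use .get() for optional ones."""
    text = response["final_response"]    # always present
    iterations = response["iterations"]  # always present
    usage = response.get("usage", {})    # may be absent
    total = usage.get("total_tokens", 0)
    return f"{iterations} turn(s), {total} tokens: {text}"
```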

### Streaming (`stream=True`)

Each loop iteration yields a dict:

```python
# Text arriving live
{"type": "model_delta", "data": {"text": "Bitcoin is trading..."}}

# Tool calls the model made
{"type": "service_call",     "data": {"service_name": "search_web", "params": {...}}}
{"type": "service_response", "data": {"service_name": "search_web", "success": True, "summary": {...}}}

# Final summary — always the last event
{"type": "result", "data": {"final_response": "...", "iterations": 2}}
```
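A stream consumer that handles all four event types can be written as a single dispatch function over these dicts. This is a sketch against the shapes listed above; the `handle_event` helper is illustrative, not part of the SDK:

```python
def handle_event(event):
    """Route one stream event by type.

    Returns a printable string, or None for the terminal `result` event
    (read `event["data"]["final_response"]` there instead).
    """
    kind = event["type"]
    data = event["data"]
    if kind == "model_delta":
        return data["text"]
    if kind == "service_call":
        return f"[call] {data['service_name']}"
    if kind == "service_response":
        status = "ok" if data.get("success") else "failed"
        return f"[{data['service_name']}] {status}"
    if kind == "result":
        return None
    return f"[unknown event: {kind}]"
```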

---

## Error handling

```python
from nineth import NinethClient, NinethAPIError

with NinethClient(default_model="1984-m3-0317") as client:
    try:
        response = client.model.request("Analyse ETH.")
    except NinethAPIError as exc:
        print("API error:", exc)
    except ValueError as exc:
        print("Configuration error:", exc)
```

---

## Authentication

Set `NINETH_API_KEY` in your environment or pass `api_key=` to the client constructor.
The health check endpoint does not require a key.
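One plausible resolution order, sketched below, is that an explicit `api_key=` argument wins over the environment variable. This is an assumption for illustration; the SDK's actual precedence is not documented here, and `resolve_api_key` is a hypothetical helper, not part of the package.

```python
import os

def resolve_api_key(api_key=None):
    # Hypothetical: an explicit argument takes precedence over
    # NINETH_API_KEY. Returns None if neither is set.
    return api_key or os.environ.get("NINETH_API_KEY")
```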
