Metadata-Version: 2.1
Name: rusticai-api
Version: 0.0.4
Summary: API Server for Rustic AI
Home-page: https://www.rustic.ai/
License: Apache-2.0
Author: Dragonscale Industries Inc.
Author-email: dev@dragonscale.ai
Requires-Python: >=3.12,<4.0
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Provides-Extra: test
Requires-Dist: fastapi (>=0.115.8,<0.116.0)
Requires-Dist: griffe (>=1.7.2,<2.0.0)
Requires-Dist: opentelemetry-distro (>=0.52b1,<0.53)
Requires-Dist: psycopg (>=3.2.6,<4.0.0)
Requires-Dist: ptvsd (>=4.3.2,<5.0.0)
Requires-Dist: python-multipart (>=0.0.20,<0.0.21)
Requires-Dist: rusticai-core (==0.0.6)
Requires-Dist: rusticai-ray (==0.0.5)
Requires-Dist: starlette (>=0.45.3,<0.46.0)
Requires-Dist: uvicorn (>=0.34.0,<0.35.0)
Requires-Dist: websockets (>=15.0.1,<16.0.0)
Project-URL: Repository, https://github.com/rustic-ai/python-framework
Project-URL: Rustic AI Core, https://pypi.org/project/rusticai-core/
Description-Content-Type: text/markdown

# Rustic AI API
This module provides the backend server for the Rustic AI framework and exposes the interface for creating and interacting with guilds.
Interaction with a guild is supported through a WebSocket interface, allowing for real-time communication and updates.
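As a rough illustration of the WebSocket flow, the sketch below uses the [`websockets`](https://pypi.org/project/websockets/) library (a declared dependency of this package). The endpoint path and message shape here are hypothetical placeholders, not the actual Rustic AI API; consult the server's API docs for the real schema.

```python
import asyncio
import json
from urllib.parse import urlunparse

def build_ws_url(host: str, port: int, path: str) -> str:
    """Build a ws:// URL for the API server (the path is a hypothetical placeholder)."""
    return urlunparse(("ws", f"{host}:{port}", path, "", "", ""))

async def listen(url: str) -> None:
    # Deferred import so the URL helper above also works without websockets installed.
    import websockets  # provided by the `websockets` dependency

    async with websockets.connect(url) as ws:
        # NOTE: this message format is illustrative only; see the API docs
        # served by the running server for the actual protocol.
        await ws.send(json.dumps({"action": "subscribe"}))
        async for message in ws:
            print(message)

# Example (requires a running server):
# asyncio.run(listen(build_ws_url("localhost", 8880, "/ws/guilds/example")))
```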

## Installing

```shell
pip install rusticai-api
```
**Note:** This package depends on [rusticai-core](https://pypi.org/project/rusticai-core/) and [rusticai-ray](https://pypi.org/project/rusticai-ray/).

## Running from source

1. Install required dependencies:
```shell
poetry install --with dev
poetry shell
```

2. Start the server:
```shell
# If using an external SQL database, set RUSTIC_METASTORE to the corresponding URL.
# For example, with Postgres: export RUSTIC_METASTORE=postgresql+psycopg://user:pwd@localhost:5432
./scripts/dev_server.sh
```
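The `RUSTIC_METASTORE` value follows the SQLAlchemy-style `dialect+driver://user:password@host:port` convention. As a quick sanity check (a stdlib-only sketch, not part of the server), a candidate URL can be pulled apart before exporting it:

```python
from urllib.parse import urlsplit

def check_metastore_url(url: str) -> dict:
    """Split a SQLAlchemy-style database URL into its parts for inspection."""
    parts = urlsplit(url)
    dialect, _, driver = parts.scheme.partition("+")
    return {
        "dialect": dialect,        # e.g. "postgresql"
        "driver": driver or None,  # e.g. "psycopg"
        "host": parts.hostname,
        "port": parts.port,
    }

info = check_metastore_url("postgresql+psycopg://user:pwd@localhost:5432")
# info["dialect"] == "postgresql", info["driver"] == "psycopg", info["port"] == 5432
```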

The server will be available at `http://localhost:8880` by default, and the API documentation at `http://localhost:8880/docs`.

## Running from source with Telemetry

1. Install required dependencies:
```shell
poetry install --with dev
poetry shell
```

2. Start the Zipkin server (requires [Docker](https://www.docker.com/get-started/)):
```shell
sudo chmod 777 scripts/zipkin/data-tmp/
./scripts/zipkin/zipkin_up.sh
```

3. Set the OpenTelemetry environment variables:
```shell
export OTEL_SERVICE_NAME=GuildCommunicationService
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="http://localhost:4318/v1/traces"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
```
Refer to the [OTLP exporter docs](https://opentelemetry.io/docs/languages/sdk-configuration/otlp-exporter/#endpoint-configuration)
for details on endpoint configuration.
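These variables are read by the OpenTelemetry SDK at startup. The snippet below (a stdlib-only sketch, not SDK code) shows the kind of resolution that happens, falling back to the standard OTLP/HTTP traces endpoint when the variable is unset:

```python
import os
from urllib.parse import urlsplit

# Standard OTLP/HTTP default for traces (port 4318, path /v1/traces).
DEFAULT_TRACES_ENDPOINT = "http://localhost:4318/v1/traces"

def resolve_traces_endpoint(env=os.environ) -> str:
    """Return the OTLP traces endpoint, falling back to the local default."""
    return env.get("OTEL_EXPORTER_OTLP_TRACES_ENDPOINT", DEFAULT_TRACES_ENDPOINT)

parts = urlsplit(resolve_traces_endpoint({}))
# parts.port == 4318, parts.path == "/v1/traces"
```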

4. Start the server:
```shell
./scripts/dev_server_with_otel.sh
```
Traces will be visible in the Zipkin UI at `http://localhost:9411/zipkin/`.

**Note:** To stop the Zipkin server, use `./scripts/zipkin/zipkin_down.sh`

