Metadata-Version: 2.4
Name: py_aide
Version: 0.17.0
Summary: Modern Python 3.11+ Framework: Flask/FastAPI orchestration and strict runtime enforcement.
Author-email: Kakuru Douglas <vicaniddouglas@gmail.com>
License: MIT
Project-URL: Homepage, https://gitlab.com/vicaniddouglas/py_aide
Project-URL: Bug Tracker, https://gitlab.com/vicaniddouglas/py_aide/-/issues
Keywords: framework,flask,fastapi,runtime-enforcement,threading,websockets
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: POSIX :: Linux
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: flask>=3.0.0
Requires-Dist: flask-cors>=4.0.0
Requires-Dist: flask-socketio>=5.3.6
Requires-Dist: eventlet>=0.33.3
Requires-Dist: fastapi>=0.100.0
Requires-Dist: cryptography>=41.0.0
Requires-Dist: argon2-cffi>=23.1.0
Requires-Dist: pydantic>=1.10.0
Requires-Dist: httpx>=0.28.1
Provides-Extra: dev
Requires-Dist: pytest>=7.0.0; extra == "dev"
Requires-Dist: pytest-cov>=4.0.0; extra == "dev"
Requires-Dist: black>=23.0.0; extra == "dev"
Requires-Dist: isort>=5.12.0; extra == "dev"
Requires-Dist: mypy>=1.5.0; extra == "dev"
Dynamic: license-file

# py_aide

**Modern Python 3.11+ Framework: Flask/FastAPI Orchestration & Strict Runtime Enforcement**

`py_aide` is a robust, developer-centric framework designed to bring safety, structure, and consistency to Python web applications. It provides a unique "Strict Runtime Enforcement" layer that ensures your code remains clean, documented, and type-safe at execution time.

## 🚀 Key Features

- **Strict Runtime Enforcement**: A powerful decorator that enforces type hints, docstrings, and calling conventions (positional-only/keyword-only) at runtime.
- **Unified Server Portal**: Seamlessly orchestrate Flask and FastAPI applications with shared security gates and auto-discovery of routes.
- **WebSocket Excellence**: High-performance, identity-aware WebSockets with support for User and Group-based messaging, automatic handshake mapping, and tiered delivery.
- **Thread-Safe SQL**: A thread-level multiton SQLite manager with automatic JSON serialization, transaction tracking, and schema management.
- **Modern Security**: First-class support for Bearer tokens, API keys, and Fernet-based encryption.
- **Enterprise HTTP Client**: A standardized, "always resolve" HTTP client with built-in retries, interceptors, and dual sync/async support.
- **Rich Utilities**: Built-in handlers for images (base64/files), dates (aware/naive conversions), and custom data structures.

## 📦 Installation
```bash
pip install py_aide
```

> **Note**: `py_aide` requires Python 3.11+ and is currently optimized for Linux environments.

## ⚙️ Core Philosophy: Strict Enforcement

At the heart of `py_aide` is the `@enforce_requirements` decorator. It's designed to prevent "sloppy" code by failing early if:

- A function is missing a docstring.
- A parameter or return value is missing a type hint.
- A function uses more than 8 arguments (promoting better decomposition).
- Calling conventions (`/` or `*`) are not explicitly defined.

```python
from py_aide.enforcer import enforce_requirements

@enforce_requirements
def create_user(name: str, age: int, /) -> dict:
    """Creates a new user dict."""
    return {"name": name, "age": age}
```
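
The enforcement idea itself can be sketched with only the standard library. The names below (`require_docstring_and_hints`) are illustrative, not py_aide's API, and the `/`/`*` calling-convention check is omitted for brevity:

```python
import inspect
from functools import wraps

def require_docstring_and_hints(func):
    """Illustrative sketch: reject functions missing a docstring or type hints."""
    if not func.__doc__:
        raise TypeError(f"{func.__name__} is missing a docstring")
    sig = inspect.signature(func)
    if len(sig.parameters) > 8:
        raise TypeError(f"{func.__name__} takes more than 8 arguments")
    for name, param in sig.parameters.items():
        if param.annotation is inspect.Parameter.empty:
            raise TypeError(f"parameter {name!r} of {func.__name__} lacks a type hint")
    if sig.return_annotation is inspect.Signature.empty:
        raise TypeError(f"{func.__name__} is missing a return type hint")

    @wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper
```

Because the checks run at decoration time, a non-compliant function fails at import, not deep inside a request handler.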

## 🌐 Unified Server Example (Flask)

```python
from py_aide.servers.flask import ServicePortal, GateConfig

portal = ServicePortal()

@portal.endpoint("/api/greet", gate=GateConfig(auth_required=False))
def greet(name: str, /) -> dict:
    """Returns a greeting message."""
    return {"message": f"Hello, {name}!"}

if __name__ == "__main__":
    portal.run(port=5000)
```

## 📡 Standardized HTTP Client (`requests`)

`py_aide` provides a resilient HTTP client designed to work across both Flask and FastAPI. It follows an **"Always Resolve"** philosophy: it never raises exceptions for network errors or HTTP failures. Instead, it returns a standardized `Response` object.

### Core Features:
- **Resilient**: Automatic retries with `exponential`, `linear`, or `fixed` backoff.
- **Secure**: Auto-injects Bearer tokens and sanitizes outgoing payloads.
- **Silent Mode**: Suppress internal framework logging for specific requests.
- **Middleware**: Register global `request` and `response` interceptors.
- **Dual-Mode**: Dedicated paths for Synchronous (Flask) and Asynchronous (FastAPI) logic.
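
To make the "Always Resolve" contract concrete, here is a standard-library sketch of the pattern. `Resolved` and `resolve_call` are illustrative stand-ins, not py_aide's actual `Response` object or retry internals:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Resolved:
    """Minimal stand-in for an 'always resolve' response object."""
    ok: bool = False
    data: object = None
    log: list = field(default_factory=list)

    def __bool__(self):
        return self.ok

def resolve_call(func, max_retries=3, backoff="exponential", base_delay=0.01):
    """Call func; never raise. Retry failures with the chosen backoff strategy."""
    res = Resolved()
    for attempt in range(max_retries + 1):
        try:
            res.data = func()
            res.ok = True
            return res
        except Exception as exc:
            res.log.append(f"attempt {attempt + 1}: {exc}")
            if attempt < max_retries:
                if backoff == "exponential":
                    delay = base_delay * (2 ** attempt)
                elif backoff == "linear":
                    delay = base_delay * (attempt + 1)
                else:  # fixed
                    delay = base_delay
                time.sleep(delay)
    return res  # falsy; res.log carries the failure history
```

The caller branches on truthiness instead of wrapping every request in `try/except`, which is exactly the shape of the `if res:` examples below.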

### 1. Synchronous Usage (Flask)
Ideal for standard routes where you want simple, linear logic.

```python
from py_aide import send_request, RequestConfig

# Optional: Set global defaults
RequestConfig.base_url = "https://api.myapp.com"
RequestConfig.max_retries = 3

def get_data():
    # Parameters are keyword-only for strictness
    res = send_request(
        method="GET", 
        endpoint="/external-data",
        headers={"X-Custom-Header": "Value"}, # Custom headers
        silent=True, # Suppress all framework logs for this request
        on_success=lambda data: print("Got data!"),
        on_error=lambda err: print(f"Error: {err}")
    )
    
    if res:
        print(f"Success: {res.data}")
    else:
        print(f"Failed: {res.log}") # Detailed error log
```

### 2. Asynchronous Usage (FastAPI)
Optimized for high-concurrency environments using `asyncio`.

```python
from py_aide import send_request_async

async def fetch_profile(user_token: str):
    # Automatically injects Bearer token
    res = await send_request_async(
        method="GET", 
        endpoint="/profile", 
        auth_token=user_token
    )
    return res.to_dict()
```

### 3. Interceptors (Middleware)
Globally modify requests before they leave or responses before they reach your logic.

```python
from py_aide import Interceptors

# Add a header to every outgoing request
Interceptors.add_request_hook(lambda data: {
    **data, 
    "headers": {**data["headers"], "X-Client-ID": "py-aide-v1"}
})

# Log or transform every response
def log_response(res):
    print(f"Response from {res.errorLogs.get('endpoint')}")
    return res

Interceptors.add_response_hook(log_response)
```

### 4. Authentication Strategies
`py_aide` handles common auth patterns out-of-the-box.

```python
# 1. Bearer Token (Mandatory auth_type)
send_request(method="GET", endpoint="/", auth_token="my-token", auth_type="bearer")

# 2. API Key (Uses X-API-Key header by default)
send_request(method="GET", endpoint="/", auth_token="sk_123", auth_type="api_key")

# 3. Basic Auth (Automatically Base64 encodes)
send_request(method="GET", endpoint="/", auth_token=("user", "pass"), auth_type="basic")

# Custom API Key Header
RequestConfig.api_key_header = "X-My-Custom-Auth"
```
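
For reference, the Basic strategy's header can be reproduced with the standard library (a hedged sketch; `basic_auth_header` is not a py_aide function):

```python
import base64

def basic_auth_header(user: str, password: str) -> dict:
    """Build the header the 'basic' strategy produces: Base64 of 'user:pass'."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# basic_auth_header("user", "pass")
# -> {'Authorization': 'Basic dXNlcjpwYXNz'}
```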

## 📥 Persistent Buffering
`py_aide` provides two ways to offload work to the background. Choosing the right one is critical for performance and data integrity.

### 🏁 Choosing the Right Buffer
| Feature | `DatabaseBuffer` | `TaskBuffer` |
| :--- | :--- | :--- |
| **Best For** | Simple, high-frequency writes | Complex logic & transactions |
| **Connection** | Fresh connection per task | One connection for entire logic |
| **State** | No state between tasks | Supports `ATTACH DATABASE` & Temp Tables |
| **Flexibility** | Standard SQL methods only | Any Python code |

---

## 📦 Database Write Buffer (SQLite Lock Prevention)
For high-concurrency write operations, `py_aide` provides a `DatabaseBuffer`. It serializes writes to prevent "Database is Locked" errors.

```python
from py_aide import DatabaseBuffer

# 1. Initialize with your main DB path (Defaults to _py_aide_queues.db for storage)
buffer = DatabaseBuffer(main_db="app.db")

# 2. Start the background worker
buffer.start()

# 3. Queue writes (non-blocking)
buffer.insert(table="users", data=[{"name": "Alice"}])
buffer.update(table="users", columns=["name"], column_data=["Bob"], condition="id=?", condition_data=[1])

# 4. Graceful shutdown
buffer.stop()
```

> [!WARNING]
> **State Limitation**: `buffer.execute()` is for **single-statement** custom SQL. Because the background worker opens a fresh connection for every task, connection-level state (like `ATTACH DATABASE` or `PRAGMA` settings) will **not** persist between separate calls. For stateful or multi-step transactions, always use the **`TaskBuffer`**.
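
The mechanism behind the buffer is simple to sketch with the standard library: a single background thread drains a FIFO queue, so SQLite only ever sees one writer at a time. Class and method names below are illustrative, not py_aide's:

```python
import queue
import sqlite3
import threading

class WriteSerializer:
    """Sketch: funnel all writes through one background thread so SQLite
    never sees concurrent writers (the root cause of 'database is locked')."""

    def __init__(self, db_path: str):
        self.db_path = db_path
        self._q = queue.Queue()
        self._thread = threading.Thread(target=self._worker, daemon=True)

    def start(self):
        self._thread.start()

    def submit(self, sql: str, params=()):
        self._q.put((sql, params))  # non-blocking for the caller

    def stop(self):
        self._q.put(None)           # sentinel: drain remaining tasks and exit
        self._thread.join()

    def _worker(self):
        while True:
            task = self._q.get()
            if task is None:
                break
            sql, params = task
            # Fresh connection per task, mirroring DatabaseBuffer's model
            conn = sqlite3.connect(self.db_path)
            with conn:
                conn.execute(sql, params)
            conn.close()
```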

## ⚙️ Generic Task Buffer (Async Jobs)
The `TaskBuffer` allows you to run any arbitrary function in the background. Ideal for emails, webhooks, or file processing.

```python
from py_aide import TaskBuffer
from py_aide.database import Api  # used by sync_to_archive below

# 1. Define your handlers
def send_email(payload):
    # logic to send email to payload['to']
    print(f"Email sent to {payload['to']}")

# 2. Initialize with handlers and 5 parallel workers (Defaults to _py_aide_queues.db)
tasks = TaskBuffer(
    handlers={"SEND_EMAIL": send_email},
    workers=5
)
tasks.start()

# 3. Push tasks from anywhere
tasks.push(task_type="SEND_EMAIL", data={"to": "user@example.com"})

# 💡 Pro Tip: Automatic Retries
# If a task fails, py_aide automatically retries it with 
# Exponential Backoff (1s, 2s, 4s, 8s...).

# 💡 Pro Tip: Maintenance
# Keep your queue database slim by purging old failed tasks:
tasks.cleanup(max_age_days=7)

# 💡 Pro Tip: Use TaskBuffer for Complex SQL Transactions
# Since you control the 'with Api' block, you can handle multiple tables
# or even ATTACH DATABASE which requires a persistent connection.
def sync_to_archive(payload):
    with Api(db_path="main.db", transactionMode=True) as db:
        db.execute(query="ATTACH DATABASE ? AS archive", data=["archive.db"])
        db.execute(query="INSERT INTO archive.logs SELECT * FROM main.logs WHERE id = ?", data=[payload['id']])
        return db.commit_transaction()

# 4. Graceful shutdown
tasks.stop()
```

### 🚨 Error Hooks & Monitoring
Since buffers run in the background, you can register an `on_error` hook to be notified of failures. This is ideal for logging **full tracebacks** to your database or alerting systems.

```python
# The hook receives the task type, the data, the error message, and the traceback
def my_error_logger(task_type, payload, error, traceback):
    if traceback:
        # 'traceback' contains the full Python stack trace string
        print(f"CRASH in {task_type}: {traceback}")
    else:
        print(f"LOGIC ERROR in {task_type}: {error}")

# Register the hook in the constructor
tasks = TaskBuffer(
    handlers={"EMAIL": send_email},
    on_error=my_error_logger
)
```

### 🧵 Global Hooks in ThreadPoolExecutor
The `ThreadPoolExecutor` also supports a global `on_error` hook to catch failures in parallel tasks.

```python
# Note: this is py_aide's ThreadPoolExecutor (with on_error support),
# not concurrent.futures.ThreadPoolExecutor.

def global_handler(func_name, payload, error, traceback):
    print(f"Parallel task {func_name} failed: {error}")

with ThreadPoolExecutor(on_error=global_handler) as executor:
    executor.submit(func=heavy_task, args=[data])
```

### 🗓️ Scheduler Hooks
The scheduler functions (`run_once`, `run_every`, `run_at`) are strictly keyword-only and support `on_error` hooks.

```python
from py_aide.threading import run_every

def scheduler_error_handler(name, type, error, traceback):
    print(f"Scheduled task {name} ({type}) failed: {error}")

# Mandatory keyword arguments: interval_seconds and func
run_every(interval_seconds=3600, func=sync_data, on_error=scheduler_error_handler)
```

## 📦 Persistent Queuing
`py_aide` includes a robust, SQLite-backed persistent queue for tasks that must survive application restarts.

```python
from py_aide import PersistentQueue

# 1. Initialize (Defaults to _py_aide_queues.db)
queue = PersistentQueue(queue_name="email_tasks")

# 2. Push a task (Keyword arguments mandatory)
queue.push(payload={"to": "user@example.com", "body": "Hello!"}, priority=10)

# 3. Pop and Process
res = queue.pop()
if res:
    task = res.data[0]
    task_id = task["id"]
    payload = task["payload"]
    
    try:
        # Process task...
        queue.complete(task_id)
    except Exception as e:
        # Record failure with error message
        queue.fail(task_id=task_id, err=str(e), retry=True)

# 4. Peek without locking
upcoming = queue.peek(limit=5)
```

## ⚡ Real-Time Identity & Groups (WebSockets)

`py_aide` moves beyond anonymous broadcasts. It allows you to target users and groups directly using their business IDs (e.g., `userId`), handling the underlying session mapping automatically. This powerful API is unified across both **FastAPI** and **Flask**.

### Unified Delivery Methods
Both `FastAPIServicePortal` and `ServicePortal` (Flask) expose identical methods for targeted communication:

- `send_to_user(user_id, event, data)`: Reach all active sessions of a specific user.
- `send_to_group(group_id, event, data)`: Sync messages within a room or collaborative group.
- `broadcast(event, data)`: Send a message to every connected client.
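
Under the hood this requires a mapping from business IDs to live socket sessions. A minimal stdlib sketch of that bookkeeping (illustrative names; py_aide maintains this mapping for you during the handshake):

```python
from collections import defaultdict

class SessionRegistry:
    """Sketch of the identity mapping behind send_to_user / send_to_group:
    one business userId may own several live socket sessions (sids)."""

    def __init__(self):
        self.user_sids = defaultdict(set)   # userId  -> {sid, ...}
        self.group_sids = defaultdict(set)  # groupId -> {sid, ...}

    def connect(self, user_id: str, sid: str):
        self.user_sids[user_id].add(sid)

    def join_group(self, group_id: str, sid: str):
        self.group_sids[group_id].add(sid)

    def targets_for_user(self, user_id: str) -> set:
        return set(self.user_sids.get(user_id, ()))

    def targets_for_group(self, group_id: str) -> set:
        return set(self.group_sids.get(group_id, ()))
```

`send_to_user` then reduces to emitting the event to every sid in `targets_for_user`, which is why a user with two open tabs receives the message twice.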

### Flask Example
```python
from py_aide.servers.flask import ServicePortal
from py_aide.servers.gate import AuthType

portal = ServicePortal(enable_websocket=True)

@portal.on_event("join_team", auth_required=True, auth_type=AuthType.BEARER)
def handle_join(data: dict):
    """Adds the user to a collaborative group."""
    team_id = data.get("teamId")
    # Identity mapping is handled automatically upon successful event auth
    portal.join_group(team_id) 
    return {"status": "success", "team": team_id}

# Sending targeted messages from anywhere (even sync contexts)
def notify_user(user_id: str, message: str):
    portal.send_to_user(user_id, "notification", {"text": message})
```

### FastAPI Example
```python
from py_aide.servers.fastApi import FastAPIServicePortal

portal = FastAPIServicePortal(enable_websocket=True)

@portal.on_event("join_team", auth_required=True)
async def handle_join(data: dict, sid: str):
    team_id = data.get("teamId")
    await portal.join_group(sid, team_id)
    return {"status": "success"}

async def sync_team(team_id: str, update: dict):
    await portal.send_to_group(team_id, "team_update", update)
```

## 🛡️ Public API Security & Hardening

`py_aide` is built for external-facing APIs. It provides a robust, production-ready security pipeline that is unified across Flask and FastAPI.

### 🌐 Unified CORS Orchestration
CORS is handled identically on both frameworks. You can define allowed origins in the constructor to prevent unauthorized cross-origin requests.

```python
# Works for both ServicePortal (Flask) and FastAPIServicePortal
portal = ServicePortal(
    cors_origins=["https://myapp.com", "https://api.myapp.com"],
    debug=False
)
```

### 🚦 Rate Limiting (Atomic & Precise)
Protect your server from brute force and DoS attacks by enforcing request limits. `py_aide` uses atomic operations to prevent race conditions.

```python
from py_aide.servers.gate import GateConfig

@portal.endpoint("/api/search", gate=GateConfig(
    auth_required=False,
    rate_limit="ip",        # Options: "user", "ip", "endpoint"
    rate_limit_value=10,     # 10 requests
    rate_limit_window=60     # per 60 seconds
))
def search_api(query: str) -> dict:
    return {"results": []}
```
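
The atomicity claim comes down to performing the check and the increment under one lock. A stdlib sketch of a fixed-window variant (illustrative; py_aide's actual algorithm and storage may differ):

```python
import threading
import time
from collections import defaultdict

class FixedWindowLimiter:
    """Sketch of a fixed-window rate limiter; a lock keeps the
    check-and-increment atomic across threads."""

    def __init__(self, limit: int, window_seconds: int):
        self.limit = limit
        self.window = window_seconds
        self._lock = threading.Lock()
        self._counts = defaultdict(int)  # (key, window index) -> hits

    def allow(self, key: str) -> bool:
        bucket = (key, int(time.time() // self.window))
        with self._lock:                 # atomic: no race between read and write
            if self._counts[bucket] >= self.limit:
                return False
            self._counts[bucket] += 1
            return True
```

Without the lock, two threads could both read a count of `limit - 1` and both pass, which is exactly the race the atomic guarantee rules out.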

### 🔑 Fine-Grained Permissions (Scopes)
Move beyond simple Roles to granular, OAuth2-style Scopes. You can enforce multiple required scopes for sensitive endpoints.

```python
@portal.endpoint("/api/admin/delete", gate=GateConfig(
    auth_required=True,
    required_scopes=["admin:write", "system:purge"]
))
def delete_item() -> dict:
    """Only succeeds if the user's token has BOTH scopes."""
    return {"status": True}
```
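
The AND semantics are worth spelling out: the gate passes only when *every* required scope is present in the token. A tiny sketch (illustrative helper, not a py_aide function):

```python
def has_required_scopes(token_scopes, required_scopes) -> bool:
    """A required_scopes gate passes only when every scope is present (AND semantics)."""
    return set(required_scopes).issubset(token_scopes)
```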

### 🔄 Identity Symmetrics (Token Handover)
`py_aide` automatically handles the transformation between raw database IDs and masked session tokens. If you define a `payload_auth_token_key`, the framework will:
1. **Unmask** incoming tokens so your handler sees the raw ID (e.g. `1234`).
2. **Re-mask** the ID in the response so the client only sees the token (e.g. `tok_abc...`).

```python
@portal.endpoint("/login", gate=GateConfig(
    auth_required=False,
    payload_auth_token_key="userId" # The field to mask/unmask
))
def login_user() -> dict:
    # Framework sees 'userId': 1234
    # Client receives 'userId': 'eyJhbGciOiJIUzI1...'
    return {"status": True, "data": {"userId": 1234}}
```
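
The same mask/unmask round trip can be sketched with a signed token. This is purely illustrative: the secret, the HMAC scheme, and the token layout below are assumptions, not py_aide's actual format:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # illustrative only; py_aide's real key handling is internal

def mask(user_id: int) -> str:
    """Wrap a raw ID in an opaque, signed token (sketch, not py_aide's format)."""
    body = base64.urlsafe_b64encode(json.dumps({"uid": user_id}).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def unmask(token: str) -> int:
    """Verify the signature and recover the raw ID; reject tampered tokens."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid token")
    return json.loads(base64.urlsafe_b64decode(body))["uid"]
```

The framework applies `unmask` on the way in and `mask` on the way out, so handlers work with raw IDs while clients never see them.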

### 🚪 Explicit Logout (Token Revocation)
`py_aide` supports immediate session invalidation. Revoked tokens are stored in a hybrid memory/SQLite store (`_py_aide_security.db`) and are rejected even if their signature is valid.

```python
from flask import request

@portal.endpoint("/logout", auth_required=True)
def logout() -> dict:
    # Get the raw token from the request
    token = request.headers.get("Authorization").split(" ")[1]
    
    # Invalidate immediately
    portal.revoke_token(token)
    
    return {"status": True, "message": "Logged out"}
```
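
The hybrid store pattern is easy to sketch: an in-memory set answers the hot-path check while SQLite persists revocations across restarts (illustrative class, not py_aide's `_py_aide_security.db` schema):

```python
import sqlite3
import time

class RevocationStore:
    """Sketch of a hybrid memory + SQLite denylist: the in-memory set serves
    fast lookups; the SQLite table makes revocations survive restarts."""

    def __init__(self, db_path: str = ":memory:"):
        self._conn = sqlite3.connect(db_path)
        self._conn.execute(
            "CREATE TABLE IF NOT EXISTS revoked (token TEXT PRIMARY KEY, at REAL)"
        )
        # Warm the cache from disk on startup
        self._cache = {row[0] for row in self._conn.execute("SELECT token FROM revoked")}

    def revoke(self, token: str):
        self._cache.add(token)
        with self._conn:
            self._conn.execute(
                "INSERT OR IGNORE INTO revoked VALUES (?, ?)", (token, time.time())
            )

    def is_revoked(self, token: str) -> bool:
        return token in self._cache
```

A token in this denylist is rejected even when its signature is still valid, which is what makes logout take effect immediately.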

## 🗄️ Database Management

`py_aide` provides a thread-local multiton pattern for SQLite, ensuring each thread has its own connection while sharing the same configuration.

```python
from py_aide.database import Api

db_config = {
    'users': 'id INTEGER PRIMARY KEY, name TEXT, meta JSON'
}

with Api(db_path="data.db", tables=db_config) as db:
    db.insert(table="users", data=[(1, "Alice", {"role": "admin"})])
    result = db.fetch(table="users", columns=["name", "meta::role as role"])
    print(result.data) # [{'name': 'Alice', 'role': 'admin'}]
```
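
The thread-local multiton pattern itself can be sketched in a few lines of the standard library (illustrative; py_aide's `Api` adds JSON serialization, schema management, and transactions on top):

```python
import sqlite3
import threading

_local = threading.local()

def get_connection(db_path: str) -> sqlite3.Connection:
    """Sketch of a thread-local multiton: one connection per (thread, db_path)."""
    pool = getattr(_local, "pool", None)
    if pool is None:
        pool = _local.pool = {}       # each thread lazily gets its own pool
    if db_path not in pool:
        pool[db_path] = sqlite3.connect(db_path)
    return pool[db_path]
```

Repeated calls in the same thread reuse one connection; a different thread gets its own, which is what keeps SQLite usage thread-safe without a global lock.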

## ⚠️ Important Note: Eventlet Monkey Patching

By default, importing `py_aide` **immediately** performs `eventlet.monkey_patch()`. This is required for reliable WebSocket support and some threading features. If this side effect matters for your application, make sure `py_aide` is imported before any modules that depend on the unpatched standard library.

## 📄 License
This project is licensed under the **MIT License**.
See the [LICENSE](LICENSE) file for more details.

## 👥 Authors
- **Kakuru Douglas** - [vicaniddouglas@gmail.com](mailto:vicaniddouglas@gmail.com)
