Metadata-Version: 2.4
Name: neuralnode
Version: 1.0.0
Summary: A comprehensive Python framework for building AI agents with support for both cloud-based LLM APIs and local models
Project-URL: Homepage, https://github.com/neuralnode/neuralnode
Project-URL: Documentation, https://neuralnode.readthedocs.io
Project-URL: Repository, https://github.com/neuralnode/neuralnode
Project-URL: Issues, https://github.com/neuralnode/neuralnode/issues
Author-email: Assem Sabry <assemsabry@example.com>
Maintainer-email: Assem Sabry <assemsabry@example.com>
License: MIT
License-File: LICENSE
Keywords: agent,ai,anthropic,framework,langchain-alternative,llm,ollama,openai,rag,vector-search
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.8
Requires-Dist: aiohttp>=3.8.0
Requires-Dist: pydantic>=2.0.0
Requires-Dist: requests>=2.28.0
Requires-Dist: tenacity>=8.2.0
Requires-Dist: typing-extensions>=4.5.0
Provides-Extra: all
Requires-Dist: accelerate>=0.24.0; extra == 'all'
Requires-Dist: anthropic>=0.18.0; extra == 'all'
Requires-Dist: beautifulsoup4>=4.12.0; extra == 'all'
Requires-Dist: faiss-cpu>=1.7.0; extra == 'all'
Requires-Dist: google-generativeai>=0.3.0; extra == 'all'
Requires-Dist: llama-cpp-python>=0.2.0; extra == 'all'
Requires-Dist: numpy>=1.24.0; extra == 'all'
Requires-Dist: ollama>=0.1.0; extra == 'all'
Requires-Dist: openai>=1.0.0; extra == 'all'
Requires-Dist: playwright>=1.40.0; extra == 'all'
Requires-Dist: pypdf>=3.17.0; extra == 'all'
Requires-Dist: selenium>=4.15.0; extra == 'all'
Requires-Dist: torch>=2.0.0; extra == 'all'
Requires-Dist: transformers>=4.35.0; extra == 'all'
Provides-Extra: anthropic
Requires-Dist: anthropic>=0.18.0; extra == 'anthropic'
Provides-Extra: browser
Requires-Dist: playwright>=1.40.0; extra == 'browser'
Requires-Dist: selenium>=4.15.0; extra == 'browser'
Provides-Extra: dev
Requires-Dist: black>=23.0.0; extra == 'dev'
Requires-Dist: mypy>=1.7.0; extra == 'dev'
Requires-Dist: pre-commit>=3.5.0; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.21.0; extra == 'dev'
Requires-Dist: pytest>=7.4.0; extra == 'dev'
Requires-Dist: ruff>=0.1.0; extra == 'dev'
Provides-Extra: google
Requires-Dist: google-generativeai>=0.3.0; extra == 'google'
Provides-Extra: huggingface
Requires-Dist: accelerate>=0.24.0; extra == 'huggingface'
Requires-Dist: torch>=2.0.0; extra == 'huggingface'
Requires-Dist: transformers>=4.35.0; extra == 'huggingface'
Provides-Extra: local
Requires-Dist: llama-cpp-python>=0.2.0; extra == 'local'
Requires-Dist: torch>=2.0.0; extra == 'local'
Requires-Dist: transformers>=4.35.0; extra == 'local'
Provides-Extra: ollama
Requires-Dist: ollama>=0.1.0; extra == 'ollama'
Provides-Extra: openai
Requires-Dist: openai>=1.0.0; extra == 'openai'
Provides-Extra: rag
Requires-Dist: beautifulsoup4>=4.12.0; extra == 'rag'
Requires-Dist: faiss-cpu>=1.7.0; extra == 'rag'
Requires-Dist: numpy>=1.24.0; extra == 'rag'
Requires-Dist: pypdf>=3.17.0; extra == 'rag'
Description-Content-Type: text/markdown

# NeuralNode

A next-generation Python framework for building, running, and managing **Large Language Models (LLMs)** and AI Agents, locally or via cloud providers.

> **The smarter, simpler, and more powerful alternative to LangChain.**

## Features

- **Unified LLM Interface** - One API for all LLMs (local or cloud)
- **Multi-Provider Support** - OpenAI, Anthropic, Google, Ollama, HuggingFace
- **Streaming & Async** - Real-time responses with full async support
- **Function Calling** - Native tool use with structured outputs
- **Local AI Engine** - Auto-download, quantization, hardware optimization
- **AI Agents** - Convert any LLM into an autonomous agent
- **Browser Integration** - Chrome/Edge automation built-in
- **RAG Built-in** - PDFs, documents, vector stores, semantic search
- **Real-Time Context** - Automatic time/date injection
- **Memory Management** - Conversation history, context-aware chaining

## Quick Start

```bash
pip install neuralnode
```

```python
from neuralnode import NeuralNode

# Auto-detect best provider
ai = NeuralNode(provider="auto")

# Simple chat
response = ai.chat("Explain quantum computing")
print(response.text)

# With streaming
for chunk in ai.chat("Tell me a story", stream=True):
    print(chunk.text, end="")

# Async support (call from inside an async function / running event loop)
response = await ai.achat("Hello!")

# Create an agent
agent = ai.agent()
result = agent.run("Find the latest news about AI")
```

## Installation

```bash
# Base install
pip install neuralnode

# With specific providers (quotes keep shells like zsh from expanding the brackets)
pip install "neuralnode[openai,anthropic]"

# Full install with all features
pip install "neuralnode[all]"
```

## Architecture

```
neuralnode/
├── core/           # Base interfaces and types
├── providers/      # LLM provider implementations
├── agents/         # Agent system and orchestration
├── tools/          # Tool integrations (browser, APIs)
├── rag/            # Retrieval-Augmented Generation
├── memory/         # Conversation and context management
├── local/          # Local model runtime and hardware
└── utils/          # Utilities and helpers
```
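The core retrieval step that a `rag/` layer performs - embed documents, score them against a query, return the best matches - can be sketched in a toy, dependency-free form. The real module uses FAISS and learned embeddings; the character-frequency "embedding" here is purely illustrative:

```python
import math
from typing import List


def embed(text: str) -> List[float]:
    # Toy "embedding": a character-frequency vector over a-z.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec


def cosine(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def search(query: str, docs: List[str], k: int = 1) -> List[str]:
    # Rank documents by similarity to the query embedding.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]


docs = ["cats purr", "dogs bark", "stock markets fluctuate"]
print(search("a purring cat", docs))  # -> ['cats purr']
```

A production pipeline replaces `embed` with a neural encoder and the `sorted` scan with an approximate-nearest-neighbor index (e.g. FAISS, listed in the `rag` extra), but the embed/score/rank shape is the same.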

## License

MIT License - see LICENSE file for details.
