Metadata-Version: 2.4
Name: opencode-memory
Version: 0.1.5
Summary: A Python MCP stdio server for adding memory to AI coding agents
Author: Anomalo
License: MIT
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Python: >=3.10
Requires-Dist: faiss-cpu>=1.7.0
Requires-Dist: mcp>=1.6.0
Requires-Dist: mem0ai>=1.0.4
Requires-Dist: pydantic>=2.0.0
Provides-Extra: dev
Requires-Dist: pytest-asyncio>=0.21.0; extra == 'dev'
Requires-Dist: pytest-cov>=4.0.0; extra == 'dev'
Requires-Dist: pytest>=7.0.0; extra == 'dev'
Requires-Dist: ruff>=0.1.0; extra == 'dev'
Description-Content-Type: text/markdown

# OpenCode Memory

Give your AI coding agent a persistent memory across sessions.

## What It Does

OpenCode Memory enables AI agents to remember information between conversations:

- **Remember your preferences** - Coding style, editor settings, workflow preferences
- **Recall project context** - Architecture decisions, tech stack, coding conventions
- **Track ongoing work** - Current focus areas, recent decisions, sprint goals
- **Archive discussions** - Save important conversations for future reference

## Why It Matters

Without memory, AI assistants start fresh every session. You repeat yourself constantly. Context is lost. Decisions get forgotten.

OpenCode Memory solves this by providing persistent, searchable storage that your AI agent can access in every conversation.

## Key Features

| Feature | Benefit |
|---------|---------|
| **Semantic Search** | Find memories by meaning, not just keywords |
| **Categorized Storage** | Organize memories with tags for easy filtering |
| **Conversation Archive** | Store full discussions as readable markdown files |
| **Memory Expiration** | Auto-expire time-sensitive information |
| **Zero LLM Dependency** | No LLM calls of its own - your agent supplies the reasoning |
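
The semantic search row above reduces to nearest-neighbor ranking over embedding vectors. Here is a toy, self-contained sketch with made-up 3-dimensional vectors (the real server embeds text with OpenAI models and indexes the vectors with FAISS, but the ranking idea is the same):

```python
import math

# Toy 3-dimensional "embeddings" -- purely illustrative. The real
# server uses high-dimensional OpenAI embedding vectors in a FAISS index.
memories = {
    "User prefers tabs for indentation": [0.9, 0.1, 0.0],
    "Project uses FastAPI and PostgreSQL": [0.1, 0.9, 0.2],
    "Sprint goal: ship OAuth2 login": [0.0, 0.3, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(query_vec, k=1):
    """Return the k stored memories closest in meaning to the query vector."""
    ranked = sorted(memories, key=lambda m: cosine(query_vec, memories[m]), reverse=True)
    return ranked[:k]

# A query vector "close" to the indentation memory retrieves it first.
print(search([0.85, 0.15, 0.05]))  # -> ['User prefers tabs for indentation']
```

This is why the search finds memories by meaning rather than keywords: the query and the stored text only meet in embedding space.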

## Quick Start

### 1. Install

Using uv (recommended):

```bash
uv tool install opencode-memory
```

Or run directly without installing:

```bash
uvx opencode-memory --stdio
```

Using pip:

```bash
pip install opencode-memory
```

### 2. Configure Environment

```bash
export OPENAI_API_KEY="sk-your-api-key"
export OPENCODE_MEM_HISTORY_DIRECTORY="$HOME/.opencode/memory/history"
export OPENCODE_MEM_FAISS_DIRECTORY="$HOME/.opencode/memory/faiss"
```

### 3. Add to Your AI Agent

For opencode, add to your MCP configuration:

**Using uv (recommended):**

```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["opencode-memory", "--stdio"],
      "env": {
        "OPENAI_API_KEY": "sk-your-api-key",
        "OPENCODE_MEM_HISTORY_DIRECTORY": "$HOME/.opencode/memory/history",
        "OPENCODE_MEM_FAISS_DIRECTORY": "$HOME/.opencode/memory/faiss"
      }
    }
  }
}
```

**Using pip:**

```json
{
  "mcpServers": {
    "memory": {
      "command": "opencode-memory",
      "args": ["--stdio"],
      "env": {
        "OPENAI_API_KEY": "sk-your-api-key",
        "OPENCODE_MEM_HISTORY_DIRECTORY": "$HOME/.opencode/memory/history",
        "OPENCODE_MEM_FAISS_DIRECTORY": "$HOME/.opencode/memory/faiss"
      }
    }
  }
}
```

That's it. Your AI agent now has memory.

## How It Works

### Two Storage Layers

1. **Memory Layer** - Semantic storage for facts, preferences, and insights
   - Vector-based similarity search
   - Filter by categories and metadata
   - Automatic expiration for time-sensitive info

2. **Conversation Layer** - Markdown files for complete discussion history
   - Human-readable format
   - Searchable with standard tools
   - Version control friendly

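The automatic expiration mentioned above can be pictured as a timestamp filter applied at retrieval time. A minimal sketch using hypothetical field names (not the server's actual storage schema):

```python
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)

# Hypothetical memory records -- the "expires_at" field name is
# illustrative, not the server's real schema.
memories = [
    {"text": "User prefers tabs", "expires_at": None},  # never expires
    {"text": "Sprint goal: OAuth2", "expires_at": now + timedelta(days=14)},
    {"text": "Old sprint goal", "expires_at": now - timedelta(days=1)},
]

def active(records, at):
    """Keep records whose expiry is unset or still in the future."""
    return [r for r in records if r["expires_at"] is None or r["expires_at"] > at]

print([r["text"] for r in active(memories, now)])
# -> ['User prefers tabs', 'Sprint goal: OAuth2']
```
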
### No Built-in LLM

This is intentional. Your AI agent already has LLM capabilities. OpenCode Memory handles only storage and retrieval, keeping things simple and cost-effective.

## Usage Patterns

### Preference Memory

When you mention a preference, your AI agent stores it:

```
You: "I always use tabs for indentation"

AI: [Stores memory: "User prefers tabs for indentation"]
    [Categories: preferences, formatting]
```

Future sessions will remember and apply this preference automatically.

### Project Context

Your AI agent maintains awareness of your project:

```
AI: [Stores memory: "Project uses FastAPI + PostgreSQL + Redis"]
    [Categories: project-context, tech-stack]
    [Metadata: {project: "my-app"}]
```

When you start a new session, the AI recalls relevant context.
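
Tags like these are what make filtered recall possible. A toy sketch of category filtering (the field names are illustrative, not the server's actual schema):

```python
# Illustrative records mirroring the examples above.
memories = [
    {"text": "Project uses FastAPI + PostgreSQL + Redis",
     "categories": ["project-context", "tech-stack"],
     "metadata": {"project": "my-app"}},
    {"text": "User prefers tabs for indentation",
     "categories": ["preferences", "formatting"],
     "metadata": {}},
]

def by_category(records, category):
    """Return the text of every memory tagged with the given category."""
    return [r["text"] for r in records if category in r["categories"]]

print(by_category(memories, "tech-stack"))
# -> ['Project uses FastAPI + PostgreSQL + Redis']
```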

### Current Focus

Track what you're working on:

```
AI: [Stores memory: "Currently implementing OAuth2 authentication"]
    [Categories: current-focus]
    [Expires: end of sprint]
```

### Conversation Archive

Important discussions are saved:

```
AI: [Stores conversation: "Database Schema Design"]
    [Contains: All messages about schema decisions]
    [Saved as: Database_Schema_Design_20240115_103000.md]
```
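
The archive name above follows a title-plus-timestamp pattern. A sketch of how such a filename can be derived (illustrative, not necessarily the server's exact scheme):

```python
from datetime import datetime

def archive_filename(title, when):
    # Replace spaces so the title is filesystem-friendly, then append a
    # YYYYMMDD_HHMMSS timestamp, matching the example name above.
    return f"{title.replace(' ', '_')}_{when.strftime('%Y%m%d_%H%M%S')}.md"

print(archive_filename("Database Schema Design", datetime(2024, 1, 15, 10, 30, 0)))
# -> Database_Schema_Design_20240115_103000.md
```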

## Configuration

### Required Variables

| Variable | Description |
|----------|-------------|
| `OPENAI_API_KEY` | OpenAI key used for embedding generation |
| `OPENCODE_MEM_HISTORY_DIRECTORY` | Where conversation history files are stored |
| `OPENCODE_MEM_FAISS_DIRECTORY` | Where the FAISS vector index is stored |

### Optional Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `OPENCODE_MEM_EMBEDDING_MODEL` | `text-embedding-3-large` | OpenAI embedding model |
| `OPENCODE_MEM_EMBEDDING_PROVIDER` | `openai` | Embedding provider |
| `OPENCODE_MEM_EMBEDDING_BASE_URL` | - | Custom API endpoint |
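
For example, to point embedding generation at a self-hosted OpenAI-compatible endpoint (the URL and model name below are placeholders):

```bash
export OPENCODE_MEM_EMBEDDING_PROVIDER="openai"
export OPENCODE_MEM_EMBEDDING_BASE_URL="http://localhost:8000/v1"  # placeholder endpoint
export OPENCODE_MEM_EMBEDDING_MODEL="text-embedding-3-small"
```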

## Documentation

- **[API.md](API.md)** - Complete tool reference with examples
- **[DESIGN.md](DESIGN.md)** - Architecture decisions and rationale

## Requirements

- Python 3.10+
- OpenAI API key

## Development

```bash
# Clone and install
git clone https://github.com/anomalyco/opencode-memory.git
cd opencode-memory
pip install -e ".[dev]"

# Run tests
pytest

# Run linter
ruff check .
```

## License

MIT
