Metadata-Version: 2.4
Name: opencode-memory
Version: 0.2
Summary: A Python MCP stdio instance for adding memory to AI coding agents
Author: Anomalo
License: MIT
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Python: >=3.10
Requires-Dist: faiss-cpu>=1.7.0
Requires-Dist: mcp>=1.6.0
Requires-Dist: mem0ai>=1.0.4
Requires-Dist: pydantic>=2.0.0
Provides-Extra: dev
Requires-Dist: pytest-asyncio>=0.21.0; extra == 'dev'
Requires-Dist: pytest-cov>=4.0.0; extra == 'dev'
Requires-Dist: pytest>=7.0.0; extra == 'dev'
Requires-Dist: ruff>=0.1.0; extra == 'dev'
Description-Content-Type: text/markdown

# OpenCode Memory

Give your AI coding agent persistent memory and conversation archiving.

## What It Does

OpenCode Memory provides two capabilities to AI coding agents like opencode:

1. **Memory** - Store and retrieve facts, preferences, and context across sessions
2. **Conversation Archiving** - Save full conversations as readable markdown files

Your AI agent remembers who you are and what you're working on, and it can browse past discussions.

## Why It Matters

Without memory:
- You repeat your preferences every session
- Project context is lost between conversations
- Important decisions get forgotten
- Past discussions are inaccessible

OpenCode Memory solves this by providing persistent storage your AI agent can access in every conversation.

## Installation

```bash
uv tool install opencode-memory
```

Or run directly without installing:

```bash
uvx opencode-memory --stdio
```

## Configure for OpenCode

Add to your MCP configuration file (typically `~/.config/opencode/config.json`):

```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["opencode-memory", "--stdio"],
      "env": {
        "OPENAI_API_KEY": "sk-your-api-key",
        "OPENCODE_MEM_FAISS_DIRECTORY": "$HOME/.opencode/memory/faiss"
      }
    }
  }
}
```

### Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| `OPENAI_API_KEY` | Yes | For embedding generation |
| `OPENCODE_MEM_FAISS_DIRECTORY` | Yes | Where vector index is stored |
| `OPENCODE_MEM_EMBEDDING_MODEL` | No | Default: `text-embedding-3-large` |
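
For a quick manual check outside of OpenCode, the same variables can be set in the shell before launching the server (the values below are placeholders, matching the JSON config above):

```shell
export OPENAI_API_KEY="sk-your-api-key"
export OPENCODE_MEM_FAISS_DIRECTORY="$HOME/.opencode/memory/faiss"
uvx opencode-memory --stdio
```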

## Usage Recommendation: Dedicated Chat Agent

**Recommended approach:** Create a dedicated opencode agent that acts as a ChatGPT-like interface.

### Why a Dedicated Agent?

When you use opencode primarily for conversation rather than code editing:

1. **All conversations are archived** - Every discussion saved as markdown
2. **Memories accumulate** - Preferences, context, and knowledge build over time
3. **Historical search** - Browse and search past conversations by topic
4. **Session continuity** - Pick up where you left off in previous sessions

### How to Create a Dedicated Agent

Create the agent file at `~/.config/opencode/agent/Chat.md`:

````markdown
# Chat Agent

A conversational agent with memory and conversation archiving capabilities.

## Behavior

You are a helpful conversational assistant, similar to ChatGPT. Your role is to engage in natural conversation while maintaining persistent memory and archiving all interactions.

## Memory

You have access to memory tools. Use them to:
- Store user preferences (coding style, tools, workflows)
- Remember important information the user shares
- Track ongoing topics and interests
- Save useful context for future sessions

Before responding, search your memory for relevant context. Store meaningful facts after conversations.

## Conversation Archiving

**CRITICAL: Archive every interaction using opencode's native file tools.**

After each user request followed by your response (one interaction):

1. Use the `write` or `edit` tool to append the interaction to a markdown file named `YYYY-MM-DD.md` in a directory of your choice (e.g., `~/.opencode/memory/history/2026-02-20.md`)

2. The file must start with a keywords header:

```markdown
keywords: python, async, asyncio, error-handling
---

## User
[timestamp] Your question here...

## Assistant
[timestamp] Your response here...
```

3. Keywords act as hashtags for finding conversations. They should:
   - Be broad enough to categorize the topic
   - Include technologies, concepts, or themes discussed
   - Be updated after each interaction if new topics emerge

4. Append new interactions to today's file using the `edit` tool. Update the keywords line if the conversation covers new topics.

## Example

After discussing Python async functions:

```markdown
keywords: python, async, asyncio, error-handling, concurrency
---

## User
[2026-02-20 14:30] How do I handle exceptions in asyncio.gather?

## Assistant
[2026-02-20 14:30] You can use the return_exceptions=True parameter...

## User
[2026-02-20 14:35] What about timeout handling?

## Assistant
[2026-02-20 14:35] For timeouts, use asyncio.wait_for()...
```

## Workflow

For each message:
1. Search memory for relevant context
2. Respond naturally to the user
3. Use `write` or `edit` tools to archive the interaction to today's markdown file
4. Update keywords if new topics emerged
5. Store any important facts as memories
````
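
The archiving steps the agent file describes can be sketched in Python. This is an illustration of the file format only; the `archive_interaction` helper and the history directory are not part of the package, and keyword updating for existing files is left out for brevity:

```python
from datetime import datetime
from pathlib import Path


def archive_interaction(history_dir: Path, user_msg: str,
                        assistant_msg: str, keywords: list[str]) -> Path:
    """Append one user/assistant interaction to today's daily markdown file."""
    now = datetime.now()
    path = history_dir / f"{now:%Y-%m-%d}.md"
    stamp = f"{now:%Y-%m-%d %H:%M}"
    entry = (f"\n## User\n[{stamp}] {user_msg}\n"
             f"\n## Assistant\n[{stamp}] {assistant_msg}\n")
    if not path.exists():
        # New daily file: start with the keywords header and separator.
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(f"keywords: {', '.join(keywords)}\n---\n" + entry)
    else:
        # Existing file: append the new interaction after prior ones.
        with path.open("a") as f:
            f.write(entry)
    return path
```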

### Using the Agent

```bash
mkdir -p ~/chat
cd ~/chat
opencode --agent Chat
```

The agent will automatically:
- Archive every interaction to daily markdown files
- Build a searchable conversation history
- Remember your preferences and context
- Update keywords as topics evolve

### What You Get

With a dedicated memory-enabled agent:

- **Preferences remembered** - "I use dark theme" is stored and recalled
- **Project context** - "Working on FastAPI backend" persists across sessions
- **Conversation history** - Browse `~/.opencode/memory/history/` for past discussions
- **Topic search** - Use opencode's grep to find conversations about specific topics

## How Memory Works

### Two Storage Layers

1. **Memory Layer** - Semantic storage for facts and knowledge
   - Vector-based similarity search
   - Filter by categories and metadata
   - Automatic expiration for time-sensitive info

2. **Conversation Layer** - Markdown files for complete discussion history
   - Human-readable format
   - Searchable with standard tools (grep, opencode search)
   - Version control friendly
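
"Search by meaning" in the memory layer comes down to comparing embedding vectors. A toy illustration of cosine similarity follows; the actual server uses OpenAI embeddings indexed with FAISS, and the three-dimensional vectors below are made up for demonstration:

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Made-up 3-dimensional "embeddings"; real ones have thousands of dimensions.
memories = {
    "User prefers 2-space indentation": [0.9, 0.1, 0.0],
    "Working on FastAPI backend":       [0.1, 0.9, 0.2],
}
query = [0.85, 0.15, 0.05]  # embedding of "what indentation does the user like?"

# The memory closest in direction to the query wins.
best = max(memories, key=lambda m: cosine_similarity(memories[m], query))
print(best)  # the indentation preference scores highest
```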

### Available Tools

Your AI agent has access to the following MCP tools:

| Tool | Purpose |
|------|---------|
| `add_memory` | Store a fact or preference |
| `search_memory` | Find relevant memories by meaning |
| `get_all_memories` | Retrieve all stored memories |
| `update_memory` | Modify an existing memory |
| `delete_memory` | Remove a memory |

See [API.md](API.md) for detailed tool documentation.

## Example Usage Patterns

### Remembering Preferences

```
You: "I always use 2-space indentation for Python"

AI: [Stores: "User prefers 2-space indentation for Python" 
     Categories: preferences, python]
```

Future sessions recall this automatically.

### Tracking Current Work

```
AI: [Stores: "Currently implementing OAuth2 authentication"
     Categories: current-focus
     Expires: end of sprint]
```

### Archiving Discussions

Use opencode's `write` or `edit` tools to save conversations:

```
AI: [Uses write tool to create: 2026-02-20.md]
     Content: keywords, timestamps, and full conversation
```

Later, archived conversations can be searched with standard grep (or opencode's own search tools):

```bash
# Find conversations about database design
grep -rl "database" ~/.opencode/memory/history/
```
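
Because every daily file begins with a `keywords:` header, a lightweight topic index is also easy to build. A sketch, where `find_conversations` and the directory layout are illustrative rather than part of the package:

```python
from pathlib import Path


def find_conversations(history_dir: Path, topic: str) -> list[Path]:
    """Return daily archive files whose keywords header mentions the topic."""
    matches = []
    for md in sorted(history_dir.glob("*.md")):
        with md.open() as f:
            first_line = f.readline().strip()
        if first_line.startswith("keywords:"):
            # "keywords: python, async" -> ["python", "async"]
            keywords = [k.strip() for k in first_line[len("keywords:"):].split(",")]
            if topic in keywords:
                matches.append(md)
    return matches
```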

## Documentation

- **[API.md](API.md)** - Complete MCP tool reference with examples
- **[DESIGN.md](DESIGN.md)** - Architecture decisions and rationale

## Requirements

- Python 3.10+
- OpenAI API key (for embeddings)

## License

MIT
