Metadata-Version: 2.4
Name: getmnemo-langchain
Version: 0.1.0
Summary: LangChain adapter for Mnemo — drop-in BaseMemory and Retriever backed by the Mnemo SDK.
Project-URL: Homepage, https://github.com/getmnemo/getmnemo-langchain
Project-URL: Issues, https://github.com/getmnemo/getmnemo-langchain/issues
Project-URL: Source, https://github.com/getmnemo/getmnemo-langchain
Author-email: Mnemo <founders@getmnemo.xyz>
License: MIT
License-File: LICENSE
Keywords: getmnemo,langchain,llm,memory,rag,retriever
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.10
Requires-Dist: getmnemo>=0.1
Requires-Dist: langchain-core<1.0,>=0.3
Provides-Extra: dev
Requires-Dist: mypy>=1.10; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.23; extra == 'dev'
Requires-Dist: pytest>=7; extra == 'dev'
Requires-Dist: ruff>=0.5; extra == 'dev'
Description-Content-Type: text/markdown

# getmnemo-langchain

LangChain adapter for Mnemo — drop-in conversational memory and retriever backed by the [Mnemo SDK](https://pypi.org/project/getmnemo/).

## Install

```bash
pip install getmnemo-langchain
```
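
The package also declares a `dev` extra (see `Provides-Extra` above) pulling in the test and lint toolchain — `pytest`, `pytest-asyncio`, `ruff`, and `mypy` — for working on the adapter itself:

```bash
pip install "getmnemo-langchain[dev]"
```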

## Quickstart

```python
from langchain.chains import ConversationChain
from langchain_openai import ChatOpenAI
from getmnemo import Mnemo
from langchain_getmnemo import MnemoMemory

mem_client = Mnemo(api_key="...", workspace_id="ws_...")
chain = ConversationChain(
    llm=ChatOpenAI(model="gpt-4o-mini"),
    memory=MnemoMemory(client=mem_client, top_k=5),
)
)

print(chain.invoke({"input": "Remember that my favourite framework is FastAPI."}))
print(chain.invoke({"input": "What is my favourite framework?"}))
```
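
Under the hood, `ConversationChain` drives any `BaseMemory` through two calls: `load_memory_variables()` before the LLM call and `save_context()` after it. The toy stand-in below (plain Python, no API key or LLM needed — and not the real implementation, which does semantic search server-side) illustrates that contract with naive keyword overlap in place of embeddings:

```python
# Toy sketch of the BaseMemory contract ConversationChain relies on:
# load_memory_variables() runs before the LLM, save_context() after.
# Word overlap stands in for real semantic search.

class ToyTurnMemory:
    def __init__(self, top_k: int = 5):
        self.top_k = top_k
        self.turns: list[tuple[str, str]] = []  # (user, ai) pairs

    def load_memory_variables(self, inputs: dict) -> dict:
        query_words = set(inputs.get("input", "").lower().split())
        # Rank stored turns by naive word overlap with the query.
        scored = sorted(
            self.turns,
            key=lambda t: len(query_words & set(t[0].lower().split())),
            reverse=True,
        )
        history = "\n".join(
            f"Human: {u}\nAI: {a}" for u, a in scored[: self.top_k]
        )
        return {"history": history}

    def save_context(self, inputs: dict, outputs: dict) -> None:
        self.turns.append((inputs["input"], outputs["response"]))


mem = ToyTurnMemory(top_k=2)
mem.save_context(
    {"input": "My favourite framework is FastAPI."},
    {"response": "Noted!"},
)
loaded = mem.load_memory_variables({"input": "What is my favourite framework?"})
print(loaded["history"])
```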

## Retriever for RAG

```python
from langchain_getmnemo import MnemoRetriever

retriever = MnemoRetriever(client=mem_client, top_k=8)
docs = retriever.invoke("show me what I said about deployments")
for doc in docs:
    print(doc.metadata.get("score"), doc.page_content)
```
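
A common next step is stuffing the retrieved documents into a prompt. The helper below is a hypothetical sketch, not part of the package: it formats `Document`-like objects into a context block and drops low-confidence hits, assuming each result carries `page_content` plus a `score` in its metadata (a tiny stand-in class keeps the example runnable without LangChain installed):

```python
# Hypothetical helper: format retrieved docs into a prompt context block,
# keeping only results at or above a score threshold.
from dataclasses import dataclass, field


@dataclass
class Doc:  # stand-in for langchain_core.documents.Document
    page_content: str
    metadata: dict = field(default_factory=dict)


def format_context(docs: list[Doc], min_score: float = 0.0) -> str:
    kept = [d for d in docs if d.metadata.get("score", 0.0) >= min_score]
    return "\n\n".join(
        f"[score={d.metadata.get('score', 0.0):.2f}] {d.page_content}"
        for d in kept
    )


docs = [
    Doc("We deploy with Docker on Fly.io.", {"score": 0.91}),
    Doc("Lunch was good.", {"score": 0.12}),
]
context = format_context(docs, min_score=0.5)
print(context)
```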

## What you get

- `MnemoMemory` — implements `langchain_core.memory.BaseMemory`. Automatically saves each user/AI turn and rehydrates the top-K most relevant prior turns via semantic search.
- `MnemoRetriever` — implements `langchain_core.retrievers.BaseRetriever`. Returns LangChain `Document` objects with the relevance score and `memory_id` in their metadata.

## License

MIT — see [LICENSE](./LICENSE).
