Metadata-Version: 2.4
Name: mlx-code
Version: 0.0.3
Summary: Coding Agent for Mac
Home-page: https://github.com/JosefAlbers/mlx-code
Author: J Joe
Author-email: albersj66@gmail.com
License: Apache-2.0
Requires-Python: >=3.12.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: mlx-lm>=0.31.3
Requires-Dist: numpy
Requires-Dist: httpx
Requires-Dist: pydantic
Dynamic: author
Dynamic: author-email
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: license
Dynamic: license-file
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# mlx-code

[![Link](https://raw.githubusercontent.com/JosefAlbers/mlx-code/main/assets/mlx-code.gif)](https://youtu.be/bizPhrHL1_w)

A lightweight, powerful Coding Agent for Mac. Built on top of Apple's MLX framework, `mlx-code` provides fast, local inference with built-in prompt caching and robust tool-calling capabilities. 

It features a multi-provider local server, a terminal-based chat REPL, and a dedicated TUI for inspecting logs.

### Features

*   **Local MLX Inference**: Powered by `mlx-lm` for optimized performance on Apple Silicon. Includes intelligent prompt caching.
*   **Multi-Provider Compatibility**: Seamlessly translates and handles requests formatted for Claude, Gemini, Codex, and standard OpenAI APIs. 
*   **Built-in REPL & Tools**: Comes with `pie`, a fully featured chat REPL with tool execution and reasoning token support.
*   **TUI Log Viewer**: Includes a Curses-based Terminal UI for filtering, inspecting, and tracking JSON logs in real-time.
*   **Server Mode**: Easily spin up a local server compatible with standard LLM tooling.

### Quick Start

Install via pip and launch the agent immediately:

```bash
pip install mlx-code
mc
```

### Command Line Interfaces

The package installs three primary command-line tools:
- `mc` (Main Agent/Server): Runs the core agent and local API server (defaults to `127.0.0.1:8000`).
- `me` (REPL): Launches the interactive `pie` chat REPL.
- `md` (Diagnostics/Logs): Opens the TUI viewer to navigate and filter JSON logs generated by the agent.
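
Since the server handles standard OpenAI-style requests, you can talk to it from any HTTP client. A minimal standard-library sketch, assuming `mc` is running on its default address and exposes an OpenAI-compatible `/v1/chat/completions` route (the exact endpoint layout and model name are assumptions; check `mc --help`):

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000"  # mc's default bind address

def build_chat_payload(prompt: str, model: str = "default") -> dict:
    """Assemble a standard OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str) -> str:
    # Assumes an OpenAI-compatible /v1/chat/completions route; the exact
    # endpoint layout may differ -- verify with `mc --help`.
    req = urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(build_chat_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The same request shape works from `httpx` (already a declared dependency) or any OpenAI SDK pointed at the local base URL.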

### Options

You can customize the model, server, and behavior using command-line flags.

```bash
# Use Gemini CLI as the harness
mc --harness gemini 

# Use a custom local LLM backend
mc --model mlx-community/Qwen3.5-4B-OptiQ-4bit

# Use DeepSeek V4 Flash API
me --deepseek

# Run the server only on a custom port
mc --nocc --port 8080
```
*(For a full list of `mc` server arguments, run `mc --help`.)*
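
Because the server is compatible with standard LLM tooling, many OpenAI-style clients can be pointed at it through the usual environment variables. A configuration sketch; the `/v1` path and the variable names your client honors are assumptions, so verify against `mc --help` and your client's docs:

```shell
# Point OpenAI-compatible tooling at the local mc server.
export OPENAI_BASE_URL="http://127.0.0.1:8000/v1"
export OPENAI_API_KEY="local"   # many clients require a non-empty key
```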

### Credits

- `main.py`: Built on [MLX](https://github.com/ml-explore/mlx) and [MLX LM](https://github.com/ml-explore/mlx-lm) by Apple.
- `pie.py`: Adapted from [pi](https://github.com/badlogic/pi-mono) by Mario Zechner (MIT License).

### License

Apache License 2.0 — see LICENSE for details.
