Metadata-Version: 2.4
Name: mtlearn
Version: 1.0.6
Summary: Morphological tree learning utilities with a stable morphology facade
Author-Email: Wonder Alexandre Luz Alves <worderalexandre@gmail.com>
License-Expression: GPL-3.0-only
License-File: LICENSE
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Classifier: Programming Language :: C++
Classifier: Topic :: Scientific/Engineering :: Image Processing
Classifier: Operating System :: MacOS
Classifier: Operating System :: Microsoft :: Windows
Classifier: Operating System :: POSIX :: Linux
Project-URL: Source, https://github.com/wonderalexandre/MTLearn
Project-URL: Issues, https://github.com/wonderalexandre/MTLearn/issues
Requires-Python: >=3.9
Requires-Dist: numpy>=1.23
Requires-Dist: opencv-python-headless>=4.8
Requires-Dist: torch<2.3,>=2.2.2; sys_platform == "darwin" and platform_machine == "x86_64" and python_version < "3.13"
Requires-Dist: torch<2.9,>=2.8; sys_platform == "darwin" and platform_machine == "arm64" and python_version < "3.10"
Requires-Dist: torch<2.12,>=2.10; sys_platform == "darwin" and platform_machine == "arm64" and python_version >= "3.10" and python_version < "3.14"
Requires-Dist: torch<2.12,>=2.11; sys_platform == "darwin" and platform_machine == "arm64" and python_version >= "3.14"
Requires-Dist: torch<2.9,>=2.8; sys_platform != "darwin" and python_version < "3.10"
Requires-Dist: torch<2.12,>=2.10; sys_platform != "darwin" and python_version >= "3.10" and python_version < "3.14"
Requires-Dist: torch<2.12,>=2.11; sys_platform != "darwin" and python_version >= "3.14"
Provides-Extra: notebooks
Requires-Dist: ipykernel; extra == "notebooks"
Requires-Dist: matplotlib; extra == "notebooks"
Requires-Dist: nbformat>=5; extra == "notebooks"
Requires-Dist: papermill>=2.4; extra == "notebooks"
Provides-Extra: test
Requires-Dist: pytest>=8; extra == "test"
Description-Content-Type: text/markdown

# MTLearn

[![CI](https://github.com/wonderalexandre/MTLearn/actions/workflows/ci.yml/badge.svg)](https://github.com/wonderalexandre/MTLearn/actions/workflows/ci.yml)
[![Package](https://github.com/wonderalexandre/MTLearn/actions/workflows/package.yml/badge.svg)](https://github.com/wonderalexandre/MTLearn/actions/workflows/package.yml)

**MTLearn** (*Morphological Tree Learning*) is a C++/Python research library for
learnable connected operators based on morphological trees. The Python package
is published as `mtlearn`.

The library explores a simple idea: connected morphology can become a structural
prior for deep neural networks. Instead of processing images only through local
pixel-wise operations, connected operators reason over components, regions,
shape, contrast, and hierarchy. This makes them naturally interpretable and
well-suited for tasks where structure matters.

Classical connected filters are powerful, but they usually depend on hard
keep/discard decisions and manually selected attribute thresholds. This limits
their integration into end-to-end trainable neural architectures.

**MTLearn** provides a stable implementation platform for this research direction.
It currently includes Connected Filter Preprocessing (CFP), and is intended to
grow toward trainable connected-operator layers, differentiable or learnable
attribute criteria, self-dual tree representations, intermediate network
insertions, and scalable implementations.

## Main Features

- **Connected Filter Preprocessing (CFP):** the current main model, available as
  `mtlearn.layers.ConnectedFilterPreprocessingLayer`. CFP replaces hard
  connected-filter decisions with a differentiable sigmoid gate over normalized
  tree-node attributes.

- **Stable morphology interface:** `mtlearn.morphology` builds max-trees,
  min-trees, and tree-of-shapes through a backend-independent API.

- **Trainable connected morphology:** designed as a platform for using
  connected morphology as a learnable structural prior in deep neural networks.

- **Research-ready validation:** includes C++ tests, Python tests, gradient
  checks, reference implementations, notebook validations, and public dataset
  download helpers.
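To illustrate the gating idea behind CFP, a hard attribute threshold can be relaxed into a sigmoid over normalized attributes. This is a rough conceptual sketch, not the library's actual parameterization: `weight` and `bias` below are hypothetical stand-ins for learnable parameters.

```python
import numpy as np

# Hypothetical sketch of a soft keep/discard gate over normalized
# tree-node attributes. The actual CFP parameterization lives in
# mtlearn.layers; `weight` and `bias` stand in for learnable scalars.
def soft_gate(attrs, weight, bias):
    """Map normalized attributes to keep-probabilities in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(weight * attrs + bias)))

attrs = np.array([0.1, 0.5, 0.9])   # one normalized attribute per node
hard = (attrs > 0.5).astype(float)  # classical hard threshold decision
soft = soft_gate(attrs, weight=10.0, bias=-5.0)  # differentiable relaxation
```

Because the gate is smooth, gradients with respect to `weight` and `bias` exist everywhere, which is what makes the decision trainable end to end.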

## Install

The Python package is available from PyPI as `mtlearn`:

```bash
pip install mtlearn
```

See [docs/installation.md](docs/installation.md) for installation instructions
and [docs/development.md](docs/development.md) for source builds, validation,
and releases.

## Quick Start

Build a morphology tree and compute attributes:

```python
import numpy as np
from mtlearn import morphology

image = np.array([[1, 2], [3, 4]], dtype=np.uint8)
tree = morphology.create_max_tree(image)

_, attributes = morphology.compute_attributes(
    tree,
    [morphology.AttributeType.AREA, morphology.AttributeType.COMPACTNESS],
)

print(attributes.shape)
```

Create a CFP layer and run a forward pass:

```python
import torch
from mtlearn import morphology
from mtlearn.layers import ConnectedFilterPreprocessingLayer

cfp_layer = ConnectedFilterPreprocessingLayer(
    in_channels=1,
    attributes_spec=[(
        morphology.AttributeType.AREA,
        morphology.AttributeType.CIRCULARITY,
    )],
    tree_type="max-tree",
    device="cpu",
)

x = torch.tensor([[[[1, 2], [3, 4]]]], dtype=torch.float32)
y = cfp_layer(x)

assert y.shape == x.shape
```

## Examples and Notebooks

Executable examples are available in `notebooks/`.

Install notebook dependencies with:

```bash
pip install "mtlearn[notebooks]"
```

The main public experiment example is:

```text
notebooks/experiments/Example_screws_filtering.ipynb
```

## Implementation Notes

`ConnectedFilterPreprocessingLayer` is the recommended implementation for new
CFP experiments.

Tensor operations, trainable parameters, and cached attributes can live on CUDA
when `device="cuda"`. Morphology-tree construction is still performed by the
C++ backend on CPU.

The main implementation uses an implicit Jacobian formulation. The dense
region-pixel matrix is not materialized during normal training; tree-ordering
metadata is used to perform the equivalent reconstruction and backward
accumulation more compactly. This reduces memory pressure compared with
explicit region-pixel Jacobian construction.
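To make the contrast concrete, here is a toy sketch of the general idea: pixel values can be reconstructed by walking each pixel's root path in the tree, rather than materializing a dense node-by-pixel membership matrix. The tree data below is made up for illustration and does not reflect mtlearn's internal representation.

```python
import numpy as np

# Toy sketch (not mtlearn's internals): reconstruct pixel values from
# gated node residuals without a dense (num_nodes x num_pixels) matrix.
# parent[i] is node i's parent (the root points to itself); pixel_node[p]
# is the smallest component containing pixel p; residual[i] is the node's
# gated gray-level increment.
parent = np.array([0, 0, 1, 1])          # tiny 4-node tree, node 0 is root
residual = np.array([1.0, 0.5, 0.25, 0.75])
pixel_node = np.array([2, 2, 3, 1, 0])   # 5 pixels

def reconstruct(p):
    node, value = pixel_node[p], 0.0
    while True:  # accumulate residuals along the path to the root
        value += residual[node]
        if parent[node] == node:
            return value
        node = parent[node]

out = np.array([reconstruct(p) for p in range(5)])
# out == [1.75, 1.75, 2.25, 1.5, 1.0]
```

The same root-path structure can be reused in the backward pass to scatter pixel gradients back onto node parameters, which is the memory saving the implicit formulation exploits.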

Reference implementations based on explicit Jacobians and CPU tree traversals
remain available for gradient checks, comparisons, and debugging.
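A generic gradient check of a soft gate can be sketched with `torch.autograd.gradcheck`. This is an illustration of the technique, not mtlearn's own test suite; the gate function below is hypothetical.

```python
import torch

# Compare autograd gradients of a soft sigmoid gate against finite
# differences. gradcheck requires double-precision inputs.
def soft_gate(weight, bias, attrs):
    return torch.sigmoid(weight * attrs + bias)

attrs = torch.linspace(0.0, 1.0, 4, dtype=torch.float64)
weight = torch.tensor(3.0, dtype=torch.float64, requires_grad=True)
bias = torch.tensor(-1.5, dtype=torch.float64, requires_grad=True)

ok = torch.autograd.gradcheck(
    lambda w, b: soft_gate(w, b, attrs), (weight, bias)
)
```

`gradcheck` raises on mismatch and returns `True` on success, so it drops cleanly into a pytest suite as a single assertion.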

**MTLearn** uses a C++ morphology backend internally through `mtlearn::morphology`.
User code should interact with morphology through the public Python facade
`mtlearn.morphology`, rather than depending on backend-specific APIs.

The backend is
[`MorphologicalAttributeFilters`](https://github.com/wonderalexandre/MorphologicalAttributeFilters)
/ `mmcfilters`, but the top-level Python package `mmcfilters` is not required
as a runtime dependency of `mtlearn`.

## Current Scope

**MTLearn** is a research-oriented library. CFP is the first validated member of
a broader planned family of trainable connected-operator layers. The current
implementation supports max-tree and min-tree CFP workflows, multi-attribute
groups, dataset-level attribute normalization, cached preprocessing, and
PyTorch forward/backward for CFP parameters on CPU or CUDA tensors.

## Citation

If you use the CFP layer in your work, please cite:

> Wonder A. L. Alves, Lucas de P. O. Santos, Ronaldo F. Hashimoto, Nicolas Passat, Anderson H. R. Souza, Dennis J. Silva, Yukiko Kenmochi. **A trainable connected filter preprocessing layer based on component trees.** International Conference on Pattern Recognition (ICPR), 2026, Lyon, France. ⟨[hal-05575141](https://hal.science/hal-05575141/)⟩
