Metadata-Version: 2.4
Name: pyautoencoder
Version: 1.0.2
Summary: A Python package offering implementations of state-of-the-art autoencoder architectures in PyTorch.
Home-page: https://github.com/andrea-pollastro/pyautoencoder
Author: Andrea Pollastro
License: MIT
Keywords: autoencoder,vae,pytorch,deep learning,machine learning,representation learning,dimensionality reduction,generative models
Classifier: Operating System :: OS Independent
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.7
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: torch>=2.0.0
Dynamic: author
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: keywords
Dynamic: license
Dynamic: license-file
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

![logo](https://raw.githubusercontent.com/andrea-pollastro/pyautoencoder/main/assets/logo_nobackground.png)

**pyautoencoder** is a lightweight Python package offering clean, minimal implementations of foundational autoencoder architectures in PyTorch. 
It is designed for researchers, educators, and practitioners seeking a reliable base for experimentation, extension, or instruction.

## 📦 Installation

```bash
pip install pyautoencoder
```

Or install from source:
```bash
git clone https://github.com/andrea-pollastro/pyautoencoder.git
cd pyautoencoder
pip install -e .
```

## 🚀 Quick Example

```python
import torch
from pyautoencoder.models import Autoencoder

# Define encoder and decoder
encoder = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(784, 32)
)

decoder = torch.nn.Sequential(
    torch.nn.Linear(32, 784),
    torch.nn.Unflatten(1, (1, 28, 28))
)

# Initialize model
model = Autoencoder(encoder, decoder)

# Forward pass
x = torch.randn(64, 1, 28, 28)
x_hat, z = model(x)
```
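A typical training step pairs the `(x_hat, z)` forward pass above with a reconstruction loss. The sketch below uses a plain PyTorch stand-in module that mirrors that interface (so it runs without the package installed); in practice you would pass your encoder/decoder to `pyautoencoder.models.Autoencoder` instead of `TinyAE`:

```python
import torch

# Stand-in mirroring the (x_hat, z) interface of the quick example;
# substitute pyautoencoder.models.Autoencoder in real code.
class TinyAE(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = torch.nn.Sequential(
            torch.nn.Flatten(),
            torch.nn.Linear(784, 32),
        )
        self.decoder = torch.nn.Sequential(
            torch.nn.Linear(32, 784),
            torch.nn.Unflatten(1, (1, 28, 28)),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = TinyAE()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

x = torch.randn(64, 1, 28, 28)  # dummy batch standing in for real data
for step in range(5):
    optimizer.zero_grad()
    x_hat, z = model(x)
    loss = loss_fn(x_hat, x)    # reconstruction loss
    loss.backward()
    optimizer.step()
```

With a real dataset, the dummy batch would come from a `DataLoader` and the loop would iterate over epochs.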

## 🗺️ Roadmap
- [x] Autoencoder (AE)
- [x] Variational Autoencoder (VAE)
- [ ] Hierarchical VAE (HVAE)
- [ ] Importance-Weighted AE (IWAE)
- [ ] Denoising Autoencoder (DAE)
- [ ] Sparse Autoencoder (SAE)
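For the VAE listed above, the key difference from a plain AE is that the encoder outputs distribution parameters and a latent sample is drawn via the reparameterization trick, with a KL term added to the reconstruction loss. The following self-contained sketch illustrates that idea only; it is not the package's API (class and layer names here are made up for illustration):

```python
import torch

# Minimal VAE sketch: encoder emits mean and log-variance of q(z|x),
# a sample is drawn with the reparameterization trick, and the loss
# adds KL(q(z|x) || N(0, I)) to the reconstruction term.
class TinyVAE(torch.nn.Module):
    def __init__(self, dim_in=784, dim_z=32):
        super().__init__()
        self.enc = torch.nn.Linear(dim_in, 2 * dim_z)  # -> [mu, logvar]
        self.dec = torch.nn.Linear(dim_z, dim_in)

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization
        return self.dec(z), mu, logvar

model = TinyVAE()
x = torch.randn(8, 784)
x_hat, mu, logvar = model(x)

# Closed-form KL divergence to a standard normal prior, averaged over the batch
kl = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(dim=-1).mean()
loss = torch.nn.functional.mse_loss(x_hat, x) + kl
```

See the package's VAE implementation for the actual class name and forward signature.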

## 🤝 Contributing
Contributions are welcome — especially new autoencoder variants, training examples, and documentation improvements.
Please open an issue or pull request to discuss any changes.

## 📝 Citing
```bibtex
@misc{pollastro2025pyautoencoder,
  author       = {Andrea Pollastro},
  title        = {pyautoencoder},
  year         = {2025},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/andrea-pollastro/pyautoencoder}}
}
```

## 📄 License
This project is licensed under the MIT License. See the LICENSE file for details.
