Metadata-Version: 2.4
Name: dropwise
Version: 0.1.1
Summary: Monte Carlo Dropout-based uncertainty estimation for Transformers
Home-page: https://github.com/aryanator/dropwise
Author: Aryan Patil
Author-email: aryanator01@gmail.com
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.7
Description-Content-Type: text/markdown
License-File: LICENCE
Requires-Dist: torch
Requires-Dist: transformers
Dynamic: author
Dynamic: author-email
Dynamic: classifier
Dynamic: description
Dynamic: description-content-type
Dynamic: home-page
Dynamic: license-file
Dynamic: requires-dist
Dynamic: requires-python
Dynamic: summary

# Dropwise

**Dropwise** is a lightweight PyTorch/HuggingFace wrapper for performing Monte Carlo Dropout–based uncertainty estimation in Transformers. It enables confidence-aware decision making by revealing how certain a model is about its predictions — all with just a few lines of code.

---

## 🚀 Features

- ✅ Enable dropout during inference for **Bayesian-like uncertainty** estimation
- ✅ Compute **predictive entropy**, **confidence**, and **per-class standard deviation**
- ✅ Works seamlessly with **Hugging Face Transformers** and **PyTorch**
- ✅ Supports **batch inference**, **CPU/GPU**, and customizable `num_passes`
- ✅ Cleanly packaged and extensible for research or production
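The core trick behind MC Dropout is to keep dropout layers stochastic at inference time while the rest of the model stays in eval mode. A minimal sketch of that idea in plain PyTorch (illustrative only, not Dropwise's exact implementation):

```python
import torch.nn as nn

def enable_mc_dropout(model: nn.Module) -> None:
    """Put the model in eval mode, then re-enable only the dropout layers,
    so repeated forward passes give stochastic (Bayesian-like) predictions."""
    model.eval()
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.train()
```

This matters because a blanket `model.train()` would also switch layers like batch norm into training behavior, which would corrupt the predictions.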

---

## 🤖 Supported Models

Dropwise works with any `AutoModelForSequenceClassification`-compatible model, including:

- `bert-base-uncased`, `bert-large-uncased`
- `roberta-base`, `roberta-large`
- `microsoft/deberta-v3-base`, `microsoft/deberta-v2-xlarge`
- `albert-base-v2`, `distilbert-base-uncased`
- Any custom fine-tuned Hugging Face model for classification

> ⚠️ Note: The model must contain dropout layers (most pretrained transformers do).
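You can verify this quickly with a generic check (plain PyTorch, not part of the Dropwise API):

```python
import torch.nn as nn

def has_dropout(model: nn.Module) -> bool:
    """Return True if the model contains at least one nn.Dropout layer."""
    return any(isinstance(m, nn.Dropout) for m in model.modules())
```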

---

## 📦 Installation

```bash
pip install dropwise
```

Or install locally for development:

```bash
git clone https://github.com/aryanator/dropwise.git
cd dropwise
pip install -e .
```

---

## 🧠 Example Usage

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from dropwise import DropwisePredictor

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

predictor = DropwisePredictor(model, tokenizer, num_passes=30)
result = predictor.predict("The performance was unexpectedly good!")

print("Predicted class:", result['predicted_class'].item())
print("Entropy:", result['entropy'].item())
print("Confidence:", result['probs'].max().item())
print("Per-class std dev:", result['std_dev'])
```

---

## 📊 Output Dictionary

- `predicted_class`: index of most probable class
- `entropy`: predictive entropy (higher = less confident)
- `std_dev`: standard deviation across MC passes for each class
- `mean_logits`: average logits before softmax
- `probs`: softmax probabilities
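Conceptually, these statistics come from stacking the outputs of the stochastic forward passes. A rough sketch of how such values can be derived (generic PyTorch for a single example, not Dropwise's internals):

```python
import torch

def mc_statistics(logits_per_pass: torch.Tensor) -> dict:
    """logits_per_pass: (num_passes, num_classes) logits collected from
    repeated forward passes with dropout enabled."""
    probs_per_pass = torch.softmax(logits_per_pass, dim=-1)
    mean_logits = logits_per_pass.mean(dim=0)   # average logits before softmax
    probs = probs_per_pass.mean(dim=0)          # averaged predictive distribution
    std_dev = probs_per_pass.std(dim=0)         # per-class spread across passes
    entropy = -(probs * probs.log()).sum()      # predictive entropy
    return {
        "predicted_class": probs.argmax(),
        "entropy": entropy,
        "std_dev": std_dev,
        "mean_logits": mean_logits,
        "probs": probs,
    }
```

High `entropy` means the averaged distribution is close to uniform; high `std_dev` means the passes disagree with each other, which is the signal dropout variation provides.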

---

## 🧠 Why Dropwise?

> Unlike deterministic predictions, Dropwise estimates uncertainty via stochastic forward passes — enabling confidence-aware applications.

**Use cases include:**

- Filtering low-confidence predictions
- Active learning and semi-supervised setups
- Detecting ambiguous, adversarial, or out-of-distribution inputs
- Enhancing interpretability and robustness in real-world deployment
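As a concrete example of the first use case, a simple confidence gate can compare predictive entropy against a fraction of the maximum possible entropy, `log(num_classes)` (the threshold fraction here is an illustrative choice, not a Dropwise default):

```python
import math

def is_confident(entropy: float, num_classes: int, frac: float = 0.5) -> bool:
    """Accept a prediction only when its predictive entropy is below
    frac * log(num_classes), the entropy of a uniform distribution."""
    return entropy < frac * math.log(num_classes)
```

Predictions that fail this check can be routed to a human reviewer or a fallback model instead of being acted on automatically.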

---

## 📂 Folder Structure

```
dropwise/
├── dropwise/
│   ├── __init__.py
│   └── predictor.py
├── tests/
│   ├── __init__.py
│   └── test_predictor.py
├── setup.py
├── README.md
├── LICENSE
```

---

## 🧪 Running Tests

```bash
python tests/test_predictor.py
```

---

## 📝 License

MIT License

---

Made with ❤️ for uncertainty-aware, explainable AI.
