Metadata-Version: 2.3
Name: rawformer
Version: 0.2.0
Summary: Transformers from scratch
Author: Nicholas Jacobs
Author-email: Nicholas Jacobs <59352054+njacobs2019@users.noreply.github.com>
Requires-Dist: beartype>=0.22.9
Requires-Dist: einops>=0.8.2
Requires-Dist: jaxtyping>=0.3.9
Requires-Dist: torch>=2.9,<3
Requires-Dist: torch>=2.9,<3 ; extra == 'cpu'
Requires-Dist: torch>=2.9,<3 ; extra == 'cuda'
Requires-Python: >=3.12
Project-URL: Homepage, https://github.com/njacobs2019/rawformer
Project-URL: Issues, https://github.com/njacobs2019/rawformer/issues
Provides-Extra: cpu
Provides-Extra: cuda
Description-Content-Type: text/markdown

# rawformer

Transformers from scratch.


This library implements transformer architectures in pure PyTorch. It is not production-ready; it exists primarily for my personal use and understanding.

Supported architectures:
- ViT

Supported positional encodings:
- Learned
- RoPE (1D and 2D)
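
The 1D RoPE listed above can be sketched in a few lines. This is an illustrative plain-Python sketch of the rotation idea, not this library's actual implementation (which operates on PyTorch tensors):

```python
import math

def rope_1d(x: list[float], pos: int, base: float = 10000.0) -> list[float]:
    """Rotate consecutive feature pairs of x by position-dependent angles.

    x must have even length; pos is the token's integer position.
    """
    d = len(x)
    out = []
    for i in range(0, d, 2):
        theta = pos * base ** (-i / d)
        c, s = math.cos(theta), math.sin(theta)
        # 2D rotation of the pair (x[i], x[i+1]) by angle theta
        out.extend([x[i] * c - x[i + 1] * s,
                    x[i] * s + x[i + 1] * c])
    return out

# Each pair's norm is preserved, so query/key dot products end up
# depending only on relative position.
q = [1.0, 0.0, 1.0, 0.0]
print(rope_1d(q, pos=0))  # position 0 applies no rotation
```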

## Installation

```bash
pip install rawformer
```

This library uses runtime checks (via Beartype and jaxtyping) to validate inputs and raise clearer error messages early. NOTE: Beartype is incompatible with `torch.compile`.
- Turn off runtime type checking with the environment variable `BEARTYPE=0`
- Turn off the Python interpreter's `assert` statements with the environment variable `PYTHONOPTIMIZE=1`
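
The `PYTHONOPTIMIZE=1` behavior is standard Python rather than anything specific to this library; a quick self-contained demonstration:

```python
import os
import subprocess
import sys

# A snippet that trips an assert immediately.
snippet = "assert False, 'runtime shape check'"

# Normal run: the assert fires and the process exits with an error.
normal = subprocess.run([sys.executable, "-c", snippet], capture_output=True)

# With PYTHONOPTIMIZE=1 the interpreter strips assert statements,
# so the same snippet exits cleanly.
optimized = subprocess.run(
    [sys.executable, "-c", snippet],
    capture_output=True,
    env={**os.environ, "PYTHONOPTIMIZE": "1"},
)

print(normal.returncode, optimized.returncode)  # → 1 0
```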

## Developer install
```bash
uv sync --extra cpu
uv sync --extra cuda

pre-commit install
pre-commit run --all-files

uv pip install -e .
```
