Metadata-Version: 2.3
Name: pytorch-utilities
Version: 0.0.1
Summary: Torch-Utils, a library of deep learning development utilities for PyTorch.
Project-URL: Homepage, https://github.com/arawxx/torch-utils
Project-URL: Repository, https://github.com/arawxx/torch-utils
Author-email: Arash Hajian nezhad <arash.hajiannezhad@gmail.com>
License-Expression: MIT
License-File: LICENSE
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Information Technology
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Internet
Classifier: Topic :: Software Development
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Utilities
Requires-Python: >=3.8
Requires-Dist: torch<=2.2,>=1.8.0
Description-Content-Type: text/markdown

# torch-utils

This repository contains useful functions and classes for deep learning engineers working with PyTorch.

## Installation

You can install this package using pip. The name of the package on PyPI is **pytorch-utilities**:

`pip install pytorch-utilities`

## Cosine Annealing with Linear Warmup Learning Rate

Using this scheduler is as simple as using any built-in PyTorch scheduler.
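
Conceptually, the schedule increases the learning rate linearly from near zero to the base learning rate over the first `warmup_epochs`, then decays it along a cosine curve until `max_epochs`. Here is a minimal sketch of that shape (the `lr_at_epoch` helper is hypothetical, assuming warmup ramps from near zero and annealing ends at a minimum LR of 0; the library's exact boundary handling may differ):

```python
import math

def lr_at_epoch(epoch, base_lr=5e-4, warmup_epochs=5, max_epochs=100, min_lr=0.0):
    """Sketch of the schedule's shape; not the library's actual implementation."""
    if epoch < warmup_epochs:
        # Linear warmup: ramp from base_lr / warmup_epochs up to base_lr
        return base_lr * (epoch + 1) / warmup_epochs
    # Cosine annealing: decay from base_lr down to min_lr
    progress = (epoch - warmup_epochs) / (max_epochs - warmup_epochs)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))
```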

Example usage:

```python
import torch
from torch import nn
from torch.optim import AdamW
from torch.utils.data import DataLoader, TensorDataset
from torchutils.schedulers import CosineAnnealingLinearWarmup


# A minimal model, dataloader, and loss function for demonstration;
# substitute your own here
model = nn.Linear(10, 2)
dataset = TensorDataset(torch.randn(64, 10), torch.randn(64, 2))
dataloader = DataLoader(dataset, batch_size=8)
loss_fn = nn.MSELoss()

# Initialize the optimizer and scheduler
optimizer = AdamW(model.parameters(), lr=0.0005)
scheduler = CosineAnnealingLinearWarmup(optimizer, warmup_epochs=5, max_epochs=100)

# To step the scheduler after each iteration (batch) instead of each epoch,
# scale warmup_epochs and max_epochs by the number of batches per epoch:
# scheduler = CosineAnnealingLinearWarmup(optimizer, warmup_epochs=5 * len(dataloader), max_epochs=100 * len(dataloader))

# Training loop
for epoch in range(100):
    for inputs, targets in dataloader:
        optimizer.zero_grad()

        # Forward pass
        outputs = model(inputs)

        # Compute loss
        loss = loss_fn(outputs, targets)

        # Backward pass and optimization
        loss.backward()
        optimizer.step()

        # If stepping the scheduler after each iteration (batch), uncomment:
        # scheduler.step()

    # If stepping the scheduler once per epoch, do it here
    scheduler.step()
```
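
To sanity-check the schedule, you can record the learning rate at each epoch. This sketch assumes the scheduler follows the standard `torch.optim.lr_scheduler` interface and exposes `get_last_lr()`, which is an assumption about this library's implementation:

```python
from torch import nn
from torch.optim import AdamW
from torchutils.schedulers import CosineAnnealingLinearWarmup

optimizer = AdamW(nn.Linear(4, 4).parameters(), lr=0.0005)
scheduler = CosineAnnealingLinearWarmup(optimizer, warmup_epochs=5, max_epochs=100)

lrs = []
for _ in range(100):
    lrs.append(scheduler.get_last_lr()[0])  # assumes the standard _LRScheduler API
    optimizer.step()  # step the optimizer first to avoid PyTorch's ordering warning
    scheduler.step()

# Expect the first 5 entries to rise linearly, then a cosine decay toward the end
print(lrs[:5], lrs[-1])
```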
