A Warmup Scheduler for Pytorch
Description
A warmup scheduler for PyTorch that ramps the learning rate up from a small initial value at the beginning of training.
Setup
Note: PyTorch >= 1.1.0 must be installed manually first.
The official website of pytorch is: https://pytorch.org/
Then install as follows:
pip install warmup_scheduler_pytorch
Usage
See the example.py file for details.
import torch
from torch.optim import SGD  # example optimizer
from torch.optim.lr_scheduler import CosineAnnealingLR  # example scheduler
from warmup_scheduler_pytorch import WarmUpScheduler

model = Model()
optimizer = SGD(model.parameters(), lr=0.1)
lr_scheduler = CosineAnnealingLR(optimizer, T_max=100, eta_min=0.01)
data_loader = torch.utils.data.DataLoader(...)
warmup_scheduler = WarmUpScheduler(optimizer, lr_scheduler,
                                   len_loader=len(data_loader),
                                   warmup_steps=100,
                                   warmup_start_lr=0.01,
                                   warmup_mode='linear')

epochs = 100
for epoch in range(epochs):
    for batch_data in data_loader:
        output = model(...)
        # loss = loss_fn(output, ...)
        # loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        warmup_scheduler.step()  # call once per batch
        # lr_scheduler.step() is no longer needed
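The warmup_mode='linear' setting suggests that during warmup the learning rate is interpolated linearly from warmup_start_lr up to the optimizer's base lr over warmup_steps batches, after which the wrapped scheduler takes over. A minimal sketch of that linear schedule (the function name and exact internals here are illustrative, not the library's actual implementation):

```python
def linear_warmup_lr(step, warmup_steps, warmup_start_lr, base_lr):
    """Sketch: linearly interpolate from warmup_start_lr to base_lr
    over warmup_steps batches, then hold at base_lr."""
    if step >= warmup_steps:
        return base_lr
    return warmup_start_lr + (base_lr - warmup_start_lr) * step / warmup_steps

# Using the values from the example above (base lr 0.1, start 0.01, 100 steps):
print(linear_warmup_lr(0, 100, 0.01, 0.1))    # starts at warmup_start_lr
print(linear_warmup_lr(50, 100, 0.01, 0.1))   # halfway between 0.01 and 0.1
print(linear_warmup_lr(100, 100, 0.01, 0.1))  # reaches the base lr
```

After the warmup phase ends, the cosine annealing schedule (CosineAnnealingLR above) controls the learning rate as usual.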
Download files
Download the file for your platform.
Source Distribution: warmup_scheduler_pytorch-0.1.1.tar.gz
Built Distribution: warmup_scheduler_pytorch-0.1.1-py3-none-any.whl
Hashes for warmup_scheduler_pytorch-0.1.1.tar.gz

Algorithm | Hash digest
---|---
SHA256 | b926e34e797308428b657df64fcca72f750d39821dbda072e44772e456b5d176
MD5 | 9270ee90a74d03bb86930b20f3abd7b7
BLAKE2b-256 | e8e9851dcbc19db041ace06d8b6485bcfd161c124291f1fa9efde66b499ac114
Hashes for warmup_scheduler_pytorch-0.1.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | ad69b6ce684fef9693f827b22153929a9e3a25924a36a3bcc35fb9eb376f281a
MD5 | 04d5f42d4d88457bc7b61762dfe304b4
BLAKE2b-256 | 547bd3ccbffda9bfa1e80cffd9d47cffc4f9605ae4d6ad4893201f4a42610908