
ABEL Scheduler

Project description

How to decay your learning rate (PyTorch)

PyTorch implementation of the ABEL learning-rate scheduler, which decays the learning rate based on the norm of the model's weights. If you find this work interesting, do consider starring the repository. If you use it in your research, don't forget to cite!

Original paper

Docs

Installation

WIP - not available on PyPI yet.

pip install abel-pytorch
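
Until the package is published, it can presumably be installed straight from the GitHub repository (the URL cited below) using pip's standard VCS support:

pip install git+https://github.com/tourdeml/abel-pytorch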

Usage

import torch
from torch import nn, optim
from torchvision.models import resnet18  # example model
from abel import ABEL

model = resnet18()
optimizer = optim.SGD(model.parameters(), lr=1e-3)  # avoid shadowing the optim module
scheduler = ABEL(optimizer, 0.9)

for i, (images, labels) in enumerate(trainloader):
  # forward pass, loss computation, and loss.backward() go here...
  optimizer.step()
  scheduler.step()
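
For intuition, here is a simplified sketch of the rule the paper describes, not the package's exact implementation (the helper names weight_norm and maybe_decay are made up for illustration): track the model's weight norm once per epoch, and when the norm "bounces", i.e. the sign of its first difference flips, multiply the learning rate by the decay factor. The paper's full algorithm adds further bookkeeping, such as an extra decay near the end of training.

import torch

def weight_norm(model):
  # L2 norm over all parameters, detached from the autograd graph
  return torch.sqrt(sum((p.detach() ** 2).sum() for p in model.parameters())).item()

norms = []          # one weight-norm reading per epoch
decay_factor = 0.9  # plays the role of ABEL's second argument above

def maybe_decay(optimizer, model):
  # Call once per epoch: decay the LR when the weight norm bounces.
  norms.append(weight_norm(model))
  if len(norms) >= 3:
    prev_diff = norms[-2] - norms[-3]
    curr_diff = norms[-1] - norms[-2]
    if prev_diff * curr_diff < 0:  # sign flip in the norm's trend
      for group in optimizer.param_groups:
        group["lr"] *= decay_factor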

Cite original paper:

@article{lewkowycz2021decay,
  title={How to decay your learning rate},
  author={Lewkowycz, Aitor},
  journal={arXiv preprint arXiv:2103.12682},
  year={2021}
}

Cite this work:

@misc{abel2021pytorch,
  author = {Vaibhav Balloli},
  title = {A PyTorch implementation of ABEL},
  year = {2021},
  howpublished = {\url{https://github.com/tourdeml/abel-pytorch}}
}

Download files

Source Distribution

abel-pytorch-0.0.1.tar.gz (3.1 kB)

Built Distribution

abel_pytorch-0.0.1-py3-none-any.whl (4.2 kB)
