Learning Rate Free Learning for Adam, SGD and AdaGrad

D-Adaptation

Learning rate free learning for SGD, AdaGrad and Adam!

by Aaron Defazio and Konstantin Mishchenko (arXiv)

pip install dadaptation

Details

The provided PyTorch optimizer classes are drop-in replacements: either copy them into your project, or use them via dadaptation.DAdaptSGD, dadaptation.DAdaptAdam, or dadaptation.DAdaptAdaGrad.

  • Set the lr parameter to 1.0. This parameter is not ignored; setting it larger or smaller directly scales the D-Adapted learning rate up or down.
  • Use the same learning rate scheduler you would normally use on the problem.
  • The Adam variant supports AdamW-style weight decay; just set decouple=True. It is not enabled by default, so if you are replacing an AdamW implementation, make sure to pass decouple=True.
  • You may need a larger weight decay than usual; try a factor of 2 or 4 larger if you see overfitting. D-Adaptation uses larger learning rates than people typically hand-choose, which in some cases calls for more decay.
  • Use the log_every setting to see the learning rate being used (d*lr) and the current D bound.
  • Only the AdaGrad version supports sparse gradients.
  • The Adam IP variant implements a tighter D bound, which may help on some problems. The IP variants should be considered experimental.
  • If you encounter divergence early on and are not already using learning rate warmup, try changing growth_rate to match a reasonable warmup schedule rate for your problem.

Experimental results

(Figures: experimental results on vision benchmarks.)

License

See the License file.
