
ASDL: Automatic Second-order Differentiation (for Fisher, Gradient covariance, Hessian, Jacobian, and Kernel) Library

Project description

ASD(FGHJK)L

The library is called "ASDL", which stands for Automatic Second-order Differentiation (for Fisher, Gradient covariance, Hessian, Jacobian, and Kernel) Library. ASDL is a PyTorch extension for computing 1st/2nd-order metrics and performing 2nd-order optimization of deep neural networks.

You can import asdfghjkl by sliding your finger on a QWERTY keyboard :innocent:

```python
import asdfghjkl
```

ADL vs ASDL

Basic metrics supported by a standard automatic differentiation library (ADL)

| metric | definition |
| --- | --- |
| neural network | f_θ: x ↦ f(x; θ) |
| loss | L(θ) = (1/N) Σᵢ ℓ(f(xᵢ; θ), yᵢ) |
| (averaged) gradient | ∇L(θ) = (1/N) Σᵢ ∇_θ ℓ(f(xᵢ; θ), yᵢ) |
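As a point of reference, these basic metrics can be computed in plain PyTorch (no ASDL needed). A minimal sketch, assuming a toy linear model and cross-entropy loss:

```python
# Basic ADL metrics in plain PyTorch: a network f(x; θ), the averaged
# loss L(θ), and its gradient ∇L(θ). Model and data here are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)
N, D_in, D_out = 8, 4, 3
model = nn.Linear(D_in, D_out)          # f(x; θ)
x = torch.randn(N, D_in)
y = torch.randint(0, D_out, (N,))

# L(θ): cross-entropy averaged over the N examples
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()                          # ∇L(θ), accumulated into .grad
```

After `backward()`, each parameter's `.grad` holds the averaged gradient with the same shape as the parameter itself.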

Advanced 1st/2nd-order metrics (FGHJK) supported by ASDL

| metric | definition |
| --- | --- |
| Fisher information matrix | F(θ) = (1/N) Σᵢ E_{p(y\|xᵢ)} [∇log p(y\|xᵢ; θ) ∇log p(y\|xᵢ; θ)ᵀ] |
| Fisher information matrix (MC estimation) | the expectation over p(y\|xᵢ) replaced by Monte Carlo samples ŷᵢ ~ p(y\|xᵢ) |
| empirical Fisher | F_emp(θ) = (1/N) Σᵢ ∇ℓᵢ ∇ℓᵢᵀ |
| Gradient covariance | C(θ) = (1/N) Σᵢ (∇ℓᵢ − ∇L(θ))(∇ℓᵢ − ∇L(θ))ᵀ |
| Hessian | H(θ) = ∇²L(θ) |
| Jacobian (per example) | Jᵢ = ∂f(xᵢ; θ)/∂θ |
| Jacobian | J: the per-example Jacobians Jᵢ stacked over all N examples |
| Kernel | K(θ) = J Jᵀ (e.g. the neural tangent kernel) |
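To make one of these metrics concrete, here is a hedged sketch of the empirical Fisher in plain PyTorch — this is not the ASDL API, just the definition F_emp = (1/N) Σᵢ ∇ℓᵢ ∇ℓᵢᵀ written out with per-example gradients; the model and data are illustrative:

```python
# Empirical Fisher F_emp = (1/N) Σ_i ∇ℓ_i ∇ℓ_iᵀ, built from per-example
# gradients of a toy model. Naive O(N) loop, for clarity not speed.
import torch
import torch.nn as nn

torch.manual_seed(0)
N, D_in, D_out = 8, 4, 3
model = nn.Linear(D_in, D_out)
x = torch.randn(N, D_in)
y = torch.randint(0, D_out, (N,))
params = list(model.parameters())

per_example_grads = []
for i in range(N):
    loss_i = nn.functional.cross_entropy(model(x[i:i + 1]), y[i:i + 1])
    g_i = torch.autograd.grad(loss_i, params)     # ∇ℓ_i, one tensor per param
    per_example_grads.append(torch.cat([g.flatten() for g in g_i]))

G = torch.stack(per_example_grads)                # (N, P) matrix of ∇ℓ_i
F_emp = G.T @ G / N                               # (P, P), symmetric PSD
```

Libraries like ASDL exist precisely because this dense (P, P) matrix is intractable for real networks, which is where the approximations and matrix-free operations below come in.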

Matrix approximations

Supported operations

  • matrix-vector product
    • power method
    • conjugate gradient method
  • preconditioning gradient
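The matrix-vector product and power method can be sketched without ASDL: a Hessian-vector product Hv via double backpropagation never forms H explicitly, and power iteration on top of it estimates the largest Hessian eigenvalue. A minimal sketch in plain PyTorch (toy model and loop counts are illustrative, not the library's implementation):

```python
# Matrix-free Hessian-vector product Hv (double backprop) and the power
# method for the top Hessian eigenvalue of a toy convex loss.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(4, 3)
x = torch.randn(8, 4)
y = torch.randint(0, 3, (8,))
params = list(model.parameters())

def hvp(v):
    """Compute H @ v without materializing the Hessian H."""
    loss = nn.functional.cross_entropy(model(x), y)
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat = torch.cat([g.flatten() for g in grads])
    hv = torch.autograd.grad(flat @ v, params)    # second backward pass
    return torch.cat([h.flatten() for h in hv])

P = sum(p.numel() for p in params)
v = torch.randn(P)
v = v / v.norm()
for _ in range(50):                               # power iteration
    hv = hvp(v)
    v = hv / hv.norm()
top_eig = v @ hvp(v)                              # Rayleigh quotient
```

The conjugate gradient method solves Hx = b with the same `hvp` primitive, and preconditioning the gradient amounts to applying such a solver (or an approximate inverse) to ∇L.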



Download files


Source Distribution

asdfghjkl-0.1a2.tar.gz (32.3 kB)

Built Distribution

asdfghjkl-0.1a2-py3-none-any.whl (38.2 kB)
