
Project description

Bristol

arXiv:1704.08303 | arXiv:1911.07831 | arXiv:2006.13687 | Zenodo:Archive:v0.1.8 | Zenodo:Surrogate Matrices Data

Parallel random matrix tools and random matrix theory applications to deep learning. Generate matrices from the Circular Unitary Ensemble (CUE), the Circular Orthogonal Ensemble (COE) and the Circular Symplectic Ensemble (CSE). Additional spectral analysis utilities are also implemented, such as computation of the spectral density and of spectral ergodicity for measuring the complexity of deep learning architectures.
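The spectral density mentioned above is the distribution of eigenvalues of a matrix ensemble. As a minimal sketch, independent of bristol's own API (the helper name eigenphase_density is hypothetical), the eigenvalue phases of a batch of matrices can be histogrammed with NumPy:

```python
import numpy as np

def eigenphase_density(matrices, bins=32):
    """Normalized histogram of eigenvalue phases over a list of matrices."""
    phases = np.concatenate([np.angle(np.linalg.eigvals(m)) for m in matrices])
    # density=True normalizes the histogram so it integrates to 1.
    density, edges = np.histogram(phases, bins=bins,
                                  range=(-np.pi, np.pi), density=True)
    return density, edges

rng = np.random.default_rng(0)
mats = [rng.normal(size=(16, 16)) for _ in range(8)]
density, edges = eigenphase_density(mats)
```

For circular ensembles all eigenvalues lie on the unit circle, so the phase histogram fully characterizes the spectrum.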

Features

  • Generation of circular ensembles: CUE, COE and CSE.
  • Random matrices: reproducibility in both serial and parallel processing.
  • Eigenvalue spectra and spectral density.
  • Kullback-Leibler divergence and spectral ergodicity measure functionality.
  • Cascading Periodic Spectral Ergodicity (cPSE).
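To illustrate what CUE generation produces (this is a standalone sketch of Mezzadri's method, QR of a complex Ginibre matrix with phase correction, not bristol's own API; sample_cue is a hypothetical helper):

```python
import numpy as np

def sample_cue(n, rng):
    """Sample an n x n Haar-distributed unitary matrix (CUE)."""
    # Complex Ginibre matrix: i.i.d. standard complex Gaussian entries.
    z = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    # Rescale columns by the phases of R's diagonal so Q is Haar-distributed.
    d = np.diagonal(r)
    return q * (d / np.abs(d))

rng = np.random.default_rng(42)
u = sample_cue(8, rng)
# u is unitary, so all of its eigenvalues lie on the unit circle.
```

Without the phase correction, plain `np.linalg.qr` output is not uniformly distributed with respect to Haar measure.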

Installation

Install with pip from PyPI.

pip install bristol

To use the latest development version

pip install --upgrade git+https://github.com/msuzen/bristol.git

Documentation

Complexity of a deep learning model: cPSE

Vanilla case

In the vanilla case, a list of matrices representing an ordered set of weight matrices can be used to compute cPSE over layers. As an example:

from bristol import cPSE
import numpy as np
np.random.seed(42)
matrices = [np.random.normal(size=(64,64)) for _ in range(10)]
(d_layers, cpse) = cPSE.cpse_measure_vanilla(matrices)

Even for a set of random Gaussian matrices, d_layers decreases. Note that different layer types should first be converted to a matrix format, e.g., CNN kernels to 2D matrices. See the main paper.
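As a hypothetical illustration of that conversion step (shapes below are assumptions, not from the paper), a 4D convolutional kernel can be flattened to a 2D matrix with a reshape before being passed to the vanilla routine:

```python
import numpy as np

# A Conv2d-style kernel: (out_channels, in_channels, kernel_h, kernel_w).
conv_w = np.random.normal(size=(32, 16, 3, 3))

# Flatten everything except the output-channel axis into one dimension,
# yielding a (out_channels, in_channels * kernel_h * kernel_w) matrix.
mat = conv_w.reshape(conv_w.shape[0], -1)   # shape (32, 144)
```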

For torch models

Your model must be in PyTorch's pretrained-model format. For example, for VGG, simply use the cPSE.cpse_measure function:

from bristol import cPSE
import torchvision.models as models
netname = 'vgg11'
pmodel = getattr(models, netname)(pretrained=True)
(d_layers, cpse) = cPSE.cpse_measure(pmodel)

This returns cpse, a single number expressing the complexity of your network, and d_layers, the evolution of periodic spectral ergodicity within layers as a vector whose order matters.

Prototype notebooks

  • Basics of circular ensembles ipynb.

  • Computing spectral ergodicity for generated matrices ipynb. This reproduces the main figure from arXiv:1704.08693.

  • The concept of cascading periodic spectral ergodicity (cPSE) ipynb. This only reproduces the paper's results from arXiv:1911.07831.

  • Empirical deviations of the semicircle law in mixed-matrix ensembles,
    M. Suezen, hal-03464130 | ipynb. Reproduces the work with the same title.

Contact

  • Please create an issue for any questions, or contact msuzen.

References

  • Berry, M. V. & Shukla, P. (2013), Hearing random matrices and random waves, New J. Phys. 15, 013026 (11pp).

  • Spectral Ergodicity in Deep Learning Architectures via Surrogate Random Matrices, Mehmet Süzen, Cornelius Weber, Joan J. Cerdà, arXiv:1704.08693

  • Periodic Spectral Ergodicity: A Complexity Measure for Deep Neural Networks and Neural Architecture Search, Mehmet Süzen, Cornelius Weber, Joan J. Cerdà, arXiv:1911.07831

  • Empirical deviations of the semicircle law in mixed-matrix ensembles,
    M. Suezen, hal-03464130 | ipynb. Reproduces the work with the same title.

Citation

If you use the ideas or tools from this package, please cite our manuscripts.

@article{suezen2017a,
    title={Spectral Ergodicity in Deep Learning Architectures via Surrogate Random Matrices},
    author={Mehmet Süzen and Cornelius Weber and Joan J. Cerdà},
    year={2017},
    eprint={1704.08303},
    archivePrefix={arXiv},
    primaryClass={stat.ML}
}
@article{suezen2019a,
    title={Periodic Spectral Ergodicity: A Complexity Measure for Deep Neural Networks and Neural Architecture Search},
    author={Mehmet Süzen and Cornelius Weber and Joan J. Cerdà},
    year={2019},
    eprint={1911.07831},
    archivePrefix={arXiv},
    primaryClass={stat.ML}
}

Download files


Source Distribution

bristol-0.2.12.tar.gz (26.7 kB)

Uploaded Source

File details

Details for the file bristol-0.2.12.tar.gz.

File metadata

  • Download URL: bristol-0.2.12.tar.gz
  • Upload date:
  • Size: 26.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.8.2 pkginfo/1.8.2 requests/2.23.0 requests-toolbelt/0.8.0 tqdm/4.62.3 CPython/3.7.3

File hashes

Hashes for bristol-0.2.12.tar.gz:

  • SHA256: c05286d13e551bcbe823af3626e7d29ffe7db517131d3c9a78caa3fcbad4c658
  • MD5: b11661c438ab8c7ecd4019123a0e5124
  • BLAKE2b-256: 2b29a01629a8aa715a899a5d62936e2d6431b4c0cad9c54114ba5dc17dd4cb02

