
Validation of neural network equations in SymPy.


nerva-sympy


nerva-sympy provides symbolic validation of multilayer perceptron implementations using SymPy.
It is part of the Nerva project — a suite of Python and C++ libraries that provide well-specified, inspectable implementations of neural networks.

➡️ Unlike the other backends (nerva-torch, nerva-numpy, nerva-jax, nerva-tensorflow), which implement forward and backward passes, nerva-sympy is a testing library.
It derives exact symbolic gradients and checks them against the manually implemented backpropagation code.


🗺️ Overview

The nerva libraries aim to make neural networks mathematically precise and transparent.
While the implementation backends focus on execution, nerva-sympy ensures correctness:

  • Provides symbolic derivatives of activation functions, layers, and loss functions.
  • Validates handwritten backpropagation equations used in the other Nerva packages.
  • Detects implementation errors early by comparing intermediate symbolic and numeric results.
  • Avoids numerical gradient checking (finite differences) in favor of exact symbolic differentiation.
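As a minimal illustration of this approach, the following plain-SymPy sketch (not nerva-sympy's own API) validates the well-known handwritten backpropagation rule for the sigmoid activation, σ'(x) = σ(x)(1 − σ(x)), against an exact symbolic derivative:

```python
import sympy as sp

x = sp.Symbol('x', real=True)
sigmoid = 1 / (1 + sp.exp(-x))

# Exact symbolic derivative, computed by SymPy
symbolic = sp.diff(sigmoid, x)

# Manually derived backpropagation rule: sigma'(x) = sigma(x) * (1 - sigma(x))
manual = sigmoid * (1 - sigmoid)

# The two expressions are equivalent iff their difference simplifies to zero
assert sp.simplify(symbolic - manual) == 0
print("sigmoid derivative validated")
```

The same pattern (simplify the difference of the handwritten rule and the symbolic derivative to zero) underlies the library's validation tests.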

📦 Available Python Packages

Each backend has a dedicated PyPI package and GitHub repository of the same name:

Package            Backend
nerva-jax          JAX
nerva-numpy        NumPy
nerva-tensorflow   TensorFlow
nerva-torch        PyTorch
nerva-sympy        SymPy

📝 nerva-sympy depends on the other four packages, since it validates their implementations.

See the nerva meta-repo for an overview of all Python and C++ variants.


🚀 Quick Start

Installation

The library can be installed in two ways: from the source repository or from the Python Package Index (PyPI).

# Install from the local repository
pip install .
# Install directly from PyPI
pip install nerva-sympy

Example: Validate Softmax Backpropagation

This example validates the gradient computation of the softmax layer.
The manually implemented backpropagation rules are checked against symbolic differentiation.

# The symbols (W, b, X, Y, K) and helpers (hadamard, row_repeat, diag,
# rows_sum, gradient, equal_matrices) are provided by nerva-sympy;
# see the test suite for the full setup of this snippet.

# Backpropagation equations
DZ = hadamard(Y, DY - row_repeat(diag(Y.T * DY).T, K))
DW = DZ * X.T
Db = rows_sum(DZ)
DX = W.T * DZ

# Symbolic reference
DW1 = gradient(loss(Y), w)
Db1 = gradient(loss(Y), b)
DX1 = gradient(loss(Y), x)
DZ1 = gradient(loss(Y), z)

# Check equivalence
assert equal_matrices(DW, DW1)
assert equal_matrices(Db, Db1)
assert equal_matrices(DX, DX1)
assert equal_matrices(DZ, DZ1)
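For readers without the nerva-sympy setup, the same softmax identity can be checked for a single row in plain SymPy (the names below are illustrative, not the library's API). For one row, DZ = hadamard(Y, DY - row_repeat(diag(Y.T * DY).T, K)) reduces to dz = y ∘ (dy − ⟨y, dy⟩·1):

```python
import sympy as sp

K = 3
z = sp.Matrix(sp.symbols('z1 z2 z3', real=True))

# Softmax for a single row: y_i = exp(z_i) / sum_k exp(z_k)
denom = sum(sp.exp(zi) for zi in z)
y = sp.Matrix([sp.exp(zi) / denom for zi in z])

# Arbitrary upstream gradient dL/dy, standing in for the loss gradient
dy = sp.Matrix(sp.symbols('dy1 dy2 dy3', real=True))

# Manual rule: dz = y o (dy - <y, dy> * 1), the single-row analogue of
# DZ = hadamard(Y, DY - row_repeat(diag(Y.T * DY).T, K))
inner = (y.T * dy)[0, 0]
dz_manual = sp.Matrix([y[j] * (dy[j] - inner) for j in range(K)])

# Symbolic reference via the chain rule: dz_j = sum_i dy_i * d y_i / d z_j
dz_ref = sp.Matrix([sum(dy[i] * sp.diff(y[i], z[j]) for i in range(K))
                    for j in range(K)])

assert all(sp.simplify(e) == 0 for e in (dz_manual - dz_ref))
print("softmax backprop rule validated for K=3")
```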

🧪 Running Tests

Controls for output and verbosity

  • Individual test names (default): The helper script runs pytest with -v, so you see each test as it runs.
  • Suppress internal prints from tests: By default the test utilities do not print intermediate matrices/numbers. Set NERVA_TEST_VERBOSE=1 to enable those prints when needed.
  • Override pytest flags: Set NERVA_PYTEST_FLAGS to customize, e.g., NERVA_PYTEST_FLAGS="-ra -s" ./tests/run_all_tests.sh to also show print output from successful tests.
  • Unittest fallback: Uses -b (buffered) and -v by default.

To run the test suite locally:

  1. Install dependencies (pytest is optional but recommended for nicer output):
pip install -r requirements.txt
pip install pytest  # optional
  2. Run all tests via the helper script (it adds src to PYTHONPATH automatically):
./tests/run_all_tests.sh

You can pass additional arguments to the underlying test runner, for example:

# Run only tests whose names match "jacobian"
./tests/run_all_tests.sh -k "jacobian"

# Run a specific test file, class, or test case (pytest syntax)
./tests/run_all_tests.sh tests/test_softmax_functions.py::TestSoftmax::test_softmax

If you prefer to run without the script or are on a platform without bash:

# Using pytest
python3 -m pytest -s tests

# Or using unittest discovery
python3 -m unittest discover -s tests -p "test_*.py" -v

🧪 Validation Suite

The test suite covers activation functions, layers, loss functions, and matrix operations.
Each test compares symbolic derivatives to the manually implemented backpropagation code.

Available tests:

  • Activation Functions
    test_activation_functions.py, test_softmax_functions.py, test_softmax_function_derivations.py
    Validates symbolic gradients for Sigmoid, ReLU, SReLU, etc.

  • Layer Derivatives
    test_layer_linear.py, test_layer_softmax.py, test_layer_batch_normalization.py, test_layer_dropout.py, test_layer_srelu.py, test_layer_derivations.py
    Checks symbolic vs. manual backpropagation for individual layers.

  • Loss Functions
    test_loss_functions.py, test_loss_function_derivations.py
    Ensures correct gradient formulas for common loss functions.

  • Supporting Operations
    test_matrix_operations.py, test_one_hot.py, test_derivatives.py, test_lemmas.py
    Validates core symbolic building blocks.

Note on the cross-framework consistency test (tests/test_frameworks.py):

  • The test compares intermediates (X, Y, DY, T) across JAX, NumPy, TensorFlow and PyTorch while performing multiple SGD steps.
  • Currently, the test uses the AbsoluteError loss, which is not differentiable at 0. This sometimes causes the test to fail, even when all implementations are correct.
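The issue can be seen directly in SymPy: the derivative of |x| is sign(x), which has no well-defined value at x = 0, so each framework is free to pick its own subgradient convention there (a plain-SymPy sketch, not part of the test suite):

```python
import sympy as sp

x = sp.Symbol('x', real=True)

# SymPy differentiates |x| to sign(x) for real x
d = sp.diff(sp.Abs(x), x)
print(d)  # sign(x)

# Away from zero the gradient is unambiguous...
assert d.subs(x, 2) == 1 and d.subs(x, -3) == -1

# ...but at zero the true derivative does not exist; SymPy's sign(0)
# evaluates to 0, while other frameworks may choose differently there
print(d.subs(x, 0))
```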

🔢 Implementation Philosophy

Unlike frameworks that rely on autograd or approximate numerical checks:

  • nerva-sympy uses exact symbolic differentiation.
  • This avoids floating-point instability and provides rigorous correctness guarantees.
  • All computations are expressed in batch matrix form, consistent with the other nerva libraries.
  • The test suite acts as a ground truth oracle for verifying implementations in JAX, NumPy, PyTorch, and TensorFlow.
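The contrast with numerical gradient checking is easy to demonstrate in plain SymPy (an illustrative example, not part of the library): a central finite difference carries truncation and round-off error, while the symbolic derivative is exact by construction.

```python
import sympy as sp

x = sp.Symbol('x', real=True)
f = sp.exp(x) * sp.sin(x)

# Exact symbolic derivative: no approximation error at all
exact = sp.diff(f, x)  # exp(x)*sin(x) + exp(x)*cos(x)

# Central finite difference at x0 = 1 with step h, as used in
# numerical gradient checking
h = sp.Float('1e-5')
x0 = 1
approx = (f.subs(x, x0 + h) - f.subs(x, x0 - h)) / (2 * h)

# The finite difference is only close; the step size h trades
# truncation error against floating-point cancellation
err = abs(float(approx - exact.subs(x, x0)))
assert err < 1e-6
print("finite-difference error:", err)
```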

📚 Relevant Papers

  1. Nerva: a Truly Sparse Implementation of Neural Networks
     arXiv:2407.17437. Introduces the library and reports sparse training experiments.

  2. Batch Matrix-form Equations and Implementation of Multilayer Perceptrons
     arXiv:2511.11918. Includes mathematical specifications and derivations.


📜 License

Distributed under the Boost Software License 1.0; see the license file in the repository.


🙋 Contributing

Bug reports and contributions are welcome via the GitHub issue tracker.



Download files

Download the file for your platform.

Source Distribution

nerva_sympy-1.0.0.tar.gz (35.7 kB)

Built Distribution

nerva_sympy-1.0.0-py3-none-any.whl (18.1 kB)

File details

Details for the file nerva_sympy-1.0.0.tar.gz.

File metadata

  • Download URL: nerva_sympy-1.0.0.tar.gz
  • Size: 35.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for nerva_sympy-1.0.0.tar.gz
Algorithm Hash digest
SHA256 02abffd2156b322d2d46ffc6068e4b286fc51783ebd65808b1b0f30dac5c6223
MD5 57b93a0709aca13936c761404a0b193e
BLAKE2b-256 f0e3bbc8b1bb1aca560592bfe01063cc138398927cad16e4489c1a7a10e3208d


File details

Details for the file nerva_sympy-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: nerva_sympy-1.0.0-py3-none-any.whl
  • Size: 18.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for nerva_sympy-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 f890b6ee6a1ac34c13076f113c208c55ec8a9c74b93b5d162ac5b3ad5027737e
MD5 75946265c889463eec2659cf5e621c23
BLAKE2b-256 cd4d08943893e8b8744e8b0367517d36105f97d276b19aa928f793a64b7c7fd2

