Validation of neural network equations in SymPy.
nerva-sympy
nerva-sympy provides symbolic validation of multilayer perceptron implementations using SymPy.
It is part of the Nerva project — a suite of Python and C++ libraries that provide well-specified, inspectable implementations of neural networks.
➡️ Unlike the other backends (nerva-torch, nerva-numpy, nerva-jax, nerva-tensorflow), which implement forward and backward passes, nerva-sympy is a testing library.
It derives exact symbolic gradients and checks them against the manually implemented backpropagation code.
🗺️ Overview
The nerva libraries aim to make neural networks mathematically precise and transparent.
While the implementation backends focus on execution, nerva-sympy ensures correctness:
- Provides symbolic derivatives of activation functions, layers, and loss functions.
- Validates handwritten backpropagation equations used in the other Nerva packages.
- Detects implementation errors early by comparing intermediate symbolic and numeric results.
- Avoids numerical gradient checking (finite differences) in favor of exact symbolic differentiation.
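As a small illustration of the last point, plain SymPy can produce an exact derivative and prove it equal to a hand-derived formula. The sketch below uses generic SymPy only, not the nerva-sympy API:

```python
# Exact symbolic differentiation of the sigmoid activation, checked
# against the well-known closed form sigma'(x) = sigma(x) * (1 - sigma(x)).
import sympy as sp

x = sp.Symbol('x', real=True)
sigma = 1 / (1 + sp.exp(-x))

derivative = sp.diff(sigma, x)
expected = sigma * (1 - sigma)

# simplify(a - b) == 0 proves the two expressions are identical,
# with no floating-point tolerance involved
assert sp.simplify(derivative - expected) == 0
```

This is exactly the kind of identity the nerva-sympy test suite checks, only in batch matrix form rather than for a single scalar.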
📦 Available Python Packages
Each backend has a dedicated PyPI package and GitHub repository:
| Package | Backend | PyPI | GitHub |
|---|---|---|---|
| nerva-jax | JAX | nerva-jax | repo |
| nerva-numpy | NumPy | nerva-numpy | repo |
| nerva-tensorflow | TensorFlow | nerva-tensorflow | repo |
| nerva-torch | PyTorch | nerva-torch | repo |
| nerva-sympy | SymPy | nerva-sympy | repo |
📝 nerva-sympy depends on the other four packages, since it validates their implementations.
See the nerva meta-repo for an overview of all Python and C++ variants.
🚀 Quick Start
Installation
The library can be installed in two ways: from the source repository or from the Python Package Index (PyPI).
```shell
# Install from the local repository
pip install .

# Install directly from PyPI
pip install nerva-sympy
```
Example: Validate Softmax Backpropagation
This example validates the gradient computation of the softmax layer.
The manually implemented backpropagation rules are checked against symbolic differentiation.
```python
# Backpropagation equations
DZ = hadamard(Y, DY - row_repeat(diag(Y.T * DY).T, K))
DW = DZ * X.T
Db = rows_sum(DZ)
DX = W.T * DZ

# Symbolic reference
DW1 = gradient(loss(Y), w)
Db1 = gradient(loss(Y), b)
DX1 = gradient(loss(Y), x)
DZ1 = gradient(loss(Y), z)

# Check equivalence
assert equal_matrices(DW, DW1)
assert equal_matrices(Db, Db1)
assert equal_matrices(DX, DX1)
assert equal_matrices(DZ, DZ1)
```
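The same softmax identity can be verified in plain SymPy for a single vector, which may help readers see what the batch equations compute. The sketch below is generic SymPy; names such as `manual` and `reference` are illustrative and not part of the nerva-sympy API:

```python
# Check the softmax backpropagation rule dL/dz_i = y_i * (c_i - y . c)
# against exact symbolic differentiation, for a 3-component vector.
import sympy as sp

K = 3
z = sp.symbols(f'z0:{K}', real=True)
c = sp.symbols(f'c0:{K}', real=True)       # plays the role of dL/dy

denom = sum(sp.exp(zi) for zi in z)
y = [sp.exp(zi) / denom for zi in z]       # softmax components

# A loss that is linear in y, so its gradient w.r.t. y is exactly c
L = sum(ci * yi for ci, yi in zip(c, y))

# Manually derived backpropagation rule
inner = sum(yi * ci for yi, ci in zip(y, c))
manual = [yi * (ci - inner) for yi, ci in zip(y, c)]

# Exact symbolic reference, differentiating the loss directly
reference = [sp.diff(L, zi) for zi in z]

assert all(sp.simplify(m - r) == 0 for m, r in zip(manual, reference))
```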
🧪 Running Tests
Controls for output and verbosity
- Individual test names (default): The helper script runs pytest with -v, so you see each test as it runs.
- Suppress internal prints from tests: By default the test utilities do not print intermediate matrices/numbers. Set NERVA_TEST_VERBOSE=1 to enable those prints when needed.
- Override pytest flags: Set NERVA_PYTEST_FLAGS to customize, e.g., NERVA_PYTEST_FLAGS="-ra -s" ./tests/run_all_tests.sh to also show print output from successful tests.
- Unittest fallback: Uses -b (buffered) and -v by default.
To run the test suite locally:
- Install dependencies (pytest is optional but recommended for nicer output):

  ```shell
  pip install -r requirements.txt
  pip install pytest  # optional
  ```
- Run all tests via the helper script (it adds src to PYTHONPATH automatically):

  ```shell
  ./tests/run_all_tests.sh
  ```
You can pass additional arguments to the underlying test runner, for example:
```shell
# Run only tests whose names match "jacobian"
./tests/run_all_tests.sh -k "jacobian"

# Run a specific test file, class, or test case (pytest syntax)
./tests/run_all_tests.sh tests/test_softmax_functions.py::TestSoftmax::test_softmax
```
If you prefer to run without the script or are on a platform without bash:
```shell
# Using pytest
python3 -m pytest -s tests

# Or using unittest discovery
python3 -m unittest discover -s tests -p "test_*.py" -v
```
🧪 Validation Suite
The test suite covers activation functions, layers, loss functions, and matrix operations.
Each test compares symbolic derivatives to the manually implemented backpropagation code.
Available tests:
- Activation Functions: test_activation_functions.py, test_softmax_functions.py, test_softmax_function_derivations.py. Validates symbolic gradients for Sigmoid, ReLU, SReLU, etc.
- Layer Derivatives: test_layer_linear.py, test_layer_softmax.py, test_layer_batch_normalization.py, test_layer_dropout.py, test_layer_srelu.py, test_layer_derivations.py. Checks symbolic vs. manual backpropagation for individual layers.
- Loss Functions: test_loss_functions.py, test_loss_function_derivations.py. Ensures correct gradient formulas for common loss functions.
- Supporting Operations: test_matrix_operations.py, test_one_hot.py, test_derivatives.py, test_lemmas.py. Validates core symbolic building blocks.
Note on the cross-framework consistency test (tests/test_frameworks.py):
- The test compares intermediates (X, Y, DY, T) across JAX, NumPy, TensorFlow and PyTorch while performing multiple SGD steps.
- Currently, the test uses the AbsoluteError loss, which is not differentiable at 0. This sometimes causes the test to fail, even when all implementations are correct.
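The kink at zero is easy to exhibit in plain SymPy (a generic illustration, not nerva-sympy code): the derivative of |e| is sign(e), whose one-sided limits at 0 disagree.

```python
# Why AbsoluteError can trip a gradient comparison: |e| has no
# derivative at e = 0, so implementations may legitimately differ there.
import sympy as sp

e = sp.Symbol('e', real=True)
d = sp.diff(sp.Abs(e), e)              # sign(e) for a real symbol

assert sp.simplify(d - sp.sign(e)) == 0
assert sp.limit(d, e, 0, '+') == 1     # right derivative of |e| at 0
assert sp.limit(d, e, 0, '-') == -1    # left derivative disagrees
```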
🔢 Implementation Philosophy
Unlike frameworks that rely on autograd or approximate numerical checks:
- nerva-sympy uses exact symbolic differentiation. This avoids floating-point instability and provides rigorous correctness guarantees.
- All computations are expressed in batch matrix form, consistent with the other nerva libraries.
- The test suite acts as a ground truth oracle for verifying implementations in JAX, NumPy, PyTorch, and TensorFlow.
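A minimal sketch of the contrast, in plain SymPy: a central finite difference carries truncation error, while the symbolic derivative is exact as an expression.

```python
# Finite differences approximate; symbolic differentiation is exact.
# Illustrated on f(x) = exp(x) at x = 1.
import sympy as sp

x = sp.Symbol('x')
f = sp.exp(x)

exact = sp.diff(f, x).subs(x, 1)                        # E, exactly
h = sp.Rational(1, 10**6)
approx = (f.subs(x, 1 + h) - f.subs(x, 1 - h)) / (2*h)  # central difference

# The numeric estimate carries O(h^2) truncation error
error = abs(sp.N(approx - exact, 30))
assert error > 0

# The symbolic identity f' = f holds exactly, with no tolerance
assert sp.simplify(sp.diff(f, x) - f) == 0
```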
📚 Relevant Papers
- Nerva: a Truly Sparse Implementation of Neural Networks (arXiv:2407.17437). Introduces the library and reports sparse training experiments.
- Batch Matrix-form Equations and Implementation of Multilayer Perceptrons (arXiv:2511.11918). Includes mathematical specifications and derivations.
📜 License
Distributed under the Boost Software License 1.0.
🙋 Contributing
Bug reports and contributions are welcome via the GitHub issue tracker.
File details
Details for the file nerva_sympy-1.0.0.tar.gz.
File metadata
- Download URL: nerva_sympy-1.0.0.tar.gz
- Upload date:
- Size: 35.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 02abffd2156b322d2d46ffc6068e4b286fc51783ebd65808b1b0f30dac5c6223 |
| MD5 | 57b93a0709aca13936c761404a0b193e |
| BLAKE2b-256 | f0e3bbc8b1bb1aca560592bfe01063cc138398927cad16e4489c1a7a10e3208d |
File details
Details for the file nerva_sympy-1.0.0-py3-none-any.whl.
File metadata
- Download URL: nerva_sympy-1.0.0-py3-none-any.whl
- Upload date:
- Size: 18.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | f890b6ee6a1ac34c13076f113c208c55ec8a9c74b93b5d162ac5b3ad5027737e |
| MD5 | 75946265c889463eec2659cf5e621c23 |
| BLAKE2b-256 | cd4d08943893e8b8744e8b0367517d36105f97d276b19aa928f793a64b7c7fd2 |