Lightweight Automatic Differentiation in Python.
Project description
pygrad: A lightweight differentiation engine written in Python.
Documentation: https://baubels.github.io/pygrad/.
This is a lightweight (<300kB) automatic differentiation engine built on NumPy, Numba, and opt_einsum. Included are a differentiable Tensor class, layers such as Dropout/Linear/Attention, loss functions such as BCE/CCE, optimizers such as SGD/RMSProp/Adam, and example DNN/CNN/Transformer architectures. This library is a good choice if you want to run backpropagation on small, simple functions or networks without much overhead.
The main component is the Tensor class, which supports many math operations. Tensors have .value and .grad attributes; gradients are populated by calling .backward() on a Tensor or on any Tensor derived from it. They can be used standalone, or for constructing more complex architectures such as a vanilla Transformer.
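For instance, calling .backward() on a derived Tensor populates .grad on every Tensor it was built from. A minimal sketch, using only the operators shown in the Usage section below (the values here are arbitrary):

from pygrad.tensor import Tensor

w = Tensor(2.0)            # a standalone "weight" Tensor
x = Tensor(3.0)            # a standalone "input" Tensor
loss = (w + x - 1)**2      # loss is a child of both w and x
loss.backward()            # populates .grad on w and x as well
w.grad, x.grad             # both 8.0, since d(loss)/dw = d(loss)/dx = 2*(w + x - 1)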
Installation
pip install pygradproject
# OR
git clone https://github.com/baubels/pygrad.git
cd pygrad
pip install .    # or: pip install ".[examples]" or pip install ".[dev]"
Usage
Tensors accept the same input values as NumPy arrays. Create them with Tensor(value) or tensor.array(value).
Run backprop on them with .backward().
A simple usage example:
from pygrad.tensor import Tensor
x = Tensor(1)
(((x**3 + x**2 + x + 1) - 1)**2).backward()
x.value, x.grad # 1.0, 36.0
Since Tensors store their value in .value and their gradient in .grad, it's easy to perform gradient descent.
for _ in range(100):
    (((x**3 + x**2 + x + 1) - 1)**2).backward() # gradients are automatically reset each time .backward() is called
    x.value = x.value - 0.01*x.grad
Tensors can also be combined with broadcast-compatible NumPy arrays, or with other Tensors whose values are broadcast-compatible. Internally, a Tensor always casts its value to a NumPy array.
import numpy as np
x = Tensor(np.ones((10,20)))
y = Tensor(np.ones((20,10)))
z1 = x@y
z2 = x@np.ones((20,10))
np.all(z1.value == z2.value) # True
There are enough operations defined to build many different kinds of models. For example usage and in-depth descriptions of each component of pygrad, check out the docs.
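As a rough, illustrative sketch (not taken from the docs; the data, shapes, and learning rate are made up, and it assumes .backward() on a (1,1)-shaped result behaves like the scalar examples above), a tiny linear model can be fit with nothing more than the Tensor operations shown so far:

import numpy as np
from pygrad.tensor import Tensor

# toy data generated by y = 2*x0 - 1*x1 (hypothetical values, for illustration only)
X = np.array([[1.0, 2.0], [2.0, 0.0], [0.0, 1.0]])
Y = np.array([[0.0], [4.0], [-1.0]])

w = Tensor(np.zeros((2, 1)))                 # weights to be learned

for _ in range(200):
    for i in range(len(X)):
        pred = Tensor(X[i:i+1]) @ w          # (1,2) @ (2,1) -> (1,1)
        loss = (pred - Y[i:i+1])**2          # squared error on one sample
        loss.backward()                      # populates w.grad (grads reset on each call)
        w.value = w.value - 0.05*w.grad      # plain per-sample gradient descent

w.value                                      # should approach [[2.], [-1.]]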
Citation/Contribution
If you find this project helpful in your research or work, I kindly ask that you cite it: View Citation. Thank you!
If there are issues with the project, please submit an issue. Otherwise, please read the current status for contributors.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file pygradproject-0.0.3.tar.gz.
File metadata
- Download URL: pygradproject-0.0.3.tar.gz
- Upload date:
- Size: 30.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.11.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 29b0d54d9b22edc252508ddc90fd38ca0fa7823ef9d0cfb8cf801cd5e4a1a5e2 |
| MD5 | def2db0a397fdce873e52696def4e7fe |
| BLAKE2b-256 | d17ab35dd7261906174ab31f45e5cd8092f8f0a97f5d658fb47c45fa77b64b58 |
File details
Details for the file pygradproject-0.0.3-py3-none-any.whl.
File metadata
- Download URL: pygradproject-0.0.3-py3-none-any.whl
- Upload date:
- Size: 25.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.11.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | f680978359c199f039c187abb421ba8623bb464c031ece773163a687f4d6b393 |
| MD5 | 967a32e0002f8a51c2cd7a95068ba88c |
| BLAKE2b-256 | 7bf0ed0acb98d0fed08cbf745b850d5b5799a44d8e19c16acdc2dedf3172166e |