A tiny Autograd engine whose only dependency is NumPy. It implements backpropagation (reverse-mode autodiff) over a dynamically built DAG, plus a small neural network library on top of it with a PyTorch-like API. Both are tiny.
- 24 KB of code, excluding tests
- as portable as Python and NumPy
- imports 10x faster than PyTorch and 30x faster than TensorFlow
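
For intuition, reverse-mode autodiff over a dynamically built DAG is typically implemented by having each operation record its parent nodes and a closure that applies the chain rule; `backward()` then visits nodes in reverse topological order. Below is a minimal self-contained sketch in NumPy. It is illustrative only, not this library's actual code: the `Tensor` name and its methods are assumptions.

```python
import numpy as np

class Tensor:
    def __init__(self, data, _parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self._parents = _parents          # nodes this one was computed from
        self._backward = lambda: None     # closure that propagates grad to parents

    def __matmul__(self, other):
        out = Tensor(self.data @ other.data, (self, other))
        def _backward():
            # chain rule for matrix multiplication
            self.grad += out.grad @ other.data.T
            other.grad += self.data.T @ out.grad
        out._backward = _backward
        return out

    def sum(self):
        out = Tensor(self.data.sum(), (self,))
        def _backward():
            # gradient of a sum broadcasts back to the input's shape
            self.grad += out.grad * np.ones_like(self.data)
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the DAG, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = np.ones_like(self.data)   # d(self)/d(self) = 1
        for node in reversed(topo):
            node._backward()
```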
This version works with matrices and higher-order tensors. For @karpathy's original scalar-valued version, check out the code at the `scalar` tag.
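
Using the sketch above as a stand-in for the real API (whose names may differ), a matrix-valued forward and backward pass looks like:

```python
import numpy as np

x = Tensor(np.random.randn(3, 4))   # Tensor as sketched above; the
w = Tensor(np.random.randn(4, 2))   # library's real class may differ
loss = (x @ w).sum()                # forward pass records the DAG
loss.backward()                     # reverse-mode pass fills .grad
print(w.grad.shape)                 # (4, 2): dloss/dw matches w's shape
```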