MicroGrad


A tiny Autograd engine (with a bite! :D). Implements backpropagation over a dynamically built DAG and a small neural network library on top of it with a PyTorch-like API. Both are currently about 50 lines of code each.
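
Since the engine itself is not shown on this page, here is a minimal, self-contained sketch of what a scalar autograd value with backprop over a dynamically built DAG can look like, supporting just the +, * and relu operations described below. The names (`Value`, `backward`, `grad`) mirror the PyTorch-like API the README describes, but the details are illustrative assumptions, not this repo's actual code.

```python
# Illustrative sketch only; the repo's real engine may differ in details.

class Value:
    """A scalar node in a dynamically built computation DAG."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # closure that propagates grad to children
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad        # d(a+b)/da = 1
            other.grad += out.grad       # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def relu(self):
        out = Value(self.data if self.data > 0 else 0.0, (self,))
        def _backward():
            self.grad += (out.data > 0) * out.grad   # gradient flows only where input > 0
        out._backward = _backward
        return out

    def backward(self):
        # topologically order the DAG, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()


# Usage: the forward pass builds the DAG, backward() fills in gradients.
a, b = Value(2.0), Value(3.0)
c = (a * b + a).relu()   # relu(2*3 + 2) = 8
c.backward()
print(c.data, a.grad, b.grad)   # 8.0 4.0 2.0
```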

The amusing part is that the DAG only operates over individual scalar values, so e.g. each neuron gets chopped up into all of its individual tiny adds and multiplies. In particular, the current library only supports scalars and three operations over them (+, *, relu), but these are actually enough to build up an entire deep neural net doing binary classification, as the demo notebook shows. It's just a lot of nodes :)
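
To make "chopping up each neuron into its individual adds and multiplies" concrete, here is a hypothetical neuron built on the `Value` sketch above. The class name and structure are assumptions for illustration; the repo's actual neural net module may organize this differently.

```python
import random

class Neuron:
    """One neuron expressed entirely as scalar Value adds and multiplies."""

    def __init__(self, nin):
        self.w = [Value(random.uniform(-1, 1)) for _ in range(nin)]
        self.b = Value(0.0)

    def __call__(self, x):
        # w . x + b, built node by node in the scalar DAG
        act = self.b
        for wi, xi in zip(self.w, x):
            act = act + wi * xi
        return act.relu()

n = Neuron(3)
out = n([Value(1.0), Value(-2.0), Value(3.0)])
out.backward()   # gradients are now populated on every weight, bias and input
```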

Potentially useful for educational purposes. See the notebook for a demo.
