Releases: chris-dare/mytorch
Autograd set up
In this release, I have implemented autograd. This allows any function I write for mytorch to be automatically differentiated during backpropagation. The neural network is built as a computation graph (a DAG), and complex functions are composed from simple ones to do some really serious stuff.
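As a rough illustration of the idea (this is a minimal sketch, not mytorch's actual code, and the `Value` class and its methods are hypothetical names), each operation can record its inputs plus a local gradient rule, so the results form a DAG that `backward()` walks in reverse topological order, applying the chain rule:

```python
class Value:
    """Hypothetical scalar node in a computation graph (illustrative only)."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # edges of the DAG
        self._backward = lambda: None    # local chain-rule step

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            self.grad += out.grad        # d(a+b)/da = 1
            other.grad += out.grad       # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the DAG, then run each node's local
        # backward rule in reverse order (backpropagation).
        order, seen = [], set()

        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)

        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            node._backward()


x = Value(3.0)
y = Value(4.0)
z = x * y + x       # z = x*y + x, built from simple ops
z.backward()
print(x.grad)       # dz/dx = y + 1 = 5.0
print(y.grad)       # dz/dy = x = 3.0
```

Because every composite function is built from ops that each know their own local derivative, the whole graph differentiates itself without any hand-written gradient code.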