Releases: chris-dare/mytorch

Autograd set up

01 Oct 08:18
Pre-release

In this release, I have implemented autograd. This allows any function I write for mytorch to be automatically differentiated during backpropagation. The neural network is built as a computation graph (a DAG), and complex functions are composed from simple ones to do some really serious stuff.
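The idea can be sketched with a minimal scalar example: each operation records its inputs in a DAG, and a reverse topological walk applies the chain rule. This is an illustrative sketch only; the `Value` class and `_backward` hook here are assumed names, not mytorch's actual API.

```python
# Minimal reverse-mode autograd sketch (illustrative; not mytorch's real API).

class Value:
    """A node in the computation graph holding data and its gradient."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # propagates out.grad to parents

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the DAG, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

a = Value(2.0)
b = Value(3.0)
c = a * b + a   # c = a*b + a
c.backward()
print(a.grad)   # dc/da = b + 1 = 4.0
print(b.grad)   # dc/db = a = 2.0
```

Because gradients accumulate with `+=`, a value used in two places (like `a` above) correctly sums the contributions from both paths through the graph.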