diff --git a/docs/Module3_IntroducingNumpy/AutoDiff.html b/docs/Module3_IntroducingNumpy/AutoDiff.html
index 85c620ab..4c946b90
--- a/docs/Module3_IntroducingNumpy/AutoDiff.html
+++ b/docs/Module3_IntroducingNumpy/AutoDiff.html
@@ -439,7 +439,7 @@
As expected, MyGrad computes the appropriate value for the evaluated derivative: \(\frac{\mathrm{d}f}{\mathrm{d}x}\big|_{x=5}=2 \times 5=10\). Note that all Tensor instances have a grad attribute, but prior to invoking fx.backward(), x.grad would have simply returned None.
-It is important to reiterate that MyGrad never gives us the actual function \(\frac{\mathrm{d}f}{\mathrm{d}x}\); it only computes the derivative evaluated at a specific input \(x=10\).
+It is important to reiterate that MyGrad never gives us the actual function \(\frac{\mathrm{d}f}{\mathrm{d}x}\); it only computes the derivative evaluated at a specific input \(x=5\).
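To make the workflow described above concrete, here is a minimal sketch of this sort of computation. It assumes that the function under discussion is \(f(x) = x^2\) and that the tensor-creation function is mg.tensor; neither detail is spelled out in this hunk.

```python
# Minimal sketch, assuming f(x) = x**2 and mygrad's mg.tensor constructor
import mygrad as mg

x = mg.tensor(5.0)   # the point at which we will evaluate the derivative
fx = x ** 2          # f(x) = x**2; fx is a mygrad Tensor

print(x.grad)        # None -- backward() has not been invoked yet

fx.backward()        # back-propagate from fx
print(x.grad)        # 10.0, i.e. df/dx evaluated at x = 5
```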
MyGrad’s functions are intentionally designed to mirror NumPy’s functions almost exactly. In fact, for all of the NumPy functions that MyGrad mirrors, we can pass a tensor to a NumPy function and it will be “coerced” into returning a tensor instead of a NumPy array – thus we can differentiate through NumPy functions!
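A small sketch of what this coercion could look like in practice follows; it assumes that np.square and np.sum are among the NumPy functions that MyGrad mirrors, which is not stated explicitly in this hunk.

```python
# Sketch of differentiating "through" NumPy functions, assuming np.square
# and np.sum are among the NumPy functions mirrored by MyGrad.
import numpy as np
import mygrad as mg

x = mg.tensor([1.0, 2.0, 3.0])
out = np.sum(np.square(x))  # the NumPy calls return a mygrad Tensor here
out.backward()
print(x.grad)               # d(sum(x**2))/dx = 2*x -> [2. 4. 6.]
```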
@@ -840,4 +840,4 @@