Fixed typo. #197

Open · wants to merge 2 commits into master
4 changes: 2 additions & 2 deletions docs/Module3_IntroducingNumpy/AutoDiff.html
@@ -439,7 +439,7 @@ <h2>Introduction to MyGrad<a class="headerlink" href="#Introduction-to-MyGrad" t
</pre></div>
</div>
<p>As expected, MyGrad computes the appropriate value for the evaluated derivative: <span class="math notranslate nohighlight">\(\frac{\mathrm{d}f}{\mathrm{d}x}\big|_{x=5}=2 \times 5=10\)</span>. Note that all <code class="docutils literal notranslate"><span class="pre">Tensor</span></code> instances have a <code class="docutils literal notranslate"><span class="pre">grad</span></code> attribute, but prior to invoking <code class="docutils literal notranslate"><span class="pre">fx.backward()</span></code>, <code class="docutils literal notranslate"><span class="pre">x.grad</span></code> would have simply returned <code class="docutils literal notranslate"><span class="pre">None</span></code>.</p>
-<p>It is important to reiterate that MyGrad <em>never gives us the actual function</em> <span class="math notranslate nohighlight">\(\frac{\mathrm{d}f}{\mathrm{d}x}\)</span>; it only computes the derivative evaluated at a specific input <span class="math notranslate nohighlight">\(x=10\)</span>.</p>
+<p>It is important to reiterate that MyGrad <em>never gives us the actual function</em> <span class="math notranslate nohighlight">\(\frac{\mathrm{d}f}{\mathrm{d}x}\)</span>; it only computes the derivative evaluated at a specific input <span class="math notranslate nohighlight">\(x=5\)</span>.</p>
<div class="section" id="MyGrad-Adds-“Drop-In”-AutoDiff-to-NumPy">
<h3>MyGrad Adds “Drop-In” AutoDiff to NumPy<a class="headerlink" href="#MyGrad-Adds-“Drop-In”-AutoDiff-to-NumPy" title="Permalink to this headline"></a></h3>
<p>MyGrad’s functions are intentionally designed to mirror NumPy’s functions almost exactly. In fact, for all of the NumPy functions that MyGrad mirrors, we can pass a tensor to a NumPy function and it will be “coerced” into returning a tensor instead of a NumPy array – thus we can differentiate through NumPy functions!</p>
@@ -840,4 +840,4 @@ <h2>Reading Comprehension Exercise Solutions<a class="headerlink" href="#Reading
</script>

</body>
-</html>
+</html>
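For context on the sentence being corrected, here is a minimal sketch (assuming MyGrad's documented `tensor`, `backward()`, and `grad` interface) of the computation the prose describes: the derivative of f(x) = x**2 is only ever evaluated at a specific input, here x = 5, which is why the evaluated value is 2 * 5 = 10.

```python
import mygrad as mg  # assumes MyGrad >= 2.0, where the mg.tensor constructor is available

x = mg.tensor(5.0)   # the specific input at which the derivative will be evaluated
fx = x ** 2          # f(x) = x**2; MyGrad records the operations for backprop

print(x.grad)        # None -- the .grad attribute exists but is unset before backward()

fx.backward()        # back-propagate df/dx through the recorded operations
print(x.grad)        # 10.0, i.e. 2 * 5 -- the derivative evaluated at x=5
```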
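Similarly, the "drop-in" behavior mentioned in the same hunk can be illustrated with a short, hedged sketch: per MyGrad's documentation, passing a Tensor to a NumPy function that MyGrad mirrors is expected to return a Tensor, so the NumPy call itself can be differentiated.

```python
import numpy as np
import mygrad as mg

x = mg.tensor(3.0)
out = np.square(x)   # a NumPy function, but `out` comes back as a MyGrad Tensor
out.backward()       # differentiate through the NumPy call
print(x.grad)        # 6.0, i.e. 2 * 3 -- the derivative of x**2 evaluated at x=3
```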
2 changes: 1 addition & 1 deletion docs/Module3_IntroducingNumpy/AutoDiff.ipynb
@@ -142,7 +142,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"It is important to reiterate that MyGrad *never gives us the actual function* $\\frac{\\mathrm{d}f}{\\mathrm{d}x}$; it only computes the derivative evaluated at a specific input $x=10$."
"It is important to reiterate that MyGrad *never gives us the actual function* $\\frac{\\mathrm{d}f}{\\mathrm{d}x}$; it only computes the derivative evaluated at a specific input $x=5$."
]
},
{