Fix missing sign in cross-entropy & NLL
andreasgrv committed Dec 16, 2024
1 parent 8919354 commit a54e356
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions notebooks/generative-vs-discriminative-circuit.ipynb
@@ -268,7 +268,7 @@
"The discriminative training loss is the **cross-entropy** between the target distribution, $p^*$, and our predicted distribution, $p$:\n",
"$$\n",
"\\begin{align}\n",
"\\mathcal{L}_{dis}(\\theta) &= \\frac{1}{N} \\sum_{i=1}^N \\underbrace{\\sum_{y' \\in \\{0,\\ldots,9\\}} p^*(y' \\mid \\mathbf{x}^{(i)}) \\log p(y' \\mid \\mathbf{x}^{(i)})}_{\\text{cross-entropy}}\n",
"\\mathcal{L}_{dis}(\\theta) &= -\\frac{1}{N} \\sum_{i=1}^N \\underbrace{\\sum_{y' \\in \\{0,\\ldots,9\\}} p^*(y' \\mid \\mathbf{x}^{(i)}) \\log p(y' \\mid \\mathbf{x}^{(i)})}_{\\text{cross-entropy}}\n",
"\\end{align}\n",
"$$\n",
"\n",
@@ -286,9 +286,9 @@
"\n",
"$$\n",
"\\begin{align}\n",
"\\mathcal{L}_{dis}(\\theta) &= \\frac{1}{N} \\sum_{i=1}^N \\sum_{y' \\in \\{0,\\ldots,9\\}} p^*(y' \\mid \\mathbf{x}^{(i)}) \\log p(y' \\mid \\mathbf{x}^{(i)}) & \\\\\n",
" &= \\frac{1}{N}\\sum_{i=1}^N \\left( 1 \\log p(y^{(i)} \\mid \\mathbf{x}^{(i)}) + \\sum_{y' \\neq y^{(i)}} 0 \\log p(y' \\mid \\mathbf{x}^{(i)}) \\right) & \\text{$p^*$ is one-hot} \\\\\n",
" &= \\frac{1}{N} \\sum_{i=1}^N \\log p(y^{(i)} \\mid \\mathbf{x}^{(i)}) & \\text{negative log-likelihood}\n",
"\\mathcal{L}_{dis}(\\theta) &= -\\frac{1}{N} \\sum_{i=1}^N \\sum_{y' \\in \\{0,\\ldots,9\\}} p^*(y' \\mid \\mathbf{x}^{(i)}) \\log p(y' \\mid \\mathbf{x}^{(i)}) & \\\\\n",
" &= - \\frac{1}{N}\\sum_{i=1}^N \\left( 1 \\log p(y^{(i)} \\mid \\mathbf{x}^{(i)}) + \\sum_{y' \\neq y^{(i)}} 0 \\log p(y' \\mid \\mathbf{x}^{(i)}) \\right) & \\text{$p^*$ is one-hot} \\\\\n",
" &= - \\frac{1}{N} \\sum_{i=1}^N \\log p(y^{(i)} \\mid \\mathbf{x}^{(i)}) & \\text{negative log-likelihood}\n",
"\\end{align}\n",
"$$\n",
"\n",
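The sign fix in this diff can be checked numerically: with a one-hot target distribution, the (negative) cross-entropy reduces to the negative log-likelihood, and both come out positive as a loss should. A minimal sketch, where the predicted distributions `p` and labels `y` are toy values made up for illustration (not from the notebook):

```python
import math

# Toy predicted distributions p(y' | x_i) over 3 classes for N = 2 examples.
p = [
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
]
y = [0, 1]  # gold labels, so p* is one-hot at these indices
N = len(p)

# Full cross-entropy with the leading minus sign restored by this commit:
# -1/N * sum_i sum_{y'} p*(y' | x_i) log p(y' | x_i)
p_star = [[1.0 if j == y[i] else 0.0 for j in range(3)] for i in range(N)]
cross_entropy = -sum(
    p_star[i][j] * math.log(p[i][j]) for i in range(N) for j in range(3)
) / N

# Negative log-likelihood: -1/N * sum_i log p(y_i | x_i)
nll = -sum(math.log(p[i][y[i]]) for i in range(N)) / N

# The two agree, and the loss is positive (it would be negative without the fix).
print(cross_entropy, nll)
```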
