@lethienhoa
Yes, the norm_term handling in NLLLoss needs to be updated.
But I am also confused: why isn't the loss divided by norm_term before calling loss.backward()?
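To make the question concrete, here is a minimal sketch (not the library's actual code) of what dividing by norm_term before backward would look like in a PyTorch seq2seq setting. The names PAD_IDX and step_loss are hypothetical; the point is that the summed NLL is normalized by the count of non-padding tokens before the backward pass, so the gradient is a per-token average rather than a raw sum.

```python
import torch.nn.functional as F

PAD_IDX = 0  # hypothetical padding index

def step_loss(logits, targets):
    """logits: (batch, seq_len, vocab), targets: (batch, seq_len)."""
    mask = targets.ne(PAD_IDX)                       # True only for real tokens
    nll_sum = F.nll_loss(
        F.log_softmax(logits, dim=-1).view(-1, logits.size(-1)),
        targets.view(-1),
        ignore_index=PAD_IDX,                        # padding excluded from the sum
        reduction="sum",                             # accumulate; normalize ourselves
    )
    norm_term = mask.sum()                           # count of non-padding tokens
    return nll_sum / norm_term                       # divide before backward

# usage:
# loss = step_loss(logits, targets)
# loss.backward()
```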
Hi,
It seems that Perplexity is normalized twice, and the norm_term of NLLLoss should be masked out (i.e., padding tokens should not be counted) as well.
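As a sketch of what the report describes (an assumption about the intended behavior, not the library's implementation): perplexity should be exp(total NLL / number of non-padding tokens), normalized exactly once, and the token count (norm_term) should exclude padding positions. PAD_IDX and the function name below are hypothetical.

```python
import math
import torch.nn.functional as F

PAD_IDX = 0  # hypothetical padding index

def perplexity(logits, targets):
    """logits: (batch, seq_len, vocab), targets: (batch, seq_len)."""
    log_probs = F.log_softmax(logits, dim=-1)
    nll_sum = F.nll_loss(
        log_probs.view(-1, logits.size(-1)),
        targets.view(-1),
        ignore_index=PAD_IDX,
        reduction="sum",                 # summed NLL over real tokens only
    )
    # norm_term must be masked: counting padding positions would inflate the
    # denominator and underestimate the per-token loss (and the perplexity).
    norm_term = targets.ne(PAD_IDX).sum().item()
    avg_nll = nll_sum.item() / norm_term # normalize once; dividing an already
                                         # averaged loss again would be the
                                         # double normalization described above
    return math.exp(avg_nll)
```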