
reconstruction guidance mismatch with the paper #3

Open

XZWY opened this issue Oct 19, 2024 · 3 comments

@XZWY commented Oct 19, 2024

```python
norm = torch.linalg.norm(y - den_rec, dim=dim, ord=2)
rec_grads = torch.autograd.grad(outputs=norm, inputs=x)

rec_grads = rec_grads[0]

normguide = torch.linalg.norm(rec_grads) / x.shape[-1]**0.5

# normalize scaling
s = self.xi / (normguide * t_i + 1e-6)

# optionally apply a threshold to the gradients
if self.treshold_on_grads > 0:
    # apply thresholding to the gradients. It is a dirty trick but helps avoid bad artifacts
    rec_grads = torch.clip(rec_grads, min=-self.treshold_on_grads, max=self.treshold_on_grads)

score = (x_hat.detach() - x) / t_i**2
```

Inside the sampler, both the error norm and the gradient norm (`normguide`) are plain L2 norms, but in the paper they appear as squared norms (the power of the error and of the gradient), which seems to mismatch the paper. Is this on purpose?
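
For anyone reading along, here is a minimal self-contained sketch of what this step computes. The tensors and the values for `xi`, `t_i`, and the threshold are placeholders I made up, and the final `score - s * rec_grads` combination is my assumption, since the quoted snippet ends at the prior score:

```python
import torch

torch.manual_seed(0)

# Hypothetical stand-ins so the snippet runs on its own; in the repo these
# come from the sampler state and config.
x = torch.randn(1, 2048, requires_grad=True)  # current noisy sample
y = torch.randn(1, 2048)                      # degraded observation
t_i = 0.5                                     # current noise level
xi = 0.25                                     # guidance strength (self.xi)
threshold = 0.1                               # self.treshold_on_grads

# Stand-in denoiser: the repo uses a trained network to get x_hat, and
# den_rec is the degradation operator applied to x_hat.
x_hat = 0.9 * x                               # hypothetical denoised estimate
den_rec = x_hat                               # hypothetical reconstruction

# Unsquared L2 reconstruction error, summed to a scalar for autograd
norm = torch.linalg.norm(y - den_rec, dim=-1, ord=2).sum()
rec_grads = torch.autograd.grad(outputs=norm, inputs=x)[0]

# Scale the guidance by the RMS of the gradient and the noise level
normguide = torch.linalg.norm(rec_grads) / x.shape[-1]**0.5
s = xi / (normguide * t_i + 1e-6)

# Optional clipping of the gradients
if threshold > 0:
    rec_grads = torch.clip(rec_grads, min=-threshold, max=threshold)

# Prior score; combining it with the guidance as below is my assumption,
# not shown in the quoted snippet
score = (x_hat.detach() - x) / t_i**2
guided_score = score - s * rec_grads
```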

@eloimoliner (Owner) commented
Hi, I'm not sure I understand the issue.
Do you mean that the norm should be squared?
If so, note that the gradient of the squared norm is just a scalar multiple of the gradient of the norm, so after normalizing the gradients by the gradient norm, the squared and unsquared versions give equivalent normalized gradients.
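
To make the equivalence concrete, a quick toy check (hypothetical residual, not the repo's degradation operator):

```python
import torch

torch.manual_seed(0)
x = torch.randn(8, requires_grad=True)
y = torch.randn(8)

# Hypothetical residual that depends on x
e = y - 0.9 * x

# Gradient of the plain L2 norm vs. gradient of the squared L2 norm
g_norm = torch.autograd.grad(torch.linalg.norm(e, ord=2), x, retain_graph=True)[0]
g_sq = torch.autograd.grad((e**2).sum(), x)[0]

# After dividing each by its own norm, the directions coincide
print(torch.allclose(g_norm / g_norm.norm(), g_sq / g_sq.norm()))  # True
```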

@XZWY (Author) commented Oct 19, 2024

I see, thanks! Then I assume the paper should also normalize by the gradient norm $\|G\|_2$ instead of the squared norm $\|G\|_2^2$, right? As in the image below:
[Screenshot: the guidance-scale equation from the paper, normalizing by the squared gradient norm]
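
In equation form, the corrected scaling would read (my reconstruction from the code above, ignoring the $\sqrt{d}$ normalization and the $\epsilon$ stabilizer):

$$s = \frac{\xi}{\|G\|_2 \, t_i} \qquad \text{rather than} \qquad s = \frac{\xi}{\|G\|_2^2 \, t_i}$$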

@eloimoliner (Owner) commented

Oh yes, you are right. That looks like a typo.
