Great work!
I find it works well when X and Y each have their own encoder, but for some reason I have to use the following setting:
X and Y have the same shape, X_i and Y_i form a positive pair, X_i and all the other Y_j are negative pairs, and X and Y are fed into the same network (as below). The loss function is contrastive_loss.
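To make the setting concrete, here is a minimal sketch of what I mean, assuming a standard in-batch contrastive (InfoNCE-style) loss; the encoder and `contrastive_loss` here are placeholders, not the actual code from my project:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder shared encoder; the real architecture is different.
encoder = nn.Linear(16, 8)

def contrastive_loss(x_reps, y_reps, temperature=0.05):
    # (X_i, Y_i) is the positive pair; all other Y_j in the batch are negatives.
    logits = x_reps @ y_reps.t() / temperature
    targets = torch.arange(x_reps.size(0))
    return F.cross_entropy(logits, targets)

X = torch.randn(4, 16)
Y = torch.randn(4, 16)  # same shape as X

# Both views go through the SAME network (shared weights).
loss = contrastive_loss(encoder(X), encoder(Y))
loss.backward()
```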
The example you give looks like:
How could I use GradCache in this setting? Should I store two RandContext objects, one for X and one for Y?
Looking forward to your help.
Thanks a lot~