I have a question about the function above. When computing the final loss, partitions of different samples are multiplied together: through masking, each row is restricted so that the jth partition of the ith sample is compared against the partitions of the other samples, and the final loss is obtained from these products. In contrastive learning, the usual objective has the form exp(z_i·z_j) / Σ_k exp(z_i·z_k), where the numerator is the similarity between different partitions of the same sample and the denominator sums the similarities with partitions of other samples; the goal is to make the numerator large and the denominator small. In the function above, however, only the denominator seems to be present and the numerator is missing. Am I misunderstanding something, or is there a problem?
The loss is computed for each sample i.
def self_supervised_contrastive_loss(self, features):
    '''Compute the self-supervised VPCL loss.'''
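To make the question concrete, here is a minimal NumPy sketch of a self-supervised vertical-partition contrastive (VPCL) loss. This is not the repository's actual implementation; it assumes `features` has shape `(batch, n_partitions, dim)` and that partitions of the same sample are positives while partitions of other samples are negatives. The function name `vpcl_self_supervised_loss` and the `temperature` value are illustrative choices. The point is to show where the numerator lives: it is the masked-in positive entries of the log-softmax, not a separate term.

```python
import numpy as np

def vpcl_self_supervised_loss(features, temperature=0.1):
    """Illustrative VPCL-style InfoNCE loss (not the repo's exact code).

    features: (batch, n_partitions, dim) -- one embedding per vertical
    partition of each sample.  Partitions of the same sample are the
    positive pairs; everything else in the batch is a negative.
    """
    b, n, d = features.shape
    z = features.reshape(b * n, d)
    # Normalize so the dot product is cosine similarity.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    logits = z @ z.T / temperature                  # (b*n, b*n) similarities

    # Which sample each row belongs to: [0,0,...,1,1,...].
    sample_id = np.repeat(np.arange(b), n)
    pos_mask = sample_id[:, None] == sample_id[None, :]
    np.fill_diagonal(pos_mask, False)               # a view is not its own positive

    # Denominator: log-softmax over all other views (self excluded).
    logits_no_self = logits.copy()
    np.fill_diagonal(logits_no_self, -np.inf)       # exp(-inf) = 0 drops self terms
    log_prob = logits_no_self - np.log(
        np.exp(logits_no_self).sum(axis=1, keepdims=True))

    # Numerator: the positive (same-sample) entries, selected by the mask.
    return -log_prob[pos_mask].mean()
```

Note that `log_prob[pos_mask]` is exactly the log of exp(z_i·z_j) / Σ_k exp(z_i·z_k) for each positive pair (i, j): the masking selects the numerator terms, while the softmax normalization supplies the denominator, so both parts of the usual contrastive objective are present even though no explicit numerator variable appears.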