Thank you very much for providing the open-source project. However, I encountered some confusion while using your code and would appreciate your help.
First, I ran the command “CUDA_VISIBLE_DEVICES=0 python model_train.py --dataset cifar10 --network resnet18 --phase train” to train a clean model, and the accuracy was normal. Then, I ran the command “CUDA_VISIBLE_DEVICES=0 python eval_linearity.py --attack clean --dataset cifar10 --network resnet18” to evaluate the linearity of the clean model. However, the result shows a linearity score of 0.9911737153566863, while the result in Table 4 of the original paper is 0.47. This discrepancy is quite significant, and I am unsure about the cause.
I look forward to your reply and greatly appreciate your help.
Hi @sincere0909, thank you for your interest in our work. You can play with the `_k` and `linear_inputs` parameters. The linearity score of a clean model can fluctuate more and take lower values, whereas backdoored models yield consistently stronger linearity.
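For example, a small sweep over these two parameters can show how much the clean-model score fluctuates across settings. The sketch below is only an illustration: the `--_k` and `--linear_inputs` flag names and the value ranges are assumptions, so please adapt them to however `eval_linearity.py` actually exposes these parameters.

```python
# Hypothetical parameter sweep for eval_linearity.py.
# The --_k / --linear_inputs flag names and the value ranges are assumptions;
# adjust them to match the script's actual argparse options.
import itertools
import subprocess

for k, n_inputs in itertools.product([2, 4, 8], [50, 100, 200]):
    cmd = [
        "python", "eval_linearity.py",
        "--attack", "clean",
        "--dataset", "cifar10",
        "--network", "resnet18",
        "--_k", str(k),                    # hypothetical flag name
        "--linear_inputs", str(n_inputs),  # hypothetical flag name
    ]
    print("Running:", " ".join(cmd))
    result = subprocess.run(cmd, capture_output=True, text=True)
    # Print the last line of output, where the linearity score would be reported.
    out = result.stdout.strip() or result.stderr.strip()
    if out:
        print(out.splitlines()[-1])
```

Averaging the reported scores over several settings (or several random seeds) should give a better picture of the clean model's behavior than a single run.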