Strange behavior with randomness #8
Comments
Me too. I could not reproduce the results on BA3: the ACC-AUC of ReFine is around 0.54-0.55, which is far below the number reported in the paper. I agree with you that the performance depends heavily on the choice of random seed.
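When results swing this much with the seed, one generic way to make comparisons meaningful is to report the mean and standard deviation over several seeds. A minimal sketch, where `fake_acc_auc` is a hypothetical stand-in (not part of this repo) for one full train-and-evaluate run:

```python
import random
import statistics

def fake_acc_auc(seed: int) -> float:
    # Hypothetical stand-in for one training run of refine_train.py;
    # in practice you would retrain and evaluate with this seed instead.
    random.seed(seed)
    return 0.52 + 0.04 * (random.random() - 0.5)

scores = [fake_acc_auc(s) for s in range(5)]
print(f"ACC-AUC over 5 seeds: "
      f"{statistics.mean(scores):.3f} +/- {statistics.stdev(scores):.3f}")
```

Reporting the spread this way makes it clear whether a gap between two reported numbers is within seed noise or a real difference.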
Note that by running [...] you only get partway; you can try to proceed the trained model to the fine-tuning phase by running [...]. Plus, did you re-train the GNN model? I found the results can also be sensitive to different GNN models, so it would be better to try a few different GNN models under different seeds.
On the first run, say right after cloning the repo, training with the command
python3 refine_train.py --dataset ba3 --hid 50 --epoch 1 --ratio 0.4 --lr 1e-4
always gives me an ACC-AUC of 0.518. On the second and all subsequent runs, the same command gives ACC-AUC 0.490. This happens with any number of epochs but is easiest to verify with just one. It seems something is not quite working with the random seed on the first run (although that first run is still seeded, since it produces the same 0.518 every time), and once training has run once, the seed starts working. I even cloned the repo several times, and this same pattern always happened.
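The first-run/second-run split described above is consistent with some one-time work, such as dataset preprocessing that gets cached to disk, also drawing from the seeded RNG and thereby shifting the random stream for everything after it. A minimal illustration using Python's stdlib `random`, where the `extra_setup` flag is a hypothetical stand-in for that one-time step:

```python
import random

def draws_after_setup(seed: int, extra_setup: bool) -> list:
    random.seed(seed)
    if extra_setup:
        # One-time work that consumes randomness (e.g. shuffling while
        # preprocessing the dataset) shifts the stream for later code.
        _ = random.random()
    # Stand-in for "training": the next few values from the seeded stream.
    return [random.random() for _ in range(3)]

first = draws_after_setup(42, extra_setup=True)    # fresh clone
later = draws_after_setup(42, extra_setup=False)   # preprocessing cached
again = draws_after_setup(42, extra_setup=False)

print(first == later)  # False: the extra draw shifted the stream
print(later == again)  # True: subsequent runs are mutually reproducible
```

This matches the observed pattern exactly: the first run is still deterministic (the same 0.518 every time on a fresh clone) but differs from all later runs, which agree with each other.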
I tried to fix this myself but couldn't. It isn't really critical to fix, but I think it's worth mentioning, as it caused quite a bit of confusion for me when testing the code.
Torch version is 1.8.0 because I couldn't get 1.6.0 to work with torch-scatter. Otherwise the setup is the same as in the README.