SoccerNet models #2
Hi, thank you for the question! All of the training parameters are in the supplemental material PDF (available on the website). We will consider uploading weights, but we initially refrained from doing so to avoid becoming another feature that gets concatenated with the other commonly used features. The scripts to pre-process the SoccerNet dataset are in the repo.
Hey @jhong93, I'm also interested in the trained SoccerNet model, if you don't mind. I trained according to the parameters mentioned in the paper but can only reach around 57 mAP on the loose metric and 41 mAP on the tight metric with the 200MF RegNet model. I wonder what I did wrong. I hope you don't mind that I've emailed you at cs.stanford.edu to request the trained weights, if it's not inconvenient for you! Thank you for open-sourcing the repo, btw!
Hi, I've updated the repository with a few of the SoccerNet models; hopefully the config information is helpful. Otherwise, it could be a dataset pre-processing issue.
Many thanks for the weights, it's super helpful in diagnosing the issue!
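When the released checkpoints ship with config information, a quick diff against your own run's config can surface the kind of mismatch being debugged here. A minimal sketch, assuming the configs are plain dicts; the keys below are illustrative, not the repo's actual config schema:

```python
# Hypothetical config comparison: the key names here are made up for
# illustration and are not the repo's actual config keys.
released_cfg = {"feature_arch": "rny008_gsm", "temporal_arch": "gru",
                "clip_len": 100, "crop_dim": 224, "modality": "rgb"}
my_cfg = {"feature_arch": "rny008_gsm", "temporal_arch": "gru",
          "clip_len": 50, "crop_dim": 224, "modality": "rgb"}

# Collect every key where the two runs disagree.
diffs = {k: (released_cfg[k], my_cfg.get(k))
         for k in released_cfg if my_cfg.get(k) != released_cfg[k]}
print(diffs)  # -> {'clip_len': (100, 50)}
```

Any non-empty diff (here, a differing clip length) is a candidate explanation for a gap in mAP before suspecting the data pipeline.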
Hi @jhong93, I am currently reproducing your model on the SoccerNetV2 dataset, but I did not get the results reported in the paper. In particular, on the test set with the model soccer_rny008gsm_gru_rgb, the result (w/o NMS) is:

```json
{
  "test_split": {
    "Average-mAP (loose)": 51.710384220920616,
    "Shown only (loose)": 57.345404094772704,
    "Unshown only (loose)": 22.07061351534247,
    "Average-mAP (tight)": 45.13071156736224,
    "Shown only (tight)": 50.76184780989905,
    "Unshown only (tight)": 16.12071242855153
  }
}
```

Nearly the same result appears on the challenge split with the model soccer_challenge_rny008gsm_gru_rgb:

```json
{
  "challenge_split": {
    "Average-mAP (loose)": 48.224891761119245,
    "Shown only (loose)": 54.50310379766974,
    "Unshown only (loose)": 37.3107990125073,
    "Average-mAP (tight)": 45.45153077812651,
    "Shown only (tight)": 52.18741683411739,
    "Unshown only (tight)": 33.10071957727419
  }
}
```

Inference is run on frames extracted from the 224p videos. Can you help me diagnose the problem? Are there any settings in the frame-extraction phase that differ from the defaults? Also, in the prediction file I see that the FPS is "2.0833333333333335"; is that correct, or must it be exactly "2"? Thank you!
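On the frame-rate question above, a non-integer FPS like 2.0833... is not necessarily an error: it is exactly what you get when a 25 fps video is sampled every 12th frame (25 / 12). A small arithmetic check, offered only as one plausible explanation for that value, not as a statement about how this repo extracts frames:

```python
# Illustrative only: check whether the reported FPS matches a fixed-stride
# subsampling of standard 25 fps broadcast video.
video_fps = 25.0
sample_stride = 12                      # keep every 12th frame (assumed)
effective_fps = video_fps / sample_stride
print(effective_fps)                    # -> 2.0833333333333335

# The value seen in the prediction file matches this ratio exactly.
assert abs(effective_fps - 2.0833333333333335) < 1e-9
```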
Thanks a lot for your help @jhong93! I found that the problem was that we must use the "prediction.recall.json" file instead of "prediction.json".
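The fix above can be sketched as a small loader that prefers the recall file when both are present. This is an illustrative helper, not code from the repo, and the file contents in the demo are made up:

```python
import json
import os
import tempfile

def load_predictions(pred_dir):
    """Prefer prediction.recall.json over prediction.json when both exist."""
    for name in ("prediction.recall.json", "prediction.json"):
        path = os.path.join(pred_dir, name)
        if os.path.exists(path):
            with open(path) as f:
                return name, json.load(f)
    raise FileNotFoundError("no prediction file in " + pred_dir)

# Demo with a throwaway directory containing both files.
with tempfile.TemporaryDirectory() as d:
    for name in ("prediction.json", "prediction.recall.json"):
        with open(os.path.join(d, name), "w") as f:
            json.dump({"source": name}, f)
    chosen, data = load_predictions(d)
    print(chosen)  # -> prediction.recall.json
```

Feeding the recall-variant file to the evaluation is what closed the gap in the results discussed above.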
Thanks for the great work and code.
Are you going to publish the SoccerNet models as well (or at least the training parameters you used for your papers)?