Several detailed questions about reproduction #7
Comments
Thanks for your reply. I think I need to explain my 3rd problem in a bit more detail: the code at line 175 of predict.py is:

Another question is still about the 2nd problem. I am not sure how to train the model with the sbt dataset and the challenge dataset. I simply omitted the several lines about "sml" in train.py, because the sbt dataset does not contain a "sml.tok" file like the standard dataset does:

My training command:

And now I am training the attendgru model on the sbt dataset (with the root directory changed to sbt) without modifying any other code. I will not know the result until about 6 hours from now, but I am worried that my approach is incorrect.
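(For anyone with the same question, here is a minimal sketch of what "skipping the sml input" can look like when only dats.tok and coms.tok are present. The function and variable names are illustrative assumptions, not the actual code in train.py.)

```python
# Illustrative sketch only: load the tokenized files and fall back to a
# text-only setup when sml.tok is missing. Names here are assumptions,
# not the repository's actual code.
import os

def load_tok(path):
    """Read a .tok file, assuming each line is '<id>, <tokens>' (format may differ)."""
    data = {}
    with open(path, 'r') as f:
        for line in f:
            fid, _, toks = line.partition(',')
            data[int(fid)] = toks.strip()
    return data

dataroot = '/path/to/sbt'                            # assumed dataset directory
dats = load_tok(os.path.join(dataroot, 'dats.tok'))  # code / SBT sequences
coms = load_tok(os.path.join(dataroot, 'coms.tok'))  # comment sequences

smlpath = os.path.join(dataroot, 'sml.tok')
if os.path.exists(smlpath):
    smls = load_tok(smlpath)   # standard dataset: separate AST/SBT input available
else:
    smls = None                # sbt/challenge data: train a text-only model
                               # such as attendgru
```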
Hi, I am really interested in this ICSE paper you published. Recently I have been planning to reproduce the experiments, but I am facing some difficulties.
I still plan to work with the challenge dataset, but I cannot find it on the website you provided. I only found a dataset named "sbt", which contains only coms.tok and dats.tok. In your paper you say, "The challenge dataset contains two elements for each method: 1) the pre-processed comment, and 2) the SBT-AO representation of the Java code", so I guess "sbt" is the challenge dataset. If I am mistaken, could you please tell me where to download the challenge dataset? Thanks.
I see that you provide the final trained ast-attendgru model files (.h5) for both the standard and the challenge data. But loading a model requires the corresponding history configuration file, which I cannot find, so I cannot run the prediction step.
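(For context on the prediction step: the .h5 file holds the architecture and weights, but prediction also needs the tokenizers and sequence-length configuration produced during preprocessing. A minimal sketch, with assumed file names and dictionary keys rather than the repository's actual format:)

```python
# Minimal sketch of what prediction needs besides the .h5 file.
# 'dataset.pkl' and its keys are assumptions about the preprocessing output.
import pickle
from keras.models import load_model

model = load_model('ast-attendgru_E01.h5')   # assumed model file name

with open('dataset.pkl', 'rb') as f:         # assumed: tokenizers + config pickle
    seqdata = pickle.load(f)

comstok = seqdata['comstok']                  # comment tokenizer (assumed key)
datstok = seqdata['datstok']                  # code/SBT tokenizer (assumed key)

# Encode the inputs with the same tokenizers used for training, then generate
# the comment one token at a time with model.predict(...).
```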
My development environment: