
Added transformer_text_generation #276

Open: jmamou wants to merge 11 commits into main from transformer_text_generation

Conversation

@jmamou jmamou commented Sep 1, 2021

No description provided.

Contributor

@KentonMurray KentonMurray left a comment


I don't see a Robustness Evaluation in the README.md. Per the organizers e-mail, can you run evaluate.py?

@maobedkova
Contributor

This type of augmentation might be risky to apply under some conditions, so it would be good to add a "What are the limitations of this transformation?" section to the README, explicitly stating domain constraints, language constraints, and so on.

@maobedkova
Contributor

Potentially, the generation model could output the top n predictions instead of just one.

@jmamou
Author

jmamou commented Sep 19, 2021

I don't see a Robustness Evaluation in the README.md. Per the organizers e-mail, can you run evaluate.py?

Concerning the robustness evaluation, do you mean running the code from https://github.com/GEM-benchmark/NL-Augmenter/tree/main/evaluation?

@jmamou
Author

jmamou commented Sep 19, 2021

Potentially, the generation model could output the top n predictions instead of just one.

That's correct; this is currently supported by our code by setting the num_return_sequences parameter of the generate method to n.
I will add it explicitly to the README.
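For readers following along, the approach described above can be sketched with Hugging Face transformers' generate() API. This is a minimal illustration, not the PR's actual code; the helper name, model choice, and defaults are assumptions.

```python
# Minimal sketch: return the top n generated continuations instead of one
# by setting num_return_sequences on generate(). Illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

def generate_top_n(prompt, n=3, model_name="gpt2", max_new_tokens=20):
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        do_sample=True,               # sampling yields distinct sequences
        num_return_sequences=n,       # top-n outputs instead of one
        max_new_tokens=max_new_tokens,
        pad_token_id=tokenizer.eos_token_id,
    )
    # outputs has shape (n, sequence_length); decode each candidate
    return [tokenizer.decode(seq, skip_special_tokens=True) for seq in outputs]
```

With do_sample=True the n sequences are drawn independently, so they are usually distinct; beam search with num_return_sequences is an alternative for deterministic top-n.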

@jmamou jmamou closed this Sep 19, 2021
@jmamou jmamou deleted the transformer_text_generation branch September 19, 2021 20:31
@jmamou jmamou restored the transformer_text_generation branch September 19, 2021 20:32
@jmamou jmamou reopened this Sep 19, 2021
@KentonMurray
Contributor

I don't see a Robustness Evaluation in the README.md. Per the organizers e-mail, can you run evaluate.py?

Concerning the robustness evaluation, do you mean running the code from https://github.com/GEM-benchmark/NL-Augmenter/tree/main/evaluation?

Kaustubh sent an e-mail about the PR process. In it, it says "To evaluate your transformation, run the evaluate.py command and add the results of the models in your transformation's readme under the "Robustness Evaluation" section." That's all I am asking for: to add that section to the README.

Collaborator

@aadesh11 aadesh11 left a comment


Please add your transformation name to the test/mapper.py in the right dictionary for the pytest to pick up your test.json. By default, we're testing only light transformations and filters.
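The registration step the reviewer describes might look roughly like the following. This is a hypothetical sketch: the dictionary names, entries, and helper below are invented for illustration, and the real test/mapper.py in NL-Augmenter may be organized differently.

```python
# Hypothetical sketch of a test/mapper.py-style registry: names are mapped
# to a tier so pytest knows which test.json files to collect. By default
# only light transformations and filters run.
LIGHT_TRANSFORMATIONS = {
    "butter_fingers_perturbation": "test.json",
}
HEAVY_TRANSFORMATIONS = {
    # Added so pytest picks up this transformation's test.json
    "transformer_text_generation": "test.json",
}

def collected_tests(run_heavy=False):
    """Return the transformation names whose tests will be collected."""
    names = dict(LIGHT_TRANSFORMATIONS)
    if run_heavy:
        names.update(HEAVY_TRANSFORMATIONS)
    return sorted(names)
```

The point is simply that an unregistered transformation's test.json is never collected, which is why the reviewer asks for the dictionary entry.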

@jmamou jmamou requested a review from aadesh11 October 5, 2021 09:32
@jmamou
Author

jmamou commented Oct 18, 2021

Please add your transformation name to the test/mapper.py in the right dictionary for the pytest to pick up your test.json. By default, we're testing only light transformations and filters.

Done!

@jmamou
Author

jmamou commented Nov 17, 2021

Hi,
do you need any additional changes before approval?
Thanks
