ImportError: cannot import name 'DPOConfig' from 'trl' #1642

Closed
AswiniNLP opened this issue May 14, 2024 · 20 comments

@AswiniNLP commented May 14, 2024

[Screenshot from 2024-05-14 14-48-09]
from trl import DPOConfig
I am not able to import DPOConfig; however, I can import DPOTrainer.

Can we use DPO on flan-T5, i.e. AutoModelForSeq2SeqLM?
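A minimal sketch of what this could look like, assuming a TRL release that exports DPOConfig (0.9.0 or later); the model name, toy preference pairs, and hyperparameters below are illustrative only, and DPOTrainer detects encoder-decoder models such as flan-T5 automatically in recent releases:

```python
# Minimal sketch (untested): DPO on an encoder-decoder model such as flan-T5.
# Assumes trl >= 0.9.0 (which exports DPOConfig), plus transformers and datasets.
from datasets import Dataset
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_name = "google/flan-t5-small"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
ref_model = AutoModelForSeq2SeqLM.from_pretrained(model_name)  # frozen reference policy
tokenizer = AutoTokenizer.from_pretrained(model_name)

# DPO expects preference pairs: a prompt, a preferred answer, and a rejected answer.
train_dataset = Dataset.from_dict({
    "prompt": ["Summarize: The quick brown fox jumps over the lazy dog."],
    "chosen": ["A fox jumps over a dog."],
    "rejected": ["The weather is nice today."],
})

args = DPOConfig(output_dir="flan-t5-dpo", per_device_train_batch_size=1, max_steps=1)
trainer = DPOTrainer(
    model,
    ref_model,
    args=args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,  # renamed to processing_class in later TRL releases
)
trainer.train()
```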

@younesbelkada (Contributor)

Hi @AswiniNLP,
What is your TRL version? Can you try installing the latest TRL version with pip install -U trl?
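One quick diagnostic (a sketch, not something specified in this thread) is to print the version and install path of the trl module the interpreter is actually importing; a stale notebook kernel or a second install often explains this kind of ImportError:

```python
# Show which TRL build is actually being imported.
import trl

print(trl.__version__)  # per this thread, DPOConfig only appears in releases after 0.8.6
print(trl.__file__)     # an unexpected path points to a stale or duplicate install
```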

@AswiniNLP (Author) commented May 14, 2024

Dear @younesbelkada,
It is not working. Same error. TRL version is 0.8.6

@AswiniNLP (Author)

Hi, how do I apply DPO on flan-T5?

@trangtv57

Hey, same issue here. I'm not sure which TRL version is needed to run the DPO trainer script from the examples.
Thanks.

@AswiniNLP (Author) commented May 15, 2024 via email

@AswiniNLP (Author)

How do I run DPO for flan-T5?

@trangtv57

Installing from source may help:
pip install git+https://github.com/huggingface/trl.git

@AswiniNLP (Author) commented May 15, 2024 via email

@AswiniNLP (Author) commented May 15, 2024 via email

@AswiniNLP (Author)

Dear @younesbelkada, It is not working. Same error. TRL version is 0.8.6

Could you please help me understand how to apply DPO to a seq-to-seq model through the DPO trainer?

@JhonDan1999

The same issue here.

[Screenshot 2024-05-25 at 12:25:13 AM]

@younesbelkada (Contributor)

Hi there!
I am sure you are facing a weird environment conflict issue; make sure to use the latest TRL from PyPI and refresh the kernel if you are using a Google Colab environment: pip install -U trl

@JhonDan1999

> pip install -U trl

Thank you @younesbelkada. I already did !pip install --upgrade trl, but the issue is still there.

@HarryMayne

> Installing from source may help: pip install git+https://github.com/huggingface/trl.git

Installing the 0.8.7 dev version from source works for me. It seems to work for 0.8.7 dev but not for 0.8.6 (the latest release).

@younesbelkada (Contributor)

Let me try again and I will report back here.

@younesbelkada (Contributor)

Sorry for the confusion. Indeed, DPOConfig and SFTConfig are available on main only, so as @HarryMayne pointed out, you need to install TRL from source. I will make a release on PyPI soon to include that plus many other bugfixes.
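For anyone pinned to the 0.8.6 PyPI release in the meantime, a hedged workaround sketch: guard the import and fall back to the older API, in which DPOTrainer took a plain transformers.TrainingArguments and DPO-specific options such as beta were passed to the trainer itself (exact keyword names should be checked against the installed version's docs):

```python
# Version-tolerant import sketch, assuming the pre-0.9 DPOTrainer API accepted a
# transformers.TrainingArguments object in place of DPOConfig.
try:
    from trl import DPOConfig, DPOTrainer  # trl >= 0.9.0, or installed from source/main
    args = DPOConfig(output_dir="dpo-out", beta=0.1)
except ImportError:
    from transformers import TrainingArguments
    from trl import DPOTrainer  # trl <= 0.8.6
    args = TrainingArguments(output_dir="dpo-out")
    # on these older versions, pass beta=0.1 (and similar options) to DPOTrainer directly
```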

@kevinjesse

Installing from source on main does not resolve the missing DPOConfig. Even if DPOConfig.py is in trl/trl/trainer/, it is missing from __init__.py.
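One way to tell whether the class exists in the installed checkout but is simply not re-exported at the package root (a sketch; the submodule path below matches the main-branch layout at the time and may differ in other versions):

```python
# Check whether DPOConfig is re-exported from the package root, then try the
# trainer submodule directly; this only works if the checkout ships dpo_config.py.
import trl

if hasattr(trl, "DPOConfig"):
    from trl import DPOConfig
else:
    from trl.trainer.dpo_config import DPOConfig
print(DPOConfig)
```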

@meixtan commented Jun 5, 2024

Installing from source did not solve this for me. Same issue.

@github-actions (bot)

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

The github-actions bot closed this issue as completed on Jul 7, 2024.

@Rhitabrat

Not working.

8 participants