ImportError: cannot import name 'DPOConfig' from 'trl' #1642
Comments
Hi @AswiniNLP |
Dear @younesbelkada, |
Hi, how do we apply DPO to Flan-T5? |
Hey, same issue here. I'm not sure which trl version is needed to run the DPO trainer example script. Thanks. |
DPOConfig is not working. |
How do we run DPO for Flan-T5? |
Installing from source may help: pip install git+https://github.com/huggingface/trl.git |
Tried it; not working. |
Can we use DPO on top of Flan-T5? |
Could you please help me understand how to apply DPO to a seq2seq model through the DPO trainer? |
Hi there! |
Thank you @younesbelkada, I already did. |
Installing the 0.8.7 dev version from source works for me. DPOConfig seems to be present in 0.8.7 dev but not in 0.8.6 (the latest release). |
Let me try again and I will report back here. |
Sorry for the confusion, indeed |
Installing from source on main does not resolve the missing DPOConfig. Even though dpo_config.py is in trl/trl/trainer/, it is missing from __init__.py. |
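If that is the case, a possible stopgap is to import the class straight from its submodule, bypassing the package `__init__`. A minimal sketch, assuming the installed tree really contains trl/trainer/dpo_config.py (per the comment above, true on recent main but not in 0.8.6):

```python
# Sketch of a workaround, not an official API: import DPOConfig directly
# from its module when the top-level package does not re-export it.
try:
    from trl import DPOConfig  # works on releases that export it
except ImportError:
    # Assumes trl/trainer/dpo_config.py exists in the installed tree.
    from trl.trainer.dpo_config import DPOConfig
```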
Installing from source did not solve this for me; same issue. |
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. |
Not working. |
from trl import DPOConfig
I am not able to import DPOConfig with the line above; however, I can import DPOTrainer. |
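A quick sanity check of which trl is actually installed, since the 0.8.6-vs-0.8.7-dev boundary mentioned above decides whether the import can succeed (a minimal sketch):

```python
import trl

# Per the comments above, DPOConfig is absent in 0.8.6 and present
# in the 0.8.7 dev builds installed from source.
print(trl.__version__)
print("DPOConfig:", hasattr(trl, "DPOConfig"), "| DPOTrainer:", hasattr(trl, "DPOTrainer"))
```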
Can we use DPO on Flan-T5 (AutoModelForSeq2SeqLM)?
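For the recurring Flan-T5 question: DPOTrainer detects encoder-decoder models automatically, so a seq2seq model can be passed much like a causal one. Below is a minimal, untested sketch assuming a trl version that exports DPOConfig (0.8.7 dev or later, per this thread); the dataset, hyperparameters, and output directory are illustrative placeholders, not from this thread.

```python
from datasets import Dataset
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_name = "google/flan-t5-base"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
ref_model = AutoModelForSeq2SeqLM.from_pretrained(model_name)  # frozen reference policy
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Toy preference data; a real run needs prompt/chosen/rejected columns.
train_dataset = Dataset.from_dict({
    "prompt": ["Summarize: The cat sat on the mat."],
    "chosen": ["A cat sat on a mat."],
    "rejected": ["Dogs are great."],
})

args = DPOConfig(
    output_dir="flan-t5-dpo",   # illustrative path
    per_device_train_batch_size=1,
    max_length=256,
    max_prompt_length=128,
    max_target_length=128,      # used for encoder-decoder models
    beta=0.1,                   # weight of the implicit KL penalty in the DPO loss
)

trainer = DPOTrainer(
    model=model,
    ref_model=ref_model,
    args=args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```

The `tokenizer` keyword matches the DPOTrainer signature from the era of this issue; newer releases may differ.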