Does llama-factory support quantization-aware training (QAT) for the SFT task on the full model?

Replies: 1 comment

We don't support QAT with full training. Only mixed precision is supported.
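Since the supported path is full-parameter SFT with mixed precision rather than QAT, below is a minimal sketch of what such a config could look like, assuming LLaMA-Factory's standard YAML format and the `llamafactory-cli train` entry point; the model, dataset, and hyperparameter values are illustrative, not taken from this thread.

```yaml
# Minimal full-parameter SFT config using bf16 mixed precision
# (model, dataset, and hyperparameter values are illustrative).
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct

stage: sft                # supervised fine-tuning
do_train: true
finetuning_type: full     # full-parameter training (no LoRA/QLoRA)

dataset: alpaca_en_demo   # example dataset name
template: llama3
cutoff_len: 2048

output_dir: saves/llama3-8b/full/sft
per_device_train_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 1.0e-5
num_train_epochs: 3.0
bf16: true                # mixed-precision training instead of QAT
```

This would be launched with something like `llamafactory-cli train llama3_full_sft_bf16.yaml` (filename hypothetical).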