out of memory even with multi-card training #120
-
We are working on multi-GPU training in this PR: #105
-
Closing this as a duplicate of #10; please follow PR #105 for multi-GPU support. In the meantime, you can either use a larger GPU or switch to a smaller model or batch size. Reducing the number of channels also helps significantly with memory.
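For concreteness, here is a minimal sketch (plain PyTorch, not this project's code; `TinyConvNet` and `base_channels` are made-up names) of the two memory knobs mentioned above: channel width and batch size.

```python
import torch
import torch.nn as nn

# Hypothetical toy model: `base_channels` stands in for whatever channel-width
# setting your training config exposes. Halving it roughly halves the size of
# every activation tensor the backward pass has to keep around.
class TinyConvNet(nn.Module):
    def __init__(self, base_channels: int = 32):  # e.g. 32 instead of 64
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, base_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base_channels, 2 * base_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = TinyConvNet(base_channels=32).to(device)

# Activation memory scales roughly linearly with batch size, so shrinking the
# batch is usually the quickest way out of an OOM.
x = torch.randn(2, 3, 256, 256, device=device)  # batch_size=2 instead of, say, 8
y = model(x)
print(y.shape)  # torch.Size([2, 64, 256, 256])
```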