out of memory even with multi-card training #120

Answered by ilyes319
YuanbinLiu asked this question in Q&A

Closing this as a duplicate of #10; please use PR #105 for now. In the meantime, you can either use a larger GPU or switch to a smaller model or batch size. Reducing the number of channels in particular helps significantly with memory.

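For context, here is a minimal sketch of what "smaller model/batch size" and "fewer channels" can look like when launching MACE training from the command line. The flag names (`--hidden_irreps`, `--batch_size`, etc.) are assumptions based on the `run_train.py` script; check `python scripts/run_train.py --help` for the options available in your version.

```bash
# Hypothetical sketch: shrink the model and batch to fit in GPU memory.
# Flag names are assumptions; verify with: python scripts/run_train.py --help
# "64x0e + 64x1o" uses 64 channels per irrep instead of the common 128,
# and a smaller --batch_size lowers the peak memory per optimisation step.
python scripts/run_train.py \
    --name="mace_small" \
    --train_file="train.xyz" \
    --valid_fraction=0.05 \
    --hidden_irreps="64x0e + 64x1o" \
    --r_max=5.0 \
    --batch_size=2 \
    --max_num_epochs=500 \
    --device=cuda
```

Of these knobs, the channel count (the multiplicity in `--hidden_irreps`) tends to have the largest effect on memory, consistent with the advice above.
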
This discussion was converted from issue #118 on June 16, 2023 14:29.