cpu resources costs #724
Labels
help wanted
Extra attention is needed
Comments
How is the GPU usage?

Make sure the installation steps for the following succeeded, then use the following command to check whether CUDA is detected.

GPU memory usage is ~3GB.

Yes, I'm pretty sure: `torch.cuda.is_available()` is True.
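To make the "check whether CUDA is detected" step above concrete, here is a minimal sketch. The helper name `check_cuda` is hypothetical; `torch.cuda.is_available()` and `torch.cuda.get_device_name()` are real PyTorch APIs.

```python
def check_cuda() -> str:
    """Report whether PyTorch can see a CUDA device (sketch)."""
    try:
        import torch  # assumes PyTorch is installed
    except ImportError:
        return "torch not installed"
    if not torch.cuda.is_available():
        return "cuda not available"
    # Index 0 is the first visible GPU (e.g. the A100 in this issue).
    return "cuda available: " + torch.cuda.get_device_name(0)

print(check_cuda())
```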
Checks
Environment Details
a100-gpu cuda12.2
Steps to Reproduce
Why does using GPU for inference consume a lot of CPU resources, even though the printed device in load_checkpoint and load_vocoder is 'cuda'?
✔️ Expected Behavior
Load the model once, then read wav_scp line by line and run infer_process on each entry.
❌ Actual Behavior
A single inference thread consumes roughly 1000% CPU (about 10 cores).
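~1000% CPU from one process is usually OpenMP/MKL intra-op thread pools spinning on the CPU-side parts of the pipeline. A common mitigation (a sketch, not this project's official fix) is to cap those pools before the heavy libraries are imported; `limit_torch_threads` is a hypothetical helper, while `torch.set_num_threads` is a real PyTorch API.

```python
import os

# These env vars are read once at library import time, so they must be
# set before importing torch/numpy anywhere in the process.
os.environ["OMP_NUM_THREADS"] = "1"
os.environ["MKL_NUM_THREADS"] = "1"

def limit_torch_threads(n: int = 1) -> None:
    """If torch is importable, also cap its intra-op CPU threads."""
    try:
        import torch
        torch.set_num_threads(n)  # caps intra-op parallelism on CPU
    except (ImportError, RuntimeError):
        pass  # no torch, or thread pool already started
```

Whether `1` is the right cap depends on how much genuine CPU work (resampling, vocoder pre/post-processing) the pipeline does per utterance.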