Inference Speed Test? #4
Hi @ildoonet, I'll take a look when I get time :)
I finally got around to doing some inference on ShuffleNet today, and it is definitely far too slow. Any ideas on how to speed it up? I suspect the snail-like speed is due to the frequent channel shuffling.
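For context, the channel shuffle in ShuffleNet is just a reshape/transpose over the channel dimension, so it should be cheap in itself. Here is a minimal dependency-free sketch of the permutation it produces (the helper name `channel_shuffle_order` is made up for illustration; PyTorch implementations do the same thing with `view` and `transpose` on the tensor):

```python
def channel_shuffle_order(channels, groups):
    # Reshape channel indices [0, channels) into (groups, per_group),
    # transpose, and flatten -- the permutation ShuffleNet's shuffle applies.
    per_group = channels // groups
    grid = [[g * per_group + i for i in range(per_group)] for g in range(groups)]
    return [grid[g][i] for i in range(per_group) for g in range(groups)]

# With 6 channels in 2 groups, channels interleave across groups:
print(channel_shuffle_order(6, 2))  # -> [0, 3, 1, 4, 2, 5]
```

If the shuffle really were the bottleneck, replacing the transpose with a precomputed index like this (via `index_select`) would be one way to test that hypothesis.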
@gngdb if you have any ideas on how to speed it up in PyTorch, I would love to know. I can't imagine doing a full training run at this speed. Speeding up inference would drastically help with training too.
What version of PyTorch are you running? The speed of grouped convolutions increased a lot in the most recent versions. |
I'm running PyTorch 0.3.0 with CUDA. How long does it take for you to do one inference on the cat image? It takes probably 30 seconds for me.
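When comparing numbers like this, it helps to time only the forward pass, after a warm-up run, rather than the whole script (which includes model construction and, on GPU, one-off CUDA context and cuDNN setup costs). A rough, framework-agnostic timing sketch (the workload here is a stand-in; substitute the actual model call):

```python
import time

def time_fn(fn, warmup=2, repeats=10):
    # Warm-up runs keep one-off costs (CUDA context creation, cuDNN
    # autotuning, lazy allocations) out of the measurement.
    for _ in range(warmup):
        fn()
    start = time.perf_counter()
    for _ in range(repeats):
        fn()
    # Note: on GPU you would call torch.cuda.synchronize() before
    # reading the clock, since CUDA kernels launch asynchronously.
    return (time.perf_counter() - start) / repeats

# Example with a stand-in workload instead of model(input):
avg = time_fn(lambda: sum(i * i for i in range(10000)))
print(f"{avg * 1000:.3f} ms per call")
```

Without the warm-up and synchronization, a single timed run can easily over- or under-state the true per-inference cost by an order of magnitude.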
The entire script takes about 400ms for me to run, and the actual inference step
For completeness, I was running with PyTorch version 0.4.0. Also, here are the CPU details:
With an old conda env on PyTorch version 0.2.0, it took 150ms for inference and 350ms for the whole script.
Hmm okay. I guess there's no need to improve speed if it works well enough. I'll figure out what the problem is on my end.
@jaxony Hi, have you solved it?
It would be great if you tested your code to check the inference speed.