Missing whl nvmitten-0.1.3-cp38-cp38-linux_x86_64.whl when running bert-99 or rnnt #1197
Comments
I have the same issue (mlcommons/inference#1679). I'm working with the arm architecture, so it fails. It seems mitten is open source now, so if you …
I'll give that a try. Thanks, I was wondering whether NVIDIA/mitten was the source; you have confirmed it.
Yes, nvmitten is currently supported only via …
Closing this issue for now. Please reopen if required.
I did try running it with cm docker instead of cm run, but the issue still persists.
When trying to run bert-99 or rnnt inference, it always fails with:
/root/cm/bin/python3 -m pip install "/opt/nvmitten-0.1.3-cp38-cp38-linux_x86_64.whl"
WARNING: Requirement '/opt/nvmitten-0.1.3-cp38-cp38-linux_x86_64.whl' looks like a filename, but the file does not exist
ERROR: nvmitten-0.1.3-cp38-cp38-linux_x86_64.whl is not a supported wheel on this platform.
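The pip error means the wheel's tag (CPython 3.8, linux_x86_64) does not match the interpreter or platform pip is running on, which is expected on an arm host. As a quick diagnostic, a minimal sketch using the `packaging` library (a dependency that ships with pip; the tag below is taken from the failing wheel's filename) shows whether this wheel could ever install on the current machine:

```python
# Minimal sketch: check whether the failing wheel's tag is accepted by
# the current interpreter/platform. Assumes the 'packaging' library is
# available (it is vendored/required by modern pip).
from packaging.tags import Tag, sys_tags

# Tag parsed from nvmitten-0.1.3-cp38-cp38-linux_x86_64.whl
wheel_tag = Tag("cp38", "cp38", "linux_x86_64")

# sys_tags() yields every tag this interpreter/platform accepts.
if wheel_tag in set(sys_tags()):
    print("wheel tag is supported here")
else:
    print("wheel tag is NOT supported on this platform")
```

On an aarch64 host (or any Python other than 3.8) the tag will not appear in `sys_tags()`, which is exactly why pip reports "not a supported wheel on this platform".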
Cmd line:
cm run script --tags=run-mlperf,inference,_r3.0,_performance-only,_short --division=closed --category=datacenter --device=cuda --model=bert-99 --precision=float32 --implementation=nvidia --backend=tensorrt --scenario=Offline --execution_mode=test --power=no --adr.python.version_min=3.8 --clean --compliance=yes --quiet --time
Here is the relevant part of the output when it fails:
* cm run script "get sut configs"
! load /root/CM/repos/local/cache/b5cfc33b4e7546d7/cm-cached-state.json
Using MLCommons Inference source from '/root/CM/repos/local/cache/a867d6853aeb4402/inference'
No target_qps specified. Using 1 as target_qps
Output Dir: '/root/test_results/vm6-nvidia_original-gpu-tensorrt-vdefault-default_config/gptj-99/offline/performance/run_1'
gptj.Offline.target_qps = 1
gptj.Offline.max_query_count = 10
gptj.Offline.min_query_count = 10
gptj.Offline.min_duration = 0
/root/cm/bin/python3 -m pip install "/opt/nvmitten-0.1.3-cp38-cp38-linux_x86_64.whl"
WARNING: Requirement '/opt/nvmitten-0.1.3-cp38-cp38-linux_x86_64.whl' looks like a filename, but the file does not exist
ERROR: nvmitten-0.1.3-cp38-cp38-linux_x86_64.whl is not a supported wheel on this platform.
CM error: Portable CM script failed (name = get-generic-python-lib, return code = 256)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Note that it may be a portability issue of a third-party tool or a native script
wrapped and unified by this automation recipe (CM script). In such case,
please report this issue with a full log at "https://github.com/mlcommons/ck".
The CM concept is to collaboratively fix such issues inside portable CM scripts
to make existing tools and native scripts more portable, interoperable
and deterministic. Thank you!