
LFS issue #3

Open
tirans opened this issue Oct 19, 2024 · 0 comments
tirans commented Oct 19, 2024

Hi, the repo uses Git LFS, but `git lfs pull` produced:

```
batch response: This repository is over its data quota. Account responsible for LFS bandwidth should purchase more data packs to restore access.
error: failed to fetch some objects from 'https://github.com/HemantKArya/Melodfy.git/info/lfs'
```

My understanding is that the downloadModel function is meant as a workaround, but it doesn't work:
```
/Users/tirane/PycharmProjects/Melodfy/venv/bin/python /Users/tirane/github/Melodfy/main.py
Model already present
Loading model...
Error found in moderl...
Model downloaded
Traceback (most recent call last):
  File "/Users/tirane/github/Melodfy/main.py", line 49, in <module>
    ort_session = InferenceSession(model_path)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tirane/PycharmProjects/Melodfy/venv/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 383, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/Users/tirane/PycharmProjects/Melodfy/venv/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 424, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /Users/tirane/github/Melodfy/models/model.onnx failed:/Users/runner/work/1/s/onnxruntime/core/graph/model.cc:134 onnxruntime::Model::Model(onnx::ModelProto &&, const onnxruntime::PathString &, const onnxruntime::IOnnxRuntimeOpSchemaRegistryList *, const logging::Logger &, const onnxruntime::ModelOptions &) ModelProto does not have a graph.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/tirane/github/Melodfy/main.py", line 53, in <module>
    ort_session = InferenceSession(model_path)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tirane/PycharmProjects/Melodfy/venv/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 383, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/Users/tirane/PycharmProjects/Melodfy/venv/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 424, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /Users/tirane/github/Melodfy/models/model.onnx failed:/Users/runner/work/1/s/onnxruntime/core/graph/model.cc:134 onnxruntime::Model::Model(onnx::ModelProto &&, const onnxruntime::PathString &, const onnxruntime::IOnnxRuntimeOpSchemaRegistryList *, const logging::Logger &, const onnxruntime::ModelOptions &) ModelProto does not have a graph.
```
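A `ModelProto does not have a graph` failure is commonly seen when the file on disk is a Git LFS pointer stub (a small text file) rather than the real ONNX binary, which would be consistent with the quota error above. A minimal sketch to check for that (the helper name is illustrative and not part of Melodfy; the size cutoff is an assumption):

```python
from pathlib import Path

def is_lfs_pointer(path: str) -> bool:
    """Return True if the file looks like a Git LFS pointer stub.

    Pointer files are tiny text files (~130 bytes) beginning with a
    fixed version line; a real ONNX model is a multi-megabyte binary.
    The 1 KiB cutoff is a heuristic, not part of the LFS spec.
    """
    p = Path(path)
    if not p.exists() or p.stat().st_size > 1024:
        return False
    return p.read_bytes().startswith(b"version https://git-lfs.github.com/spec/")
```

If this returns True for `models/model.onnx`, the download step saved the pointer text instead of the model, and `InferenceSession` will fail exactly as in the traceback.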
