When I run the code sample in README.md, I hit this problem:
Traceback (most recent call last):
  File "/home/npl/ViInfographicCaps/code/lmm_models/flamingoGithub/code/fla.py", line 120, in <module>
    model, image_processor, tokenizer = load_model_transform()
                                        ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/npl/ViInfographicCaps/code/lmm_models/flamingoGithub/code/fla.py", line 20, in load_model_transform
    model, image_processor, tokenizer = create_model_and_transforms(
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/npl/ViInfographicCaps/vir_env/openfla/lib/python3.12/site-packages/open_flamingo/src/factory.py", line 54, in create_model_and_transforms
    lang_encoder = AutoModelForCausalLM.from_pretrained(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/npl/ViInfographicCaps/vir_env/openfla/lib/python3.12/site-packages/transformers/models/auto/auto_factory.py", line 524, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/npl/ViInfographicCaps/vir_env/openfla/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 979, in from_pretrained
    trust_remote_code = resolve_trust_remote_code(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/npl/ViInfographicCaps/vir_env/openfla/lib/python3.12/site-packages/transformers/dynamic_module_utils.py", line 626, in resolve_trust_remote_code
    raise ValueError(
ValueError: The repository for anas-awadalla/mpt-1b-redpajama-200b contains custom code which must be executed to correctly load the model. You can inspect the repository content at https://hf.co/anas-awadalla/mpt-1b-redpajama-200b.
Please pass the argument trust_remote_code=True to allow custom code to be run.
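
For reference, this first error is transformers' safeguard against executing the modeling code that ships inside the checkpoint repository. A minimal sketch of the opt-in when loading the language model directly (checkpoint name taken from the traceback; this is an illustration, not the open_flamingo code path itself, and the repository contents should be inspected before trusting them):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "anas-awadalla/mpt-1b-redpajama-200b"

# trust_remote_code=True lets transformers run the custom MPT modeling code
# bundled with the checkpoint; inspect the repo on the Hub before enabling it.
lang_encoder = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
```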
When I pass the argument trust_remote_code=True, I get this error instead:
  File "/home/npl/ViInfographicCaps/code/lmm_models/flamingoGithub/code/fla.py", line 120, in <module>
    model, image_processor, tokenizer = load_model_transform()
                                        ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/npl/ViInfographicCaps/code/lmm_models/flamingoGithub/code/fla.py", line 20, in load_model_transform
    model, image_processor, tokenizer = create_model_and_transforms(
                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/npl/ViInfographicCaps/vir_env/openfla/lib/python3.12/site-packages/open_flamingo/src/factory.py", line 60, in create_model_and_transforms
    decoder_layers_attr_name = _infer_decoder_layers_attr_name(lang_encoder)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/npl/ViInfographicCaps/vir_env/openfla/lib/python3.12/site-packages/open_flamingo/src/factory.py", line 97, in _infer_decoder_layers_attr_name
    raise ValueError(
ValueError: We require the attribute name for the nn.ModuleList in the decoder storing the transformer block layers. Please supply this string manually.
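
This second error means open_flamingo could not infer which attribute of the loaded language model holds the nn.ModuleList of transformer blocks. In open_flamingo versions that expose it, `create_model_and_transforms` accepts a `decoder_layers_attr_name` argument, and for MPT-style checkpoints the blocks typically live at `transformer.blocks`, so passing `decoder_layers_attr_name="transformer.blocks"` may satisfy the check. A sketch of the kind of lookup being attempted, using hypothetical toy classes in place of a real model (not the library's actual implementation):

```python
def find_layers_attr(model, container_type=list):
    """Return the dotted attribute path of the first layer container found."""
    stack = [("", model)]
    while stack:
        prefix, obj = stack.pop()
        for name, value in vars(obj).items():
            dotted = prefix + name
            if isinstance(value, container_type):
                return dotted
            if hasattr(value, "__dict__"):
                stack.append((dotted + ".", value))
    return None  # nothing found: the caller must supply the name manually


class Transformer:          # hypothetical stand-in: MPT keeps its blocks here
    def __init__(self):
        self.blocks = ["block0", "block1"]


class MptLikeModel:         # hypothetical stand-in for the loaded lang_encoder
    def __init__(self):
        self.transformer = Transformer()


print(find_layers_attr(MptLikeModel()))  # → transformer.blocks
```

If the installed open_flamingo does not take this keyword, upgrading the package (its attribute-name table has grown over time to cover more model families, including MPT) is another option.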
Is there any way to solve this problem?
MrNquyen changed the title from "problem when load image using create_model_and_transforms()" to "problem when load model, image_processor, tokenizer using create_model_and_transforms()" on Sep 19, 2024.