
Problem reproducing the ret-robust framework: NotImplementedError: Cannot copy out of meta tensor; no data! #99

Open
gao18835258627 opened this issue Nov 9, 2024 · 0 comments


python run_exp.py --method_name 'ret-robust' \
                  --split 'test' \
                  --dataset_name 'nq' \
                  --gpu_id '0'

Loading checkpoint shards: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3/3 [00:05<00:00, 1.74s/it]
Some parameters are on the meta device because they were offloaded to the cpu.
/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py:2400: UserWarning: for model.layers.34.self_attn.q_proj.lora_A.default.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(
[... the same UserWarning is repeated for the lora_A and lora_B weights of q_proj, k_proj, v_proj, and o_proj in layers 34 through 39 ...]
Some parameters are on the meta device because they were offloaded to the cpu.
Inference: 0%| | 0/100 [00:00<?, ?it/s]
/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/generation/configuration_utils.py:590: UserWarning: do_sample is set to False. However, temperature is set to 0.8 -- this flag is only used in sample-based generation modes. You should set do_sample=True or unset temperature.
warnings.warn(
/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/generation/configuration_utils.py:595: UserWarning: do_sample is set to False. However, top_p is set to 0.9 -- this flag is only used in sample-based generation modes. You should set do_sample=True or unset top_p.
warnings.warn(
Generation process: 0%| | 0/1 [00:02<?, ?it/s]
Inference: 0%| | 0/100 [00:03<?, ?it/s]
Traceback (most recent call last):
File "/root/autodl-tmp/FlashRAG/examples/methods/run_exp.py", line 589, in
func(args)
File "/root/autodl-tmp/FlashRAG/examples/methods/run_exp.py", line 250, in retrobust
result = pipeline.run(test_data, pred_process_fun=selfask_pred_parse)
File "/root/autodl-tmp/FlashRAG/flashrag/pipeline/active_pipeline.py", line 902, in run
self.run_item(item)
File "/root/autodl-tmp/FlashRAG/flashrag/pipeline/active_pipeline.py", line 847, in run_item
gen_out = self.generator.generate(input_prompt, stop=["Context:", "#", stop_condition])[0]
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
File "/root/autodl-tmp/FlashRAG/flashrag/generator/generator.py", line 432, in generate
outputs = self.model.generate(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/generation/utils.py", line 2215, in generate
result = self._sample(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/generation/utils.py", line 3206, in _sample
outputs = self(**model_inputs, return_dict=True)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 170, in new_forward
output = module._old_forward(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/models/llama/modeling_llama.py", line 1190, in forward
outputs = self.model(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/models/llama/modeling_llama.py", line 945, in forward
layer_outputs = decoder_layer(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 170, in new_forward
output = module._old_forward(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/models/llama/modeling_llama.py", line 673, in forward
hidden_states = self.input_layernorm(hidden_states)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 165, in new_forward
args, kwargs = module._hf_hook.pre_forward(module, *args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 110, in pre_forward
args, kwargs = hook.pre_forward(module, *args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 355, in pre_forward
set_module_tensor_to_device(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/utils/modeling.py", line 329, in set_module_tensor_to_device
new_value = value.to(device)
NotImplementedError: Cannot copy out of meta tensor; no data!
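The warnings above suggest that Accelerate offloaded layers 34–39 to the CPU/meta device (likely because the GPU ran out of memory), so those parameters never received real data, and the offload hook then fails when it tries to copy a meta tensor back to the GPU. A minimal sketch of one possible workaround, assuming the model fits on a single GPU in fp16, is to load the base model and LoRA adapter with an explicit single-device map instead of device_map="auto"; the paths below are placeholders, not the exact ones used in this run:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel  # assumes the ret-robust LoRA adapter was trained with PEFT

base_path = "path/to/llama-2-13b"         # placeholder for the base model path
adapter_path = "path/to/ret-robust-lora"  # placeholder for the LoRA adapter path

# Force every weight onto GPU 0 so no layer is left on the meta device.
# This only works if the full fp16 model fits in GPU memory; if it does not,
# keep device_map="auto" but also pass offload_folder="offload" so offloaded
# weights are materialized on disk instead of remaining data-less meta tensors.
model = AutoModelForCausalLM.from_pretrained(
    base_path,
    torch_dtype=torch.float16,
    device_map={"": 0},
)
model = PeftModel.from_pretrained(model, adapter_path)
model.eval()
```

If the 13B model genuinely does not fit on the GPU, loading the base model quantized (e.g. in 8-bit via bitsandbytes) is another way to avoid offloading entirely; either way, the NotImplementedError should disappear once no parameter remains on the meta device.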
