Loading checkpoint shards: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3/3 [00:05<00:00, 1.74s/it]
Some parameters are on the meta device because they were offloaded to the cpu.
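This message comes from accelerate's big-model loading: with device_map="auto", layers that do not fit in GPU memory are scheduled for CPU offload, and their module-side tensors stay on the meta device until a pre-forward hook swaps the real data in. The resolved placement can be inspected after loading; a minimal sketch, assuming a Llama checkpoint loaded through transformers (the model id here is illustrative, not the one from this run):

```python
import torch
from transformers import AutoModelForCausalLM

# Illustrative checkpoint; the FlashRAG config points at the real one.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-13b-hf",
    torch_dtype=torch.float16,
    device_map="auto",  # let accelerate decide the GPU/CPU split
)

# Entries mapped to "cpu" (or "disk") are offloaded; their module-side
# tensors remain on "meta" until accelerate's hook loads real data.
for name, device in model.hf_device_map.items():
    print(f"{name:40s} -> {device}")
```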
/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py:2400: UserWarning: for model.layers.34.self_attn.q_proj.lora_A.default.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
warnings.warn(
[... the same UserWarning repeats for the lora_A and lora_B weights of q_proj, k_proj, v_proj, and o_proj in layers 34 through 39 ...]
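Each of these warnings means a LoRA weight for layers 34-39 was copied into a parameter that still lives on the meta device, so the copy was silently dropped and those adapter weights never receive data. A minimal sketch of the behavior the warning describes, using a toy module rather than the FlashRAG model:

```python
import torch
import torch.nn as nn

# Toy module whose parameters live on the "meta" device (shape only, no
# data), mimicking a layer that accelerate scheduled for offload.
with torch.device("meta"):
    layer = nn.Linear(4, 4)

state = {"weight": torch.randn(4, 4), "bias": torch.randn(4)}

# An in-place copy into a meta parameter is a no-op: this triggers the
# exact UserWarning above, and the weights keep no data.
layer.load_state_dict(state)
print(layer.weight.is_meta)  # True

# assign=True rebinds the parameters to the checkpoint tensors instead of
# copying, so they actually receive data.
layer.load_state_dict(state, assign=True)
print(layer.weight.is_meta)  # False, now a real CPU tensor
```

With the no-op load, the adapter tensors for the offloaded layers stay on meta with no data behind them, which is exactly what the error at the end of this log complains about.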
Some parameters are on the meta device because they were offloaded to the cpu.
Inference:   0%|          | 0/100 [00:00<?, ?it/s]
/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/generation/configuration_utils.py:590: UserWarning: do_sample is set to False. However, temperature is set to 0.8 -- this flag is only used in sample-based generation modes. You should set do_sample=True or unset temperature.
warnings.warn(
/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/generation/configuration_utils.py:595: UserWarning: do_sample is set to False. However, top_p is set to 0.9 -- this flag is only used in sample-based generation modes. You should set do_sample=True or unset top_p.
warnings.warn(
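These two warnings are benign on their own: with do_sample=False, generation is greedy and temperature/top_p are ignored. They can be silenced either by enabling sampling or by unsetting the sampling-only values; a short sketch against the standard transformers API (the model id is a stand-in, not the checkpoint from this run):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "gpt2"  # stand-in model; the run above uses a Llama checkpoint
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name)
inputs = tok("Question: who wrote Hamlet?\nAnswer:", return_tensors="pt")

# Option 1: enable sampling so temperature/top_p actually apply.
out = model.generate(**inputs, do_sample=True, temperature=0.8, top_p=0.9,
                     max_new_tokens=16)

# Option 2: keep greedy decoding and unset the sampling-only values,
# which silences both warnings.
model.generation_config.temperature = None
model.generation_config.top_p = None
out = model.generate(**inputs, do_sample=False, max_new_tokens=16)
```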
Generation process: 0%| | 0/1 [00:02<?, ?it/s]
Inference: 0%| | 0/100 [00:03<?, ?it/s]
Traceback (most recent call last):
File "/root/autodl-tmp/FlashRAG/examples/methods/run_exp.py", line 589, in
func(args)
File "/root/autodl-tmp/FlashRAG/examples/methods/run_exp.py", line 250, in retrobust
result = pipeline.run(test_data, pred_process_fun=selfask_pred_parse)
File "/root/autodl-tmp/FlashRAG/flashrag/pipeline/active_pipeline.py", line 902, in run
self.run_item(item)
File "/root/autodl-tmp/FlashRAG/flashrag/pipeline/active_pipeline.py", line 847, in run_item
gen_out = self.generator.generate(input_prompt, stop=["Context:", "#", stop_condition])[0]
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
File "/root/autodl-tmp/FlashRAG/flashrag/generator/generator.py", line 432, in generate
outputs = self.model.generate(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/generation/utils.py", line 2215, in generate
result = self._sample(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/generation/utils.py", line 3206, in _sample
outputs = self(**model_inputs, return_dict=True)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 170, in new_forward
output = module._old_forward(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/models/llama/modeling_llama.py", line 1190, in forward
outputs = self.model(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/models/llama/modeling_llama.py", line 945, in forward
layer_outputs = decoder_layer(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 170, in new_forward
output = module._old_forward(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/transformers/models/llama/modeling_llama.py", line 673, in forward
hidden_states = self.input_layernorm(hidden_states)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 165, in new_forward
args, kwargs = module._hf_hook.pre_forward(module, *args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 110, in pre_forward
args, kwargs = hook.pre_forward(module, *args, **kwargs)
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/hooks.py", line 355, in pre_forward
set_module_tensor_to_device(
File "/root/miniconda3/envs/Inter/lib/python3.9/site-packages/accelerate/utils/modeling.py", line 329, in set_module_tensor_to_device
new_value = value.to(device)
NotImplementedError: Cannot copy out of meta tensor; no data!
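The traceback bottoms out in accelerate's pre-forward hook calling value.to(device) on a tensor that is still on meta, i.e. a parameter that was never given real data. Combined with the no-op LoRA copies above, a plausible reading is that the adapter weights for the offloaded layers (34-39) were loaded with a plain load_state_dict and stayed empty. One common way to avoid this is to load the base model with an explicit memory budget and an offload folder, then attach the adapter through peft, which respects the device map. A sketch under those assumptions (the paths and the max_memory split are illustrative):

```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Illustrative paths; substitute the base model and the ret-robust LoRA
# adapter from the actual config.
base_path = "/path/to/llama-2-13b"
lora_path = "/path/to/ret-robust-lora"

# Give offloaded layers real backing storage instead of leaving them on
# "meta": an explicit memory budget plus an offload folder.
base = AutoModelForCausalLM.from_pretrained(
    base_path,
    torch_dtype=torch.float16,
    device_map="auto",
    max_memory={0: "20GiB", "cpu": "48GiB"},  # tune to the hardware
    offload_folder="offload",
)

# Attach the adapter through peft rather than a raw load_state_dict, so
# the LoRA weights land on the same devices as their target modules.
model = PeftModel.from_pretrained(base, lora_path)
model.eval()
```

If memory permits, calling model.merge_and_unload() afterwards folds the adapter into the base weights and removes the LoRA hooks entirely.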
Reproduced with:
python run_exp.py --method_name 'ret-robust' \