Crash with newest ComfyUI #65

Open

akedia opened this issue Dec 18, 2024 · 4 comments

akedia commented Dec 18, 2024

SamplerCustomAdvanced
DoubleStreamBlock.forward() got an unexpected keyword argument 'ref_config'

ComfyUI Error Report

Error Details

  • Node ID: 45
  • Node Type: SamplerCustomAdvanced
  • Exception Type: TypeError
  • Exception Message: DoubleStreamBlock.forward() got an unexpected keyword argument 'ref_config'

Stack Trace

  File "/root/autodl-tmp/ComfyUI/execution.py", line 328, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/execution.py", line 203, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)

  File "/root/autodl-tmp/ComfyUI/execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy_extras/nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 897, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 866, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 850, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 707, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Fluxtapoz/nodes/rf_edit_sampler_nodes.py", line 49, in sample_forward
    pred = model(x, s_in * sigma, **extra_args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 379, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 832, in __call__
    return self.predict_noise(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 835, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 359, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 195, in calc_cond_batch
    return executor.execute(model, conds, x_in, timestep, model_options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 308, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/model_base.py", line 129, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/model_base.py", line 158, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Fluxtapoz/flux/model.py", line 187, in forward
    out = self.forward_orig(img, img_ids_orig, context, txt_ids, timestep, y, guidance, control, transformer_options=transformer_options, ref_config=ref_config)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/custom_nodes/ComfyUI-Fluxtapoz/flux/model.py", line 63, in forward_orig
    img, txt = block(img=img, txt=txt, vec=vec, pe=pe, ref_config=ref_config, timestep=timesteps, transformer_options=transformer_options)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/_dynamo/eval_frame.py", line 465, in _fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

System Information

  • ComfyUI Version: v0.3.7-51-g4c5c4dd
  • Arguments: main.py --port=6019 --fast --use-sage-attention
  • OS: posix
  • Python Version: 3.11.9 (main, Apr 19 2024, 16:48:06) [GCC 11.2.0]
  • Embedded Python: false
  • PyTorch Version: 2.5.1+cu124

Devices

logtd (Owner) commented Dec 18, 2024

Ah, thanks for reporting. I'll take a look.

logtd (Owner) commented Dec 18, 2024

If you update now, it should be fixed.
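
For anyone reading along before updating: below is a rough, self-contained sketch of the signature mismatch behind the original TypeError. In the trace above, `forward_orig` passes `ref_config` to blocks whose `forward()` does not accept it. The classes here are simplified stand-ins, not the actual ComfyUI or Fluxtapoz code, and the defensive wrapper is just one possible way to tolerate unpatched blocks.

```python
# Simplified illustration of the reported TypeError -- hypothetical stand-in
# classes, not the real ComfyUI/Fluxtapoz implementation.
import inspect

import torch
import torch.nn as nn


class UnpatchedBlock(nn.Module):
    """Stand-in for a stock block whose forward() knows nothing about ref_config."""

    def forward(self, img, txt, vec, pe):
        return img, txt


class PatchedBlock(UnpatchedBlock):
    """Stand-in for a block patched to accept the extra sampler-side kwarg."""

    def forward(self, img, txt, vec, pe, ref_config=None, **kwargs):
        # The patched version simply accepts (and here ignores) ref_config.
        return super().forward(img, txt, vec, pe)


def call_block(block, img, txt, vec, pe, ref_config=None):
    """Only pass ref_config if the block's forward() can actually take it."""
    params = inspect.signature(block.forward).parameters
    takes_ref = "ref_config" in params or any(
        p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
    )
    if takes_ref:
        return block(img, txt, vec, pe, ref_config=ref_config)
    return block(img, txt, vec, pe)


if __name__ == "__main__":
    dummy = [torch.zeros(1)] * 4
    call_block(PatchedBlock(), *dummy, ref_config={"strength": 1.0})    # fine
    call_block(UnpatchedBlock(), *dummy, ref_config={"strength": 1.0})  # fine, kwarg dropped
    # UnpatchedBlock()(*dummy, ref_config={}) would raise:
    # TypeError: UnpatchedBlock.forward() got an unexpected keyword argument 'ref_config'
```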

akedia (Author) commented Dec 18, 2024

Thanks for the fix, but there's still a problem:

ComfyUI Error Report

Error Details

  • Node ID: 238
  • Node Type: SamplerCustomAdvanced
  • Exception Type: AttributeError
  • Exception Message: 'SingleStreamBlock' object has no attribute 'linear1'

Stack Trace

  File "/root/autodl-tmp/ComfyUI/execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)

  File "/root/autodl-tmp/ComfyUI/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy_extras/nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/samplers.py", line 730, in sample
    self.inner_model, self.conds, self.loaded_models = comfy.sampler_helpers.prepare_sampling(self.model_patcher, noise.shape, self.conds)
                                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/sampler_helpers.py", line 62, in prepare_sampling
    comfy.model_management.load_models_gpu([model] + models, memory_required=memory_required, minimum_memory_required=minimum_memory_required)

  File "/root/autodl-tmp/ComfyUI/comfy/model_management.py", line 512, in load_models_gpu
    unload_model_clones(loaded_model.model, unload_weights_only=True, force_unload=False) #unload clones where the weights are different
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/model_management.py", line 416, in unload_model_clones
    current_loaded_models.pop(i).model_unload(unpatch_weights=unload_weight)

  File "/root/autodl-tmp/ComfyUI/comfy/model_management.py", line 347, in model_unload
    self.model.unpatch_model(self.model.offload_device, unpatch_weights=unpatch_weights)

  File "/root/autodl-tmp/ComfyUI/comfy/model_patcher.py", line 499, in unpatch_model
    comfy.utils.set_attr_param(self.model, k, bk.weight)

  File "/root/autodl-tmp/ComfyUI/comfy/utils.py", line 607, in set_attr_param
    return set_attr(obj, attr, torch.nn.Parameter(value, requires_grad=False))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/ComfyUI/comfy/utils.py", line 601, in set_attr
    obj = getattr(obj, name)
          ^^^^^^^^^^^^^^^^^^

  File "/root/autodl-tmp/envs/py311/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1931, in __getattr__
    raise AttributeError(

System Information

  • ComfyUI Version: v0.3.6-1-g2d5b3e0
  • Arguments: main.py --port=6019 --fast --use-sage-attention
  • OS: posix
  • Python Version: 3.11.9 (main, Apr 19 2024, 16:48:06) [GCC 11.2.0]
  • Embedded Python: false
  • PyTorch Version: 2.5.1+cu124

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4090 D : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 25386352640
    • VRAM Free: 3014173746
    • Torch VRAM Total: 21877489664
    • Torch VRAM Free: 2925618

## Additional Context

It seems to happen after the Flux model is compiled with KJNodes.

logtd (Owner) commented Dec 19, 2024

Ah, yeah, it's unlikely compilation will work on these without some custom logic.
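
For context on the compile interaction: ComfyUI's ModelPatcher backs up patched weights under dotted keys and restores them in `unpatch_model` by walking each key with `getattr` (that is the `set_attr` frame in the second trace). If a block has since been swapped for, or wrapped in, a module that no longer exposes the original attribute names, that walk raises exactly this kind of AttributeError. Below is a minimal standalone reproduction of the failure mode; the helper mirrors the idea of `comfy.utils.set_attr` but is a simplified sketch, and the block classes are hypothetical.

```python
# Minimal sketch of the weight-restore failure mode -- simplified stand-ins,
# not ComfyUI's actual ModelPatcher/utils code.
import torch
import torch.nn as nn


def set_attr_param(model, key, value):
    """Walk a dotted key like 'single_blocks.0.linear1.weight' and replace the leaf."""
    *path, leaf = key.split(".")
    obj = model
    for name in path:
        # Raises AttributeError if a submodule was swapped for one that no
        # longer has the expected attribute (e.g. after wrapping/compiling).
        obj = getattr(obj, name)
    setattr(obj, leaf, torch.nn.Parameter(value, requires_grad=False))


class OriginalBlock(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear1 = nn.Linear(4, 4)


class ReplacedBlock(nn.Module):
    """Stand-in for a block whose internals changed after the weights were backed up."""

    def __init__(self):
        super().__init__()
        self.fused = nn.Linear(4, 4)  # no `linear1` attribute anymore


model = nn.Module()
model.single_blocks = nn.ModuleList([OriginalBlock()])

# The patcher records this key while the original block is still in place.
backup_key = "single_blocks.0.linear1.weight"
backup_weight = torch.zeros(4, 4)

# Later, something replaces the block behind the patcher's back.
model.single_blocks[0] = ReplacedBlock()

try:
    set_attr_param(model, backup_key, backup_weight)
except AttributeError as err:
    print(err)  # 'ReplacedBlock' object has no attribute 'linear1'
```

Handling this cleanly would need the kind of custom logic mentioned above, so the practical workaround for now is presumably to skip the KJNodes compile step in this workflow.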
