
Auto-enable PYTORCH_ENABLE_MPS_FALLBACK for ComfyUI compatibility #6

Draft
Copilot wants to merge 8 commits into main from copilot/fix-attributeerror-torch-mps

Conversation


Copilot AI commented Feb 13, 2026

The ComfyUI custom node now sets PYTORCH_ENABLE_MPS_FALLBACK=1 before importing torch, preventing crashes from operations that are not yet supported on the MPS backend.

Changes

  • __init__.py: Set the environment variable before importing torch, using setdefault() so a user-provided value is never overwritten
  • __init__.py: Display the actual fallback status in the console output
  • README.md: Remove the now-unnecessary manual environment variable setup instructions
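
The setdefault() behavior is easy to verify: a value the user exports before startup wins over the node's default. A minimal illustration (not the node's actual code):

```python
import os

# Simulate a user exporting the variable before ComfyUI starts.
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "0"

# The node's setdefault() is a no-op because the key already exists,
# so the user's override survives.
os.environ.setdefault("PYTORCH_ENABLE_MPS_FALLBACK", "1")

print(os.environ["PYTORCH_ENABLE_MPS_FALLBACK"])  # → 0
```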

Implementation

import os

# Enable MPS fallback to CPU for unsupported operations.
# This must be set BEFORE importing torch.
os.environ.setdefault("PYTORCH_ENABLE_MPS_FALLBACK", "1")

import torch

Console output now shows:

✓ FP8 MPS Metal patch installed successfully!
PYTORCH_ENABLE_MPS_FALLBACK: Enabled
Backend: Native (torch.mps.compile_shader)
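
The fallback line above is derived from the live environment rather than hard-coded, so a user override is reported accurately. A minimal sketch of that logic (hypothetical helper name; the actual __init__.py may differ):

```python
import os

def mps_fallback_status() -> str:
    # Read the live environment so a user override (e.g. "0") is reported
    # accurately instead of always printing "Enabled".
    enabled = os.environ.get("PYTORCH_ENABLE_MPS_FALLBACK") == "1"
    return "Enabled" if enabled else "Disabled"

os.environ.setdefault("PYTORCH_ENABLE_MPS_FALLBACK", "1")
print(f"PYTORCH_ENABLE_MPS_FALLBACK: {mps_fallback_status()}")
```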

Users can override by setting PYTORCH_ENABLE_MPS_FALLBACK=0 before ComfyUI startup.
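
For example, from a shell (assuming ComfyUI is launched via python main.py; adjust for your install):

```shell
# Opt out of the automatic CPU fallback before launching ComfyUI
export PYTORCH_ENABLE_MPS_FALLBACK=0
python main.py
```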

Original prompt

This section details the original issue to resolve.

<issue_title>AttributeError: module 'torch.mps' has no attribute 'compile_shader'</issue_title>
<issue_description>

ComfyUI Error Report

Error Details

  • Node ID: 92:41
  • Node Type: SamplerCustomAdvanced
  • Exception Type: AttributeError
  • Exception Message: module 'torch.mps' has no attribute 'compile_shader'

Stack Trace

  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ldm/lightricks/av_model.py", line 862, in forward
    return super().forward(
           ^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ldm/lightricks/model.py", line 745, in forward
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ldm/lightricks/model.py", line 792, in _forward
    x = self._process_transformer_blocks(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ldm/lightricks/av_model.py", line 774, in _process_transformer_blocks
    vx, ax = block(
             ^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ldm/lightricks/av_model.py", line 236, in forward
    attn1_out = self.attn1(norm_vx, pe=v_pe, transformer_options=transformer_options)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ldm/lightricks/model.py", line 367, in forward
    q = self.to_q(x)
        ^^^^^^^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ops.py", line 880, in forward
    output = self.forward_comfy_cast_weights(input, compute_dtype)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ops.py", line 850, in forward_comfy_cast_weights
    x = self._forward(input, weight, bias)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Applications/ComfyUI.app/Contents/Resources/ComfyUI/comfy/ops.py", line 846, in _forward
    return torch.nn.functional.linear(input, weight, bias)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/comfy_kitchen/tensor/base.py", line 348, in __torch_dispatch__
    return op_handlers[parent_cls](qt, args, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rocco/_ART/AIGENERATIVE/ComfyUI/.venv/lib/python3.12/site-packages/comfy_kitchen/tensor/fp8.py", line 149, in _handle_fp8_linear
    return torch.nn.functional.linear(*dequantize_args((input_tensor, weight, bias)))
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^...

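The crash boils down to a missing attribute on older PyTorch: torch.mps.compile_shader only exists in newer releases. The guard described in the later commits (probe for the attribute, fall back otherwise) can be sketched generically, using stand-in module objects rather than actual torch:

```python
from types import SimpleNamespace

def pick_backend(mps_module) -> str:
    # torch.mps.compile_shader is only present on newer PyTorch builds,
    # so probe for it instead of calling it unconditionally.
    return "native" if hasattr(mps_module, "compile_shader") else "fallback"

# Stand-ins for an old and a new torch.mps module (illustrative only):
old_mps = SimpleNamespace()
new_mps = SimpleNamespace(compile_shader=lambda source: None)

print(pick_backend(old_mps))  # → fallback
print(pick_backend(new_mps))  # → native
```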




- Fixes audiohacking/fp8-mps-metal#5


Copilot AI and others added 2 commits February 13, 2026 21:01
…sage

Co-authored-by: lmangani <1423657+lmangani@users.noreply.github.com>
Co-authored-by: lmangani <1423657+lmangani@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Fix AttributeError in torch.mps module" to "Add version check for torch.mps.compile_shader (PyTorch 2.10+)" Feb 13, 2026
Copilot AI requested a review from lmangani February 13, 2026 21:04
Copilot AI and others added 2 commits February 14, 2026 02:08
Co-authored-by: lmangani <1423657+lmangani@users.noreply.github.com>
Co-authored-by: lmangani <1423657+lmangani@users.noreply.github.com>
Copilot AI changed the title from "Add version check for torch.mps.compile_shader (PyTorch 2.10+)" to "Replace hard error with automatic backend fallback for PyTorch < 2.10" Feb 14, 2026
Copilot AI and others added 3 commits February 14, 2026 02:17
Co-authored-by: lmangani <1423657+lmangani@users.noreply.github.com>
Co-authored-by: lmangani <1423657+lmangani@users.noreply.github.com>
Co-authored-by: lmangani <1423657+lmangani@users.noreply.github.com>
Copilot AI changed the title from "Replace hard error with automatic backend fallback for PyTorch < 2.10" to "Auto-enable PYTORCH_ENABLE_MPS_FALLBACK for ComfyUI compatibility" Feb 14, 2026
