
[enhancement]: ability to bypass/disable a lora without having to unwire it from a node graph #3770

Open
1 task done
skunkworxdark opened this issue Jul 14, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

@skunkworxdark
Contributor

Is there an existing issue for this?

  • I have searched the existing issues

Contact Details

No response

What should this feature add?

Add a bypass option to the load lora node so it can be disabled for an invocation without having to remove it from the node graph. This would be a short-term fix for a pain point some users are having with the node editor, since wiring and unwiring a lora node is long-winded and fiddly. In the longer term, the better solution would be either a global bypass/disable option on all nodes, or some kind of sub node graph in which adding, removing, and reordering loras is much simpler.

Alternatives

No response

Additional Content

Here is a quick mockup I made with some code changes.

[Mockup screenshot: the Lora Loader node with a new Bypass toggle]

Code changes to the LoraLoaderInvocation class in model.py.

A new bypass: bool input parameter to enable the boolean toggle in the UI:

    bypass: bool = Field(
        default=False,
        description="Bypass the lora and pass the unet and clip through unaltered")

Conditionally add the lora to the unet and clip depending on the bypass bool:

        if self.unet is not None:
            if self.bypass:
                output.unet = self.unet
            else:
                output.unet = copy.deepcopy(self.unet)
                output.unet.loras.append(
                    LoraInfo(
                        base_model=base_model,
                        model_name=lora_name,
                        model_type=ModelType.Lora,
                        submodel=None,
                        weight=self.weight,
                    )
                )

        if self.clip is not None:
            if self.bypass:
                output.clip = self.clip
            else:
                output.clip = copy.deepcopy(self.clip)
                output.clip.loras.append(
                    LoraInfo(
                        base_model=base_model,
                        model_name=lora_name,
                        model_type=ModelType.Lora,
                        submodel=None,
                        weight=self.weight,
                    )
                )
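
For reference, here is the same bypass pattern as a minimal, self-contained sketch that can be run outside InvokeAI. The LoraInfo and ModelField classes below are hypothetical stand-ins for the real InvokeAI types, not the actual API:

    import copy
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class LoraInfo:
        # Hypothetical stand-in for InvokeAI's LoraInfo
        model_name: str
        weight: float

    @dataclass
    class ModelField:
        # Hypothetical stand-in for the unet/clip connection fields
        loras: List[LoraInfo] = field(default_factory=list)

    def apply_lora(model: Optional[ModelField], lora: LoraInfo,
                   bypass: bool) -> Optional[ModelField]:
        # When bypassed, pass the connection through unaltered, exactly as if
        # the node were not wired in; otherwise deep-copy first so the
        # upstream graph state is never mutated, then append the lora.
        if model is None:
            return None
        if bypass:
            return model
        patched = copy.deepcopy(model)
        patched.loras.append(lora)
        return patched

    # Usage: toggling bypass leaves the upstream field untouched.
    unet = ModelField()
    my_lora = LoraInfo(model_name="some_lora", weight=0.75)
    assert apply_lora(unet, my_lora, bypass=True) is unet
    assert apply_lora(unet, my_lora, bypass=False).loras == [my_lora]
    assert unet.loras == []  # the deepcopy protects the original

Factoring the branch into a single helper like this would also remove the duplication between the unet and clip blocks above.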
@skunkworxdark skunkworxdark added the enhancement New feature or request label Jul 14, 2023
@dias-bakhtiyarov

Would really appreciate having the bypass option added to all nodes that can sensibly be bypassed.
