Add a bypass option to the load lora node, so it can be disabled for an invocation without having to remove it from the node graph. This would be a short-term hack to relieve a pain point some users are having with the node editor, as wiring and unwiring a lora node is long-winded and a bit of a pain. In the longer term, either a global bypass/disable option on all nodes, or some kind of sub node graph where adding, removing, and reordering loras is much simpler, would be the better fix.
Alternatives
No response
Additional Content
Here is a quick mockup I made with some code changes.
Code changes to the `LoraLoaderInvocation` class in the `model.py` file:
New `bypass: bool` input parameter to enable the bool toggle on the UI:

```python
bypass: bool = Field(
    default=False,
    description="Bypass the lora and pass the unet and clip through unaltered")
```
Conditionally add the lora to the unet and clip depending on the bypass bool:
```python
if self.bypass:
    output.unet = self.unet
else:
    output.unet = copy.deepcopy(self.unet)
    output.unet.loras.append(
        LoraInfo(
            base_model=base_model,
            model_name=lora_name,
            model_type=ModelType.Lora,
            submodel=None,
            weight=self.weight,
        )
    )

if self.clip is not None:
    if self.bypass:
        output.clip = self.clip
    else:
        output.clip = copy.deepcopy(self.clip)
        output.clip.loras.append(
            LoraInfo(
                base_model=base_model,
                model_name=lora_name,
                model_type=ModelType.Lora,
                submodel=None,
                weight=self.weight,
            )
        )
```