Failed to import diffusers.models.autoencoders.autoencoder_kl #3477

@meokinh000

Description

I was attempting to run kohya_ss on Google Colab, and this error shows up when training a LoRA. Can someone help me? (kohya_ss runs normally on my local computer.)

16:02:53-198106 INFO     Regularization factor: 1                               
16:02:53-198892 INFO     Train batch size: 5                                    
16:02:53-199628 INFO     Gradient accumulation steps: 1                         
16:02:53-200408 INFO     Epoch: 10                                              
16:02:53-201225 INFO     max_train_steps (350 / 5 / 1 * 10 * 1) = 700           
16:02:53-202158 INFO     stop_text_encoder_training = 0                         
16:02:53-202920 INFO     lr_warmup_steps = 0.1                                  
16:02:53-203901 INFO     Effective Learning Rate Configuration (based on GUI    
                         settings):                                             
16:02:53-204832 INFO       - Main LR (for optimizer & fallback): 5.00e-04       
16:02:53-205764 INFO       - Text Encoder (Primary/CLIP) Effective LR: 5.00e-04 
                         (Specific Value)                                       
16:02:53-206786 INFO       - Text Encoder (T5XXL, if applicable) Effective LR:  
                         5.00e-04 (Inherited from Primary TE LR)                
16:02:53-207734 INFO       - U-Net Effective LR: 5.00e-04 (Specific Value)      
16:02:53-208577 INFO     Note: These LRs reflect the GUI's direct settings.     
                         Advanced options in sd-scripts (e.g., block LRs, LoRA+)
                         can further modify rates for specific layers.          
16:02:53-221248 INFO     Saving training config to /content/drive/MyDrive/New   
                         folder (4)/model/cat_20260112-160253.json...     
16:02:53-229436 INFO     Executing command: /usr/local/bin/accelerate launch    
                         --dynamo_backend no --dynamo_mode default              
                         --mixed_precision fp16 --num_processes 1 --num_machines
                         1 --num_cpu_threads_per_process 2                      
                         /content/kohya_ss/sd-scripts/sdxl_train_network.py     
                         --config_file /content/drive/MyDrive/New folder        
                         (4)/model/config_lora-20260112-160253.toml             
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/dist-packages/diffusers/utils/import_utils.py", line 920, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 999, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/usr/local/lib/python3.12/dist-packages/diffusers/loaders/peft.py", line 38, in <module>
    from .lora_base import _fetch_state_dict, _func_optionally_disable_offloading
  File "/usr/local/lib/python3.12/dist-packages/diffusers/loaders/lora_base.py", line 56, in <module>
    from peft.tuners.tuners_utils import BaseTunerLayer
  File "/usr/local/lib/python3.12/dist-packages/peft/__init__.py", line 17, in <module>
    from .auto import (
  File "/usr/local/lib/python3.12/dist-packages/peft/auto.py", line 32, in <module>
    from .peft_model import (
  File "/usr/local/lib/python3.12/dist-packages/peft/peft_model.py", line 42, in <module>
    from peft.tuners.lora.variants import get_alora_offsets_for_forward, get_alora_offsets_for_generate
  File "/usr/local/lib/python3.12/dist-packages/peft/tuners/__init__.py", line 15, in <module>
    from .adalora import AdaLoraConfig, AdaLoraModel
  File "/usr/local/lib/python3.12/dist-packages/peft/tuners/adalora/__init__.py", line 18, in <module>
    from .config import AdaLoraConfig
  File "/usr/local/lib/python3.12/dist-packages/peft/tuners/adalora/config.py", line 19, in <module>
    from peft.tuners.lora import LoraConfig
  File "/usr/local/lib/python3.12/dist-packages/peft/tuners/lora/__init__.py", line 23, in <module>
    from .model import LoraModel
  File "/usr/local/lib/python3.12/dist-packages/peft/tuners/lora/model.py", line 26, in <module>
    from transformers.modeling_layers import GradientCheckpointingLayer
ModuleNotFoundError: No module named 'transformers.modeling_layers'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/dist-packages/diffusers/utils/import_utils.py", line 920, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1310, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 999, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/usr/local/lib/python3.12/dist-packages/diffusers/models/autoencoders/__init__.py", line 1, in <module>
    from .autoencoder_asym_kl import AsymmetricAutoencoderKL
  File "/usr/local/lib/python3.12/dist-packages/diffusers/models/autoencoders/autoencoder_asym_kl.py", line 23, in <module>
    from .vae import DecoderOutput, DiagonalGaussianDistribution, Encoder, MaskConditionDecoder
  File "/usr/local/lib/python3.12/dist-packages/diffusers/models/autoencoders/vae.py", line 25, in <module>
    from ..unets.unet_2d_blocks import (
  File "/usr/local/lib/python3.12/dist-packages/diffusers/models/unets/__init__.py", line 6, in <module>
    from .unet_2d import UNet2DModel
  File "/usr/local/lib/python3.12/dist-packages/diffusers/models/unets/unet_2d.py", line 24, in <module>
    from .unet_2d_blocks import UNetMidBlock2D, get_down_block, get_up_block
  File "/usr/local/lib/python3.12/dist-packages/diffusers/models/unets/unet_2d_blocks.py", line 36, in <module>
    from ..transformers.dual_transformer_2d import DualTransformer2DModel
  File "/usr/local/lib/python3.12/dist-packages/diffusers/models/transformers/__init__.py", line 6, in <module>
    from .cogvideox_transformer_3d import CogVideoXTransformer3DModel
  File "/usr/local/lib/python3.12/dist-packages/diffusers/models/transformers/cogvideox_transformer_3d.py", line 22, in <module>
    from ...loaders import PeftAdapterMixin
  File "<frozen importlib._bootstrap>", line 1412, in _handle_fromlist
  File "/usr/local/lib/python3.12/dist-packages/diffusers/utils/import_utils.py", line 910, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/diffusers/utils/import_utils.py", line 922, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import diffusers.loaders.peft because of the following error (look up to see its traceback):
No module named 'transformers.modeling_layers'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/content/kohya_ss/sd-scripts/sdxl_train_network.py", line 10, in <module>
    from library import sdxl_model_util, sdxl_train_util, strategy_base, strategy_sd, strategy_sdxl, train_util
  File "/content/kohya_ss/sd-scripts/library/sdxl_model_util.py", line 8, in <module>
    from diffusers import AutoencoderKL, EulerDiscreteScheduler, UNet2DConditionModel
  File "<frozen importlib._bootstrap>", line 1412, in _handle_fromlist
  File "/usr/local/lib/python3.12/dist-packages/diffusers/utils/import_utils.py", line 911, in __getattr__
    value = getattr(module, name)
            ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/diffusers/utils/import_utils.py", line 910, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/diffusers/utils/import_utils.py", line 922, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import diffusers.models.autoencoders.autoencoder_kl because of the following error (look up to see its traceback):
Failed to import diffusers.loaders.peft because of the following error (look up to see its traceback):
No module named 'transformers.modeling_layers'
Traceback (most recent call last):
  File "/usr/local/bin/accelerate", line 10, in <module>
    sys.exit(main())
             ^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/accelerate/commands/accelerate_cli.py", line 50, in main
    args.func(args)
  File "/usr/local/lib/python3.12/dist-packages/accelerate/commands/launch.py", line 1281, in launch_command
    simple_launcher(args)
  File "/usr/local/lib/python3.12/dist-packages/accelerate/commands/launch.py", line 869, in simple_launcher
    raise subprocess.CalledProcessError(returncode=process.returncode, cmd=cmd)
subprocess.CalledProcessError: Command '['/usr/bin/python3', '/content/kohya_ss/sd-scripts/sdxl_train_network.py', '--config_file', '/content/drive/MyDrive/New folder (4)/model/config_lora-20260112-160253.toml']' returned non-zero exit status 1.
16:03:13-800751 INFO     Training has ended.                                    
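The innermost failure is peft trying to import `GradientCheckpointingLayer` from `transformers.modeling_layers`, a module that only exists in newer transformers releases, so the environment most likely has a recent peft paired with an older transformers. A minimal, stdlib-only check you could run in the same Colab cell (the helper name here is mine, not from kohya_ss):

```python
import importlib.util


def has_modeling_layers() -> bool:
    """Return True if transformers.modeling_layers is importable.

    peft's LoRA code imports GradientCheckpointingLayer from this module;
    when it is missing, every import chain that pulls in peft (including
    diffusers.loaders.peft, as in the traceback above) fails with
    ModuleNotFoundError.
    """
    try:
        # find_spec locates the module without executing peft's imports
        return importlib.util.find_spec("transformers.modeling_layers") is not None
    except ModuleNotFoundError:
        # transformers itself is not installed at all
        return False


print(has_modeling_layers())
```

If it prints `False`, upgrading transformers (`pip install -U transformers`) or pinning peft back to a release that matches the installed transformers should clear the error; which pair of versions kohya_ss expects is best taken from its own requirements file.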
                                    
