AttributeError: 'NoneType' object has no attribute 'attn_bias' when finetuning Llama 3.1 8B in Lightning.ai Studio
Issue:
I'm consistently hitting the following AttributeError when attempting to fine-tune the "unsloth/Meta-Llama-3.1-8B" model using Unsloth in Lightning.ai Studio.
The error occurs during the training loop, specifically inside Unsloth's internal _unsloth_pre_compute_loss function, and appears to be related to the model's attention mechanism.
Installation Attempts:
I've tried several Unsloth installation methods, including:
```shell
!pip install unsloth

# Also get the latest nightly Unsloth!
!pip uninstall unsloth -y && pip install --upgrade --no-cache-dir --no-deps git+https://github.com/unslothai/unsloth.git

# Also get latest transformers!
!pip install --upgrade --no-cache-dir transformers
```
I even tried installing the latest nightly Unsloth build (the one with Phi-4 support), but the issue persists:
```shell
!pip install unsloth

# Also get the latest nightly Unsloth!
!pip install --force-reinstall --no-cache-dir --no-deps git+https://github.com/unslothai/unsloth.git
```
I've also tried the following additional steps to address the issue:
- Printing the model configuration: I printed the model's configuration with `print(model.config)` to inspect its attributes and check whether `attn_bias` (or a similar attribute such as `attention_bias`) exists.
- Directly setting `attn_bias`: After instantiation, I directly set `model.config.attn_bias = True`, but the same error is still raised.
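For reference, the attribute-probing step above can be sketched generically. The `config` object here is a stand-in built with `SimpleNamespace` (so the sketch runs without loading the model); that `attention_bias` is the real field name on recent `LlamaConfig` versions is my assumption:

```python
from types import SimpleNamespace

def probe_attrs(obj, names):
    """Report which of the candidate attribute names an object actually exposes."""
    return {name: getattr(obj, name, "<missing>") for name in names}

# Stand-in for model.config; a real LlamaConfig exposes `attention_bias`
# (assumption), while `attn_bias` does not appear to be a standard field.
config = SimpleNamespace(attention_bias=False)
print(probe_attrs(config, ["attn_bias", "attention_bias"]))
# → {'attn_bias': '<missing>', 'attention_bias': False}
```

Running the same probe against the actual `model.config` would show whether the field Unsloth's code path expects is present at all.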
```
AttributeError                            Traceback (most recent call last)
File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/unsloth/trainer.py", line 45, in unsloth_train
    return trainer.train(*args, **kwargs)
File <string>:157, in train(self, resume_from_checkpoint, trial, ignore_keys_for_eval, **kwargs)
File <string>:382, in _fast_inner_training_loop(self, batch_size, args, resume_from_checkpoint, trial, ignore_keys_for_eval)
File <string>:34, in _unsloth_training_step(self, model, inputs, num_items_in_batch)
File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/unsloth/models/_utils.py", line 1069, in _unsloth_pre_compute_loss
   1064     logger.warning_once(
   1065         f"Unsloth: Not an error, but {name} does not accept num_items_in_batch.\n"
...
    983     output_hidden_states = (
    984         output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states
    985     )

AttributeError: 'NoneType' object has no attribute 'attn_bias'
```