TypeError: empty() missing 1 required positional arguments: "size" #36061

Islander-0v0-wxin opened this issue Feb 6, 2025 · 1 comment

@Islander-0v0-wxin

System Info

transformers 4.48.2, PyTorch 2.6, CUDA 12.4, Python 3.10

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_path = "meta-llama/Meta-Llama-3.2-11B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_path)

model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,
    device_map="auto",
)
```

Expected behavior

It is supposed to patch and load the model. Instead, loading fails with the traceback below:


```
TypeError                                 Traceback (most recent call last)
/tmp/ipykernel_4741/2989894542.py in <module>
      4 tokenizer = AutoTokenizer.from_pretrained(model_path)
      5
----> 6 model = AutoModelForCausalLM.from_pretrained(
      7     model_path,
      8     torch_dtype=torch.float16,

~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    562         elif type(config) in cls._model_mapping.keys():
    563             model_class = _get_model_class(config, cls._model_mapping)
--> 564             return model_class.from_pretrained(
    565                 pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
    566             )

~/.local/lib/python3.10/site-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, config, cache_dir, ignore_mismatched_sizes, force_download, local_files_only, token, revision, use_safetensors, weights_only, *model_args, **kwargs)
   4243             offload_index,
   4244             error_msgs,
-> 4245         ) = cls._load_pretrained_model(
   4246             model,
   4247             state_dict,

~/.local/lib/python3.10/site-packages/transformers/modeling_utils.py in _load_pretrained_model(cls, model, state_dict, loaded_keys, resolved_archive_file, pretrained_model_name_or_path, ignore_mismatched_sizes, sharded_metadata, _fast_init, low_cpu_mem_usage, device_map, offload_folder, offload_state_dict, dtype, hf_quantizer, keep_in_fp32_modules, gguf_path, weights_only)
   4575
   4576                 if param.device == torch.device("meta"):
-> 4577                     value = torch.empty(*param.size(), dtype=target_dtype)
   4578                     if (
   4579                         not is_quantized

TypeError: empty() missing 1 required positional arguments: "size"
```
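The failure at `torch.empty(*param.size(), dtype=target_dtype)` is consistent with `param.size()` being empty, i.e. a zero-dimensional (scalar) parameter, so the `*` unpacking passes no positional arguments at all. A minimal pure-Python sketch of that mechanism, with no torch required (`fake_empty` is a hypothetical stand-in for `torch.empty`, which requires a size):

```python
def fake_empty(size, dtype=None):
    """Hypothetical stand-in for torch.empty: `size` is required."""
    return ("tensor", size, dtype)

# What param.size() returns for a 0-dim (scalar) parameter: an empty tuple.
scalar_param_size = ()

# Passing the size object directly works fine:
print(fake_empty(scalar_param_size, dtype="float16"))

# Star-unpacking an empty size supplies zero positional arguments and raises:
try:
    fake_empty(*scalar_param_size, dtype="float16")
except TypeError as exc:
    print(exc)  # missing 1 required positional argument: 'size'
```

This would suggest the checkpoint being loaded contains a scalar parameter that hits the meta-device branch, which the repro environment may trigger but the maintainer's environment does not.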

@Rocketknight1
Member

Hi @Islander-0v0-wxin, that code runs for me! I'm not sure exactly what's going wrong for you - can you try running on different environments and narrowing down the cause?
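To help narrow down environment differences, a small report of the installed versions can be attached to each run. A sketch using only the standard library (`env_report` is a hypothetical helper, not part of transformers):

```python
import platform
from importlib import metadata

def env_report(packages=("transformers", "torch")):
    """Collect Python and package versions for a bug report."""
    info = {"python": platform.python_version()}
    for pkg in packages:
        try:
            info[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            info[pkg] = "not installed"
    return info

print(env_report())
```

Running this in each environment (the one that fails and any that succeed) and comparing the output is a quick way to spot the version mismatch responsible.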
