
Question about the size of the Adapter_model.safetensors saved after fine-tuning #19

Open
Maribel-Hearn opened this issue Jan 4, 2025 · 0 comments


Hello, I tried running fine-tuning with the LLaMA2-7B model. After one epoch, I found that the files saved in the checkpoint directory are very large: Adapter_model.safetensors is about 2 GB, and optimizer.pt is about 1 GB. Is this normal? If not, how can I fix it?

My environment:
transformers==4.46.0 peft==0.7.0 torch==2.1.2
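For comparison, here is a rough back-of-envelope estimate of what a LoRA adapter for LLaMA2-7B should weigh on disk. The rank, target modules, and dtype below are assumptions for illustration, not values taken from this issue:

```python
# Sketch: estimate the expected on-disk size of a LoRA adapter, to compare
# against the ~2 GB file observed. Assumes square projection layers
# (in_features == out_features == hidden_size), which holds for the
# q_proj/v_proj attention projections in LLaMA2-7B.

def lora_adapter_bytes(hidden_size, num_layers, rank, n_target_modules,
                       bytes_per_param=2):
    # Each targeted linear layer gets two low-rank matrices:
    # A (rank x in_features) and B (out_features x rank).
    params_per_module = 2 * hidden_size * rank
    total_params = params_per_module * n_target_modules * num_layers
    return total_params * bytes_per_param

# LLaMA2-7B: hidden_size=4096, 32 layers.
# Assumed LoRA config: r=8 on q_proj and v_proj, weights stored in fp16.
size = lora_adapter_bytes(hidden_size=4096, num_layers=32,
                          rank=8, n_target_modules=2)
print(f"~{size / 1e6:.0f} MB")  # on the order of 8 MB, far below 2 GB
```

Even with a much larger rank or more target modules, this lands in the tens or low hundreds of megabytes, so a 2 GB adapter file suggests that more than the LoRA matrices is being written to the checkpoint (for example, extra full-weight modules saved alongside the adapter).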
