Hello, author. When I tried running fine-tuning with the LLaMA2-7B model, I found after completing one epoch that the Adapter_model.safetensors and optimizer.pt files saved in the checkpoint directory are both very large: Adapter_model.safetensors is around 2 GB and optimizer.pt is around 1 GB. Is this normal? If not, how can I fix it?

My environment:
transformers==4.46.0 peft==0.7.0 torch==2.1.2
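For context, a back-of-the-envelope sketch of how large a LoRA adapter for LLaMA2-7B should be under a typical configuration. The rank, target modules, and dtype below are assumptions for illustration, not values taken from the issue:

```python
# Rough estimate of the expected LoRA adapter size for LLaMA2-7B.
# Assumptions (hypothetical; your LoraConfig may differ): rank r=8,
# LoRA applied to q_proj and v_proj in all 32 decoder layers,
# hidden size 4096, weights stored in fp16 (2 bytes per param).

hidden_size = 4096
num_layers = 32
rank = 8
target_modules = 2          # q_proj and v_proj
bytes_per_param = 2         # fp16

# Each adapted module adds two low-rank matrices:
# lora_A (rank x hidden) and lora_B (hidden x rank).
params_per_module = 2 * rank * hidden_size
total_params = params_per_module * target_modules * num_layers
size_mb = total_params * bytes_per_param / 1024**2

print(f"{total_params:,} params, ~{size_mb:.0f} MB")  # 4,194,304 params, ~8 MB
```

Under these assumptions the adapter should be on the order of megabytes, not gigabytes, so a ~2 GB adapter_model.safetensors suggests that full base-model weights (or large `modules_to_save` entries such as embeddings or lm_head) are being written into the checkpoint rather than only the LoRA matrices.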