In the train method of train_graphcast.py, there are the following two lines of code:
param_groups = timm.optim.optim_factory.add_weight_decay(model, args.weight_decay)
optimizer = torch.optim.AdamW(param_groups, lr=args.lr, betas=(0.9, 0.95))
I looked through the timm documentation and the source code of timm.optim.optim_factory, but could not find an add_weight_decay method. In the code, the args.weight_decay argument is the float 0.05. Is this equivalent to the code below? I am not sure whether passing model in the first line has any other effect.
optimizer = torch.optim.AdamW(model.parameters(), lr=args.lr, weight_decay=args.weight_decay, betas=(0.9, 0.95))
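For context, the helper in question splits the model's parameters into groups rather than applying one global weight decay, so the two snippets are not strictly equivalent. Below is a sketch of that behavior, assuming the usual convention that biases and 1-D parameters (e.g. normalization weights) are excluded from weight decay; `add_weight_decay_sketch` is an illustrative name, not timm's exact implementation (recent timm versions expose similar logic under a different name in `timm.optim`):

```python
import torch
import torch.nn as nn

def add_weight_decay_sketch(model, weight_decay):
    # Sketch of the param-group split: biases and 1-D params
    # (e.g. LayerNorm weights) get weight_decay=0, the rest get
    # the configured weight_decay.
    decay, no_decay = [], []
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        if param.ndim <= 1 or name.endswith(".bias"):
            no_decay.append(param)
        else:
            decay.append(param)
    return [
        {"params": no_decay, "weight_decay": 0.0},
        {"params": decay, "weight_decay": weight_decay},
    ]

# Toy model: only the Linear weight (2-D) ends up in the decayed group.
model = nn.Sequential(nn.Linear(4, 4), nn.LayerNorm(4))
groups = add_weight_decay_sketch(model, 0.05)
optimizer = torch.optim.AdamW(groups, lr=1e-3, betas=(0.9, 0.95))
```

Under this reading, passing `model` is what lets the helper inspect each parameter's name and shape, so simply setting `weight_decay=args.weight_decay` on AdamW would also decay biases and norm parameters.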