Commit 18fbb74

Arsh Zahed authored and committed
Remove the default LoRA rank warning
1 parent fa58e5b commit 18fbb74

File tree

1 file changed: +1 −4 lines changed

src/together/cli/api/finetune.py

Lines changed: 1 addition & 4 deletions
@@ -197,10 +197,7 @@ def create(
             "batch_size": model_limits.lora_training.max_batch_size,
             "learning_rate": 1e-3,
         }
-        log_warn_once(
-            f"The default LoRA rank for {model} has been changed to {default_values['lora_r']} as the max available.\n"
-            f"Also, the default learning rate for LoRA fine-tuning has been changed to {default_values['learning_rate']}."
-        )
+
         for arg in default_values:
             arg_source = ctx.get_parameter_source(arg)  # type: ignore[attr-defined]
             if arg_source == ParameterSource.DEFAULT:
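The loop that survives the commit uses click's `Context.get_parameter_source` to apply computed defaults only when the user did not pass an option explicitly on the command line. A minimal, self-contained sketch of that pattern (the option names and default values here are hypothetical stand-ins, not the actual together CLI options):

```python
import click
from click.core import ParameterSource


@click.command()
@click.option("--lora-r", type=int, default=None)
@click.option("--learning-rate", type=float, default=None)
@click.pass_context
def create(ctx: click.Context, lora_r: int, learning_rate: float) -> None:
    # Hypothetical model-derived defaults, mirroring the diff's default_values dict.
    default_values = {"lora_r": 8, "learning_rate": 1e-3}
    args = {"lora_r": lora_r, "learning_rate": learning_rate}

    for arg in default_values:
        arg_source = ctx.get_parameter_source(arg)
        if arg_source == ParameterSource.DEFAULT:
            # The user did not set this flag; fall back to the computed default
            # silently (the removed log_warn_once used to announce this).
            args[arg] = default_values[arg]

    click.echo(args)


if __name__ == "__main__":
    create()
```

Because the overrides are applied silently after this commit, a user running the command without `--lora-r` gets the model-specific default with no warning, while an explicit `--lora-r 16` is left untouched by the loop.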

Comments (0)