feat(finetune): add warnings for descriptor config changes during fine-tuning with proper default handling #4925
base: devel
Conversation
Linked issue: nlayers leads to no error report while fine-tuning.
Codecov Report (❌ patch coverage flagged)
Additional details and impacted files:

@@            Coverage Diff             @@
##             devel    #4925     +/-   ##
==========================================
+ Coverage    84.29%   85.20%    +0.90%
==========================================
  Files          703      705        +2
  Lines        68728    75926     +7198
  Branches      3573     3573
==========================================
+ Hits         57935    64693     +6758
- Misses        9653    10094      +441
+ Partials      1140     1139        -1
Pull Request Overview
This PR adds a warning system to notify users when descriptor configuration parameters in input.json differ from those in the pretrained model during fine-tuning with the --use-pretrain-script flag. Currently, DeePMD-kit silently overwrites user configurations, which can be confusing when users attempt to modify parameters during fine-tuning.
- Implements comprehensive descriptor config comparison with detailed before/after value warnings
- Preserves existing fine-tuning behavior while adding transparency about configuration overwrites
- Handles nested configuration structures and treats all parameters equally
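The intent can be illustrated with a minimal sketch, assuming plain dict descriptor configs. The helper name mirrors the one this PR eventually adds, but the signature, message text, and logging setup here are illustrative rather than the actual implementation:

```python
# Illustrative sketch only: compare two descriptor config dicts and log one
# warning listing every key that will be overwritten by the pretrained values.
import logging

log = logging.getLogger(__name__)


def warn_descriptor_config_differences(
    input_descriptor: dict, pretrained_descriptor: dict
) -> None:
    differences = []
    for key in sorted(set(input_descriptor) | set(pretrained_descriptor)):
        input_value = input_descriptor.get(key, "<unset>")
        pretrained_value = pretrained_descriptor.get(key, "<unset>")
        if input_value != pretrained_value:
            differences.append(f"  {key}: {input_value} -> {pretrained_value}")
    if differences:
        log.warning(
            "Descriptor settings in input.json differ from the pretrained model "
            "and will be overwritten (--use-pretrain-script):\n%s",
            "\n".join(differences),
        )
```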
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| deepmd/pt/utils/finetune.py | Adds warning function and integrates it into PyTorch fine-tuning workflow |
| deepmd/pd/utils/finetune.py | Adds identical warning function and integrates it into Paddle fine-tuning workflow |
Comments suppressed due to low confidence (1)
deepmd/pt/utils/finetune.py:1
- The difference detection logic is duplicated across both PyTorch and Paddle backends. Consider extracting this to a shared utility module to avoid code duplication and ensure consistent behavior across backends.
deepmd/pt/utils/finetune.py (outdated diff):

    differences.append(
        f" {key}: {input_descriptor[key]} -> {pretrained_descriptor[key]}"
    )
Copilot AI (Aug 29, 2025):
For complex nested dictionaries or large configuration objects, string representation in f-strings could be expensive and potentially produce very long log messages. Consider truncating or using a more efficient representation for complex objects.
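One hedged way to follow this suggestion is to cap how much of each value ends up in the log message, for example with the standard-library reprlib module; the limits below are arbitrary, not values from the PR:

```python
import reprlib

# Bound the printed representation of values before interpolating them into warnings.
_short = reprlib.Repr()
_short.maxstring = 80  # cap for long strings
_short.maxdict = 8     # cap on number of dict items shown
_short.maxlevel = 2    # cap on nesting depth


def short_repr(value) -> str:
    """Return a bounded, single-line representation of an arbitrary value."""
    return _short.repr(value)


# e.g. differences.append(
#     f"  {key}: {short_repr(input_descriptor[key])} -> "
#     f"{short_repr(pretrained_descriptor[key])}"
# )
```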
It works for --use-pretrain-script. The issue pointed out that when the configurations are not the same and --use-pretrain-script is not given, no warning is reported.
@copilot |
Added warnings for descriptor configuration mismatches when NOT using --use-pretrain-script. Implemented the additional check in the fine-tuning workflow.
Another issue: model params in …
@copilot |
Fixed the issue with unnecessary warnings for default parameter values. The problem was that the two configurations were compared before default values had been filled in on both sides, so keys that merely picked up a default showed up as differences. The solution normalizes both configurations before comparison.
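A hedged sketch of the normalize-before-comparing idea follows; `apply_defaults` is a hypothetical stand-in for whatever routine DeePMD-kit actually uses to fill default argument values, so only the shape of the fix is shown:

```python
# Sketch only: fill the same defaults into both configs before diffing, so keys
# that merely pick up a default value no longer show up as differences.
def apply_defaults(descriptor: dict, defaults: dict) -> dict:
    # Hypothetical stand-in for DeePMD-kit's argument normalization.
    return {**defaults, **descriptor}


def diff_normalized(
    input_descriptor: dict, pretrained_descriptor: dict, defaults: dict
) -> list:
    a = apply_defaults(input_descriptor, defaults)
    b = apply_defaults(pretrained_descriptor, defaults)
    keys = sorted(set(a) | set(b))
    return [
        f"  {k}: {a.get(k)!r} -> {b.get(k)!r}" for k in keys if a.get(k) != b.get(k)
    ]
```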
@copilot can you wrap up and simplify all the modifications in this PR? It seems that there are lots of duplicated lines. |
Consolidated and simplified all modifications by eliminating code duplication across backends. The refactoring moved the duplicate warning functions to shared utilities in deepmd.utils.finetune.
Current Status
The PR now includes comprehensive fine-tuning warnings with significant code deduplication:
Consolidated Warning System
Shared module: deepmd.utils.finetune

Warning Functions
- warn_descriptor_config_differences() - for --use-pretrain-script scenarios, where the input config is overwritten
- warn_configuration_mismatch_during_finetune() - for scenarios without --use-pretrain-script, where only compatible state dict parameters are loaded (see the usage sketch at the end of this section)

Key Features
Benefits of Refactoring
This change provides users with clear visibility into configuration changes during fine-tuning while maintaining a clean, maintainable codebase.
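For illustration, a hedged sketch of how a backend-specific finetune module might call the two shared helpers. The function names come from the PR description above, but the signatures, the config layout, and the surrounding wrapper are assumptions, not the PR's actual code:

```python
from deepmd.utils.finetune import (  # shared module introduced by this PR
    warn_configuration_mismatch_during_finetune,
    warn_descriptor_config_differences,
)


def prepare_finetune_config(
    input_config: dict, pretrained_config: dict, use_pretrain_script: bool
) -> dict:
    # Hypothetical wrapper showing where each warning would fire.
    input_descriptor = input_config["model"].get("descriptor", {})
    pretrained_descriptor = pretrained_config["model"].get("descriptor", {})
    if use_pretrain_script:
        # The pretrained script overwrites the user's descriptor settings.
        warn_descriptor_config_differences(input_descriptor, pretrained_descriptor)
        input_config["model"]["descriptor"] = pretrained_descriptor
    else:
        # Only compatible state-dict parameters are loaded; flag mismatches.
        warn_configuration_mismatch_during_finetune(
            input_descriptor, pretrained_descriptor
        )
    return input_config
```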