Allow specific m and u probabilities to be fixed during training #2379
Conversation
To do:

It's reassuring to know this is a common ask. Previously, I have resorted to manually setting some params on the …
```diff
@@ -169,7 +169,7 @@ def populate_m_u_from_lookup(
 ) -> None:
     cl = comparison_level

-    if "m" not in training_fixed_probabilities:
+    if not cl._fix_m_probability and "m" not in training_fixed_probabilities:
```
This is needed because otherwise `cl.m_probability` on the training linker is still assigned, so the value fluctuates during EM training even though it is never copied back to the m probability on the main linker.
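The guard above can be sketched in isolation. This is a minimal illustration with hypothetical names (`ComparisonLevel`, `em_update_m` are invented for this sketch, not Splink's real internals): a boolean "fixed" flag short-circuits the EM assignment, so a fixed m probability is never overwritten by a fresh estimate.

```python
from dataclasses import dataclass


@dataclass
class ComparisonLevel:
    """Toy stand-in for a comparison level (hypothetical, not Splink's class)."""
    m_probability: float
    fix_m_probability: bool = False  # when True, EM must not update m


def em_update_m(cl: ComparisonLevel, new_estimate: float) -> None:
    """Assign the freshly estimated m probability unless the level is fixed."""
    if not cl.fix_m_probability:
        cl.m_probability = new_estimate


fixed = ComparisonLevel(m_probability=0.9, fix_m_probability=True)
free = ComparisonLevel(m_probability=0.9)

em_update_m(fixed, 0.42)  # ignored: the level is fixed
em_update_m(free, 0.42)   # applied

print(fixed.m_probability)  # 0.9
print(free.m_probability)   # 0.42
```

Checking the flag at the point of assignment (rather than resetting the value afterwards) is what prevents the mid-training fluctuation described in the comment above.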
Great!
This PR allows the user to fix m and u probabilities when the model is created, so that they are not changed when training is run.
This is a fairly common requirement (e.g. here, here and here), because in some cases the user has prior knowledge of specific m and u values and wishes to fix them during training.
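For illustration only, fixing a known m/u value might be declared alongside the prior value in a comparison level's settings. The key names below (`fix_m_probability`, `fix_u_probability`) are assumptions inferred from the diff's `cl._fix_m_probability`; the merged API may differ, and this snippet deliberately avoids importing Splink itself.

```python
# Hypothetical comparison-level settings fragment: a prior m/u value plus a
# flag asking EM training to leave it untouched. Key names are illustrative.
exact_match_level = {
    "sql_condition": "first_name_l = first_name_r",
    "label_for_charts": "Exact match",
    # Prior knowledge of the match probabilities for this level:
    "m_probability": 0.95,
    "u_probability": 0.01,
    # Hypothetical flags telling EM training not to update these values:
    "fix_m_probability": True,
    "fix_u_probability": True,
}

print(sorted(k for k in exact_match_level if k.startswith("fix")))
```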
Suggested API:
Closes #2068