FR: getParamsFromMethods - Could this be annotations? #47
I'm not sure annotations make sense, as sometimes the min/max values will depend on the data. The RBF kernel is actually a good example of that if you look at the code (there is a guessSigma method that returns a distribution to search over for the value of sigma). I do like the idea of a tuning priority, though; I think it will take some thought on how that should be integrated. Maybe just an extra parameter when auto-populating tunable values?
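For context, here is a minimal self-contained sketch of the pattern being described. The class and method names are invented for illustration, and this is not JSAT's actual guessSigma code, but it shows why the search range has to come from the data rather than from a fixed annotation constant:

```java
import java.util.Arrays;
import java.util.Random;

/**
 * Sketch (not JSAT's implementation): a plausible range for the RBF
 * kernel's sigma depends on the scale of the data itself, so it can't
 * be a compile-time annotation value.
 */
public class GuessSigmaSketch
{
    /** Returns {low, high} bounds for sigma, derived from the data. */
    public static double[] guessSigmaRange(double[][] data)
    {
        Random rand = new Random(42);
        int pairs = Math.min(500, data.length * (data.length - 1) / 2);
        double[] dists = new double[pairs];
        for (int k = 0; k < pairs; k++)
        {
            int i = rand.nextInt(data.length);
            // pick a second index j != i
            int j = (i + 1 + rand.nextInt(data.length - 1)) % data.length;
            double sum = 0;
            for (int d = 0; d < data[i].length; d++)
            {
                double diff = data[i][d] - data[j][d];
                sum += diff * diff;
            }
            dists[k] = Math.sqrt(sum);
        }
        Arrays.sort(dists);
        double median = dists[pairs / 2];
        // search roughly an order of magnitude around the median distance
        return new double[] { median / 10, median * 10 };
    }
}
```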
I was over-eager with the min/max, but doesn't RandomSearch need some sort of constraints?

My high-level FR 0: the code in

Add-on 1: If a class is declaring parameters as tunable, the docs often say "you should bother adjusting A and B; don't worry about C unless you have a very odd case," which is the motivator for the priority ranking in the annotation. If it already has the params declared with annotations, it feels like tagging a (default MEDIUM) priority as LOW or HIGH would be easy.

Add-on 2: Sane boundaries also communicate knowledge AND could enhance the effectiveness of the RandomSearch. Should the abc value be from 0 to 1? 1 to 1000? 0 to 1 but really almost always 0 to 0.001? heckifIknow. You have to be doing something similar already; I was tracing getParam → getGuess → guessMethod → invoke but got a bit lost. Maybe one really can't guess without looking at the data?
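A minimal sketch of the kind of annotation this request seems to envision; the name, the priority enum, and the bound fields are all invented for illustration here and are not an existing JSAT API:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/** Hypothetical annotation carrying the tuning hints described above. */
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface TunableParam
{
    enum Priority { LOW, MEDIUM, HIGH }

    /** How much this parameter usually matters, per the algorithm's docs. */
    Priority priority() default Priority.MEDIUM;

    /** Static bounds, for cases where they are knowable without the data. */
    double min() default Double.NEGATIVE_INFINITY;
    double max() default Double.POSITIVE_INFINITY;
}
```

A setter could then be tagged with something like `@TunableParam(priority = TunableParam.Priority.HIGH, min = 0, max = 1)`.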
RandomSearch needs a distribution, which may or may not just be uniform. The current framework always returns a distribution object by default, and GridSearch then uses quantiles from that distribution.
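To make the distribution-vs-bounds distinction concrete, here is a hedged sketch (plain Java, not JSAT's actual GridSearch code) of how grid points can be drawn from a distribution's quantile function rather than from fixed min/max bounds; a plain DoubleUnaryOperator stands in for a real Distribution type:

```java
import java.util.Arrays;
import java.util.function.DoubleUnaryOperator;

/**
 * Sketch: the same distribution object serves both RandomSearch
 * (sample from it) and GridSearch (take evenly spaced quantiles
 * via the inverse CDF).
 */
public class QuantileGridSketch
{
    public static double[] gridFromQuantiles(DoubleUnaryOperator invCdf, int points)
    {
        double[] grid = new double[points];
        for (int i = 0; i < points; i++)
        {
            // evenly spaced quantiles in (0, 1), avoiding the endpoints
            double q = (i + 1.0) / (points + 1.0);
            grid[i] = invCdf.applyAsDouble(q);
        }
        return grid;
    }

    public static void main(String[] args)
    {
        // example: Uniform(0, 10) has inverse CDF q -> 10 * q
        System.out.println(Arrays.toString(gridFromQuantiles(q -> 10 * q, 5)));
    }
}
```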
Any thoughts on how to make it easier to search? Some of it is compositional, though: any algorithm that uses the kernel trick will have different parameters depending on the kernel given.
See above, RandomSearch doesn't need boundaries - it needs a distribution. And the values can change depending on the data. I try to put what the value range is in the documentation. I think if you are going to set them yourself, you should be reading up on the algorithm. Otherwise just trust the auto-fill defaults.
Depends on the algorithm :-/
That was actually an implementation detail because earlier versions didn't allow 0. It has nothing to do with what parameters you should try. In fact, for RF you should never make that value larger.
Understood. Hmmm... maybe it is as easy as two functions, or examples of how to emulate two functions, the facetiously named
Where #1 makes it easy to search the code, and #2 is the current "you don't know until you get there" reality.
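The original function names are cut off above, so here is a hedged sketch with invented names to show the split being proposed, #1 static and #2 data-dependent:

```java
import java.util.List;

/** Hypothetical interface; method names are stand-ins for the elided originals. */
public interface TwoFunctionSketch
{
    /** #1: discoverable without any data: which parameters exist at all. */
    List<String> whatCanGetTuned();

    /** #2: data-dependent: which parameters (and ranges) are worth tuning here. */
    List<String> whatShouldGetTuned(double[][] data);
}
```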
Hmm. Are you more interested in finding the tunable parameters themselves, or just the algorithms that have some? It could be just an annotation with no code meaning: "@Tunable". It would just let you know that the object has parameters to tune.
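Such a marker could be as small as this sketch (runtime retention is assumed here so it could also be read reflectively, per the use case discussed below):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/** No fields and no code meaning: just flags a class as having tunable parameters. */
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface Tunable
{
}
```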
That would be excellent. It is what I kinda assumed "implements Parameterized" indicated, but now I'm thinking I was reading too much into it.
Do you have a use case where you want/need these annotations at runtime, or is it purely to make it easier to search through the docs?
I wanted them at runtime to see if it was worth passing the algo through a RandomSearch to try to improve it. |
This seems perfect for annotations:
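A minimal sketch of what that runtime check might look like, assuming the hypothetical @Tunable marker from earlier in the thread (this is not an existing JSAT API):

```java
/**
 * Sketch of the runtime use case: check the marker before spending
 * time on a parameter search.
 */
public class TuneIfTunable
{
    public static boolean worthSearching(Object algorithm)
    {
        // relies on @Tunable having RetentionPolicy.RUNTIME
        return algorithm.getClass().isAnnotationPresent(Tunable.class);
    }
    // usage: if (worthSearching(model)) run a RandomSearch, else just fit as-is
}
```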