add support for continuous parameter ranges #40
Hi @SimonBlanke, I was also looking for continuous parameter ranges. Perhaps you could look at the Gymnasium Spaces code for ideas on handling discrete/continuous parameters (it's one of the standard Reinforcement Learning toolkits).
Hello @logan-dunbar, sorry for this very late answer. I read your comment and looked into the link you provided, but answering you somehow fell off my radar.

Using a tuple for the continuous dimension could look like this:

```python
search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": (-1, 1),
}
```

The dictionary-based alternative could look like this:

```python
search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": {
        "low": -1,
        "high": 1,
        "additional parameter 1": ...,
        "additional parameter 2": ...,
    },
}
```

In this case the naming improves the readability, because putting multiple values into a tuple gets confusing at some point.
> How would this look?

I guess you mean the way the dimensions are accessed in the objective-function. The names for the dimensions are generic in this example and look like they could just be indices of a vector. But if you want to do something like hyperparameter-optimization, the search-space looks very different and the names for the dimensions help with readability.
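As an illustration of that point, here is a sketch of a hyperparameter-optimization objective function in the style of gradient-free-optimizers' examples (the estimator, dataset, and parameter names are my own illustrative choices, not taken from this thread):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

# Named dimensions make the objective function self-documenting:
# para["n_estimators"] is much clearer than an anonymous vector index.
def model(para):
    gbr = GradientBoostingRegressor(
        n_estimators=para["n_estimators"],
        max_depth=para["max_depth"],
    )
    return cross_val_score(gbr, X, y, cv=3).mean()

search_space = {
    "n_estimators": np.arange(10, 200, 10),
    "max_depth": np.arange(2, 12),
}
```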
Just wanted to chime in here. Perhaps what @logan-dunbar is asking for is multi-dimensional parameter declaration, something like an array of parameters.
For example, in the nevergrad package there is an interface for parameter arrays: see https://facebookresearch.github.io/nevergrad/parametrization.html. The explicit API they expose is an `Array` parameter, roughly as sketched below.
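A minimal sketch of that array parametrization, written from the nevergrad docs linked above (treat the exact shapes and bounds as illustrative assumptions):

```python
import nevergrad as ng

# A 2x2 array of continuous parameters with elementwise bounds;
# set_bounds returns the parameter, so calls can be chained.
weights = ng.p.Array(shape=(2, 2)).set_bounds(lower=-1.0, upper=1.0)

# Parameters can be combined into one structured search space.
instrumentation = ng.p.Instrumentation(
    weights=weights,
    lr=ng.p.Scalar(lower=1e-4, upper=1e-1),
)
```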
Perhaps this is a different topic than the original post. If it is, then I can create it as a new feature request. It would be a great feature! Many models have arrays of parameters, and it can be a pain to fold/unfold everything.
Hello @mxv001, thanks for your suggestion. I looked into the nevergrad-package. The interface you have shown is somewhat related to this issue, because it also enables continuous parameter ranges. But it is also a much broader topic, because of the multi-dimensional parameter declaration. I would suggest that you open another issue (feature request). I am not sure if a "nevergrad-style" of search-space creation will find its way into gradient-free-optimizers (I like to keep the API very simple), but I think it would be valuable to discuss it.
In this issue I will show the progress of adding support for continuous parameter ranges in the search-space.
For most optimization algorithms it should be easy to add support for continuous parameter ranges, because their iteration steps (like sampling a neighbour around the current position) translate naturally from discrete positions to continuous values.
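To illustrate (my own sketch, not code from this library): a hill-climbing step for a continuous dimension only needs to sample a float neighbour inside the bounds, instead of stepping to a neighbouring grid position:

```python
import random

def continuous_neighbour(position, bounds, step_size=0.1):
    """Sample a neighbour of `position` inside continuous `bounds`.

    position: dict mapping dimension name -> current float value
    bounds:   dict mapping dimension name -> (low, high) tuple
    """
    neighbour = {}
    for name, value in position.items():
        low, high = bounds[name]
        # Gaussian perturbation scaled to the range width, clipped
        # so the candidate stays inside the search-space.
        candidate = value + random.gauss(0, step_size * (high - low))
        neighbour[name] = min(max(candidate, low), high)
    return neighbour
```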
So in conclusion: Adding support for continuous search-spaces should be possible with reasonable effort.
The next problem to discuss is how this will be integrated into the current API. It is important to me that the API design stays simple and intuitive.
Also: It would be very interesting if the search-space could have discrete parameter ranges in some dimensions and continuous ones in others.
The current search-space looks something like this:
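For reference, a discrete-only search-space in the current API, using the same style as the examples in the comments above:

```python
import numpy as np

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
}
```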
What would a continuous dimension look like? It cannot be a numpy array, and it should be distinguishable enough from a discrete dimension. Maybe a tuple, as in the sketch below:
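For example (reusing the `(low, high)` tuple idea from the discussion above), a mixed discrete/continuous search-space could look like this:

```python
import numpy as np

search_space = {
    "x1": np.arange(-100, 101, 0.1),  # discrete dimension (numpy array)
    "x3": (-1, 1),                    # continuous dimension as a (low, high) tuple
}
```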
I will brainstorm some ideas and write some prototype code to get a clear vision for this feature.