Ideas for v4.0 #12
Hi there, I am not sure if it has already been implemented, but it would be nice to have an easy command to use relative convergence criteria rather than absolute ones.
As far as I know, it can be difficult to implement a viable relative convergence criterion in metaheuristics (due to their stochastic operators). It would be nice for me to know some relative convergence criteria designed for metaheuristics... Maybe you can use the `f_tol` tolerance in `Options`. Example (Metaheuristics v3.2.6):

julia> using Metaheuristics

julia> f, bounds, _ = Metaheuristics.TestProblems.sphere();

julia> options = Options(f_tol=1e-8, seed=1); # give tolerance

julia> optimize(f, bounds, ECA(;options))
+=========== RESULT ==========+
iteration: 228
minimum: 2.72864e-09
minimizer: [-1.404483151684814e-5, -1.645523586390722e-5, 1.9944474758572303e-5, -8.336438266970471e-6, 1.3988146389886503e-5, -1.3177919277964173e-5, -3.0124814894731768e-6, 1.6324625110249603e-5, -1.8097438702973058e-5, -2.865158742490321e-5]
f calls: 15947
total time: 0.1724 s
stop reason: Small difference of objective function values.
+============================+
I have to admit that I haven't explored the full potential of the package, so there may be cases where my suggestion does not apply. I am thinking mostly about cases in which the scale of the parameters is not uniform and you may want to have a relative convergence criterion instead.
I got it. Could you share some bibliographic references on those relative convergence criteria (for metaheuristics)?
Will do. For now, take a look at this link for a few examples of relative convergence criteria: https://www.sfu.ca/sasdoc/sashtml/iml/chap11/sect11.htm
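For illustration only, here is a minimal sketch (not part of Metaheuristics.jl) of the kind of relative criterion described in that page: the change in the objective value is measured relative to its magnitude rather than in absolute terms. The function name, the guard term, and the numbers in the usage lines are assumptions made for this example.

```julia
# Hypothetical helper: relative convergence test on objective values.
# Declares convergence when the change in f between iterations is small
# *relative* to the magnitude of f (guarded so it still behaves near zero).
function relative_f_converged(f_prev, f_curr; rtol = 1e-8)
    return abs(f_curr - f_prev) <= rtol * max(abs(f_curr), abs(f_prev), 1.0)
end

# Usage sketch, compared with an absolute check abs(f_curr - f_prev) <= f_tol:
relative_f_converged(1.0e6, 1.0e6 - 0.001; rtol = 1e-8)  # true: tiny relative change
relative_f_converged(1.0e-3, 2.0e-3; rtol = 1e-8)        # false: large relative change
```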
Hi, among the performance indicators, it would be interesting to add the unary multiplicative epsilon indicator. How difficult would it be to add?
@mtzrene I didn't know about that performance indicator. Could you provide more info about it? It could be integrated into Metaheuristics if a paper and/or the authors' code (in Julia or not) is available.
@jmejia8 Of course, it is a performance indicator for multi-objective optimization. I looked for some code in another language to send you as a reference, but was unsuccessful. However, it is mentioned in this article: https://ieeexplore.ieee.org/abstract/document/1197687
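For reference, the unary multiplicative epsilon indicator described in that paper (Zitzler et al., 2003) could be sketched in Julia as follows. This is only an illustrative implementation, not part of Metaheuristics.jl; it assumes a minimization problem with strictly positive objective values, and the fronts `A` and `R` in the usage lines are made up for the example.

```julia
# Unary multiplicative epsilon indicator (minimization, positive objectives).
# A: approximation front, R: reference front, one solution per row.
# Returns the smallest factor eps such that every reference point r is
# "eps-dominated" by some point a of A, i.e. a .<= eps .* r componentwise.
function eps_indicator_mul(A::AbstractMatrix, R::AbstractMatrix)
    return maximum(
        minimum(maximum(A[i, :] ./ R[j, :]) for i in 1:size(A, 1))
        for j in 1:size(R, 1)
    )
end

# Small usage example with made-up bi-objective fronts:
A = [1.0 4.0; 2.0 2.0; 4.0 1.0]   # approximation front
R = [0.9 3.8; 1.9 1.9; 3.8 0.9]   # reference front
eps_indicator_mul(A, R)           # ≈ 1.11; values close to 1 mean A is close to R
```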
Hi Mejias, great project! Congratulations! It would be nice in future versions to have a variant of `optimize` that performs the optimization process with restarts. This naive strategy could be very useful for multimodal problems and for those where we need to run a continuous optimization process (e.g. dynamic problems). It could be a new function (e.g. `optimize_with_restart`) or a parameter inside `Options`. Another feature that I think would help a lot is an overloaded `copy` for `State`, because it is cumbersome to transfer data between instances of that type given the number of fields that `State` contains.
@pnovoa Thank you for your comments and suggestions. Could you share with me some bibliographic references on approaches using restarts? Perhaps a restarting procedure could be implemented as a wrapper:

method = Restart(GA(), every = 20 #= generations =#)
optimize(f, bounds, method)

Suggestions are welcome 😀 Regarding the second issue, what would you expect the overloaded `copy` for `State` to do?
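In the meantime, a naive restart strategy can be sketched on top of the current `optimize` API. This is only a sketch: `optimize_with_restart` is a hypothetical helper name, the restart count is arbitrary, and it assumes the `minimum`/`minimizer` accessors used in the package's own examples.

```julia
using Metaheuristics

# Hypothetical helper: run `optimize` several times and keep the best result.
# Each restart builds a fresh algorithm instance so no state is carried over.
function optimize_with_restart(f, bounds, build_algorithm; restarts = 5)
    best = optimize(f, bounds, build_algorithm())
    for _ in 2:restarts
        res = optimize(f, bounds, build_algorithm())
        if minimum(res) < minimum(best)
            best = res
        end
    end
    return best
end

# Usage sketch on the sphere test problem:
f, bounds, _ = Metaheuristics.TestProblems.sphere()
best = optimize_with_restart(f, bounds, () -> ECA(options = Options(f_tol = 1e-8)); restarts = 3)
minimum(best), minimizer(best)
```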
Some ideas for v4:
- Implement Flux-inspired algorithms (see the sketch below for an illustration).
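Purely as a hypothetical illustration of what a "Flux-inspired" (composable, `Chain`-like) interface could look like; none of the names below (`Pipeline`, `TournamentSelection`, `SBXCrossover`, `PolynomialMutation`) exist in Metaheuristics.jl and are placeholders for this sketch only.

```julia
# Hypothetical sketch: a layer-like container that composes variation operators,
# mirroring how Flux's Chain applies its layers in order.
struct Pipeline{T<:Tuple}
    stages::T
end
Pipeline(stages...) = Pipeline(stages)

# Applying a pipeline to a population threads it through each stage in order.
(p::Pipeline)(population) = foldl((pop, stage) -> stage(pop), p.stages; init = population)

# Tiny runnable demo with plain functions standing in for operators:
double = pop -> 2 .* pop
shift  = pop -> pop .+ 1
Pipeline(double, shift)([1.0, 2.0, 3.0])   # -> [3.0, 5.0, 7.0]

# A custom algorithm could then be assembled declaratively, e.g.
# ga = Pipeline(TournamentSelection(2), SBXCrossover(η = 15), PolynomialMutation(p = 0.1))
# new_population = ga(population)
```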