checks for optimizing the wrong metric type with post-processing #938

Open
topepo opened this issue Sep 17, 2024 · 0 comments
Labels
feature a feature request or enhancement

Comments

topepo commented Sep 17, 2024

If a post-processing parameter is marked for tuning, we should check which types of predictions it changes and then cross-check that against the metrics being measured.
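A minimal sketch of what such a check could look like (the names `POSTPROC_AFFECTS`, `METRIC_CONSUMES`, and `check_metrics` are illustrative, not tune's actual API): map each tunable post-processor to the prediction type it alters, map each metric to the prediction type it consumes, and warn when none of the requested metrics can be affected.

```python
# Hypothetical sketch of the proposed cross-check; tables and function
# names are assumptions for illustration, not part of the tune package.

# Which prediction types each tunable post-processor can change.
POSTPROC_AFFECTS = {
    "probability_threshold": {"class"},          # a cutoff only changes hard class predictions
    "probability_calibration": {"prob", "class"} # calibration changes probabilities too
}

# Which prediction type each metric is computed from.
METRIC_CONSUMES = {
    "roc_auc": "prob",
    "accuracy": "class",
}

def check_metrics(tuned_postprocessors, metrics):
    """Return a warning string if no requested metric can be moved by tuning."""
    affected = set().union(*(POSTPROC_AFFECTS[p] for p in tuned_postprocessors))
    unaffected = [m for m in metrics if METRIC_CONSUMES[m] not in affected]
    if unaffected and len(unaffected) == len(metrics):
        return (f"Warning: tuning {tuned_postprocessors} cannot change "
                f"any of the requested metrics {unaffected}")
    return None

# Tuning a cutoff while only measuring ROC AUC triggers the warning;
# measuring accuracy does not.
print(check_metrics(["probability_threshold"], ["roc_auc"]))
print(check_metrics(["probability_threshold"], ["accuracy"]))
```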

For example, if someone is optimizing the probability cutoff for a binary classification model but is only measuring ROC AUC (or some other probability-based metric), we should issue a warning, since changing the cutoff cannot change the metric.
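The scenario above can be demonstrated numerically (a self-contained sketch; the data and helper functions here are made up for illustration): sweeping the cutoff changes a class-based metric like accuracy but leaves ROC AUC, which depends only on the ranking of the probabilities, completely unchanged.

```python
# Illustration: a probability cutoff affects accuracy but not ROC AUC.

def roc_auc(y_true, probs):
    # Rank-based AUC: probability that a random positive outranks
    # a random negative (ties count as half).
    pos = [p for p, y in zip(probs, y_true) if y == 1]
    neg = [p for p, y in zip(probs, y_true) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def accuracy(y_true, probs, cutoff):
    preds = [1 if p >= cutoff else 0 for p in probs]
    return sum(p == y for p, y in zip(preds, y_true)) / len(y_true)

# Toy binary outcomes and predicted probabilities (made-up data).
y = [1, 1, 0, 1, 0, 0, 1, 0]
probs = [0.9, 0.8, 0.7, 0.6, 0.4, 0.35, 0.3, 0.1]

for cutoff in (0.3, 0.5, 0.7):
    print(f"cutoff={cutoff}: accuracy={accuracy(y, probs, cutoff):.3f}, "
          f"roc_auc={roc_auc(y, probs):.3f}")
```

Accuracy moves with the cutoff while ROC AUC stays constant, which is exactly why tuning the cutoff against ROC AUC alone is wasted effort.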

@topepo topepo added the feature a feature request or enhancement label Sep 17, 2024