Add functions to calculate Precision, Recall and F1score. #255
Conversation
Thanks for your contribution! 😄 It's looking good but I haven't tested it yet. A suggestion: could you return the results per class and global, as is done for accuracy or IoU?
Yes, I'll do that and update soon.
Code structure is fine, but there are some errors that need to be amended.
- Minor fixes needed in the return statements' type definitions.
Keep it up! We're almost there 😉
Thank you! Is there anything more I can contribute to? I'd be glad to.
Hi, sure thing! Some suggestions:
Sure, I'll test this out and update soon. Should it be a new issue though?
I think a couple of contributors have been assigned to this and their approaches are at par. If you think there's something more I can add there, then please let me know.
Sure, definitely!
Not the testing itself, but feel free to open up any issue you find during the process 😄
For now, let's wait to see whether the people who have expressed their interest make their contribution; I'll let you know if we need help in that regard. Overall, I would prioritize testing the current tutorial to get a better sense of whether it is usable as it stands, and maybe we can work together on building new tutorials (#245). Once again, thanks for your interest!
Add functions to calculate Precision, Recall and F1Score under the ConfusionMatrix class in detectionmetrics/utils/metrics.py. Addresses #240 (Add new metrics).
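As a rough illustration of what "per class and global" results could look like here, the sketch below computes precision, recall, and F1 from a square confusion matrix with NumPy. The function name, the row/column convention, and the use of macro averaging for the global values are all assumptions for this example, not the actual detectionmetrics API.

```python
import numpy as np


def precision_recall_f1(cm: np.ndarray) -> dict:
    """Per-class and global (macro-averaged) precision, recall, and F1
    from a square confusion matrix.

    Assumes rows are ground-truth classes and columns are predictions.
    """
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp  # predicted as class c, but wrong
    fn = cm.sum(axis=1) - tp  # truly class c, but missed

    # Guard against division by zero for empty classes
    with np.errstate(divide="ignore", invalid="ignore"):
        precision = np.where(tp + fp > 0, tp / (tp + fp), 0.0)
        recall = np.where(tp + fn > 0, tp / (tp + fn), 0.0)
        denom = precision + recall
        f1 = np.where(denom > 0, 2 * precision * recall / denom, 0.0)

    return {
        "per_class": {"precision": precision, "recall": recall, "f1": f1},
        "global": {
            "precision": precision.mean(),
            "recall": recall.mean(),
            "f1": f1.mean(),
        },
    }
```

For example, for `cm = np.array([[5, 1], [2, 2]])`, class 0 gets precision 5/7 and recall 5/6, and the global values are the unweighted means over classes.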