
A problem about calculating confusion matrices #13508

Open
1 of 2 tasks
SwustLiC opened this issue Feb 12, 2025 · 3 comments
Labels
bug (Something isn't working) · detect (Object Detection issues, PR's)

Comments

@SwustLiC

Search before asking

  • I have searched the YOLOv5 issues and found no similar bug report.

YOLOv5 Component

No response

Bug

In yolov5/utils/metrics.py:

import numpy as np
import torch

# box_iou is defined elsewhere in utils/metrics.py


class ConfusionMatrix:
    """Generates and visualizes a confusion matrix for evaluating object detection classification performance."""

    def __init__(self, nc, conf=0.25, iou_thres=0.45):
        """Initializes ConfusionMatrix with given number of classes, confidence, and IoU threshold."""
        self.matrix = np.zeros((nc + 1, nc + 1))
        self.nc = nc  # number of classes
        self.conf = conf
        self.iou_thres = iou_thres

    def process_batch(self, detections, labels):
        """
        Update the confusion matrix with one batch of detections and ground-truth labels.

        Both sets of boxes are expected to be in (x1, y1, x2, y2) format.

        Arguments:
            detections (Array[N, 6]): x1, y1, x2, y2, conf, class
            labels (Array[M, 5]): class, x1, y1, x2, y2
        Returns:
            None, updates confusion matrix accordingly
        """
        if detections is None:
            gt_classes = labels.int()
            for gc in gt_classes:
                self.matrix[self.nc, gc] += 1  # background FN
            return

        detections = detections[detections[:, 4] > self.conf]
        gt_classes = labels[:, 0].int()
        detection_classes = detections[:, 5].int()
        iou = box_iou(labels[:, 1:], detections[:, :4])

        x = torch.where(iou > self.iou_thres)
        if x[0].shape[0]:
            matches = torch.cat((torch.stack(x, 1), iou[x[0], x[1]][:, None]), 1).cpu().numpy()
            if x[0].shape[0] > 1:
                matches = matches[matches[:, 2].argsort()[::-1]]
                matches = matches[np.unique(matches[:, 1], return_index=True)[1]]
                matches = matches[matches[:, 2].argsort()[::-1]]
                matches = matches[np.unique(matches[:, 0], return_index=True)[1]]
        else:
            matches = np.zeros((0, 3))

        n = matches.shape[0] > 0
        m0, m1, _ = matches.transpose().astype(int)
        for i, gc in enumerate(gt_classes):
            j = m0 == i
            if n and sum(j) == 1:
                self.matrix[detection_classes[m1[j]], gc] += 1  # correct
            else:
                self.matrix[self.nc, gc] += 1  # true background

        if n:
            for i, dc in enumerate(detection_classes):
                if not any(m1 == i):
                    self.matrix[dc, self.nc] += 1  # predicted background

When a detection box does not match any ground-truth box, that is, when its IoU with every label is 0, the detection is never counted as a false positive. Is this a deliberate design choice or a bug? The confusion matrix in YOLOv11 does not behave quite the same way.
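The guarded path can be reproduced outside YOLOv5 with a NumPy-only sketch of the same matching logic (the `box_iou` here is a local stand-in rather than the YOLOv5 implementation, and `nc = 4` is an arbitrary choice for illustration):

```python
import numpy as np

def box_iou(box1, box2):
    """Pairwise IoU of (N, 4) and (M, 4) arrays of (x1, y1, x2, y2) boxes."""
    area1 = (box1[:, 2] - box1[:, 0]) * (box1[:, 3] - box1[:, 1])
    area2 = (box2[:, 2] - box2[:, 0]) * (box2[:, 3] - box2[:, 1])
    lt = np.maximum(box1[:, None, :2], box2[None, :, :2])  # intersection top-left
    rb = np.minimum(box1[:, None, 2:], box2[None, :, 2:])  # intersection bottom-right
    wh = np.clip(rb - lt, 0, None)
    inter = wh[..., 0] * wh[..., 1]
    return inter / (area1[:, None] + area2[None, :] - inter)

nc = 4
matrix = np.zeros((nc + 1, nc + 1))

# One class-3 detection and one class-3 label that do not overlap at all
detections = np.array([[982.0, 955.0, 1107.0, 1046.0, 0.9123, 3.0]])
labels = np.array([[3.0, 1195.0, 953.0, 1319.0, 1035.0]])

iou = box_iou(labels[:, 1:], detections[:, :4])
x = np.where(iou > 0.45)
n = x[0].shape[0] > 0  # False: no pair clears the IoU threshold

# The unmatched ground truth is still recorded as a background FN ...
matrix[nc, 3] += 1
# ... but the FP loop sits inside `if n:`, so the detection leaves no trace
print(matrix[3, nc])  # 0.0 -> no background-FP entry for the class-3 detection
```

The row/column convention follows the pasted code: the predicted class indexes the row, the true class the column, with index `nc` standing for background.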

Environment

No response

Minimal Reproducible Example

No response

Additional

No response

Are you willing to submit a PR?

  • Yes I'd like to help by submitting a PR!
@SwustLiC SwustLiC added the bug Something isn't working label Feb 12, 2025
@glenn-jocher glenn-jocher added the detect Object Detection issues, PR's label Feb 12, 2025
@glenn-jocher
Member

👋 Hello @SwustLiC, thank you for your interest in YOLOv5 🚀! Your detailed description regarding the confusion matrix behavior is much appreciated. To help us debug and respond effectively, could you kindly provide a minimum reproducible example (MRE)? This will allow us to replicate the issue on our side and better understand the behavior you've described.

For reference:

  • Please include any relevant code snippets, data (if possible), or steps to reproduce the behavior.
  • Highlight any dataset specifics or settings that may influence the result.

If you're curious about how to get started or need additional context, please visit our ⭐️ Tutorials, where you'll find helpful resources such as Hyperparameter Evolution and guides for Custom Data Training.

Requirements

Ensure that your environment meets the following:
Python>=3.8.0 with all requirements.txt dependencies installed, including PyTorch>=1.8. To set up:

git clone https://github.com/ultralytics/yolov5  # clone
cd yolov5
pip install -r requirements.txt  # install

Environments

YOLOv5 is verified in a number of environments that can help when testing for consistency.

CI Status

If this badge is green, all Continuous Integration (CI) tests are passing. CI tests validate training, validation, detection, export, and benchmarks across macOS, Windows, and Ubuntu daily and for every commit.

Please note that this is an automated response to assist you quickly 😊. An Ultralytics engineer will follow up with you soon to provide further guidance.

@SwustLiC
Author

detections: tensor([[9.8200e+02, 9.5500e+02, 1.1070e+03, 1.0460e+03, 9.1230e-01, 3.0000e+00]])
labels: tensor([[ 3, 1195, 953, 1319, 1035]])

Here iou = 0, so n is False and the loop below is never entered; the predicted box is therefore not counted as a false positive:

if n:
    for i, dc in enumerate(detection_classes):
        if not any(m1 == i):
            self.matrix[dc, self.nc] += 1  # predicted background

@pderrenger
Member

@SwustLiC the current logic intentionally counts unmatched predictions as background (FP) only when there are existing matches (n=True). Detections with no matches in the entire batch (n=False) are excluded from FP counts. This is a known design choice in YOLOv5's confusion matrix implementation to avoid overcounting in batch processing. For a strict per-detection FP count, you could modify the logic to check all detections against all labels regardless of batch matches. Would you like to submit a PR to propose an adjustment?
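As a sketch of such an adjustment (hypothetical, not the upstream implementation): the `if n:` guard can be folded into the per-detection condition, so that a batch with zero matches still records its detections as background FPs. `count_unmatched_fps` is an illustrative helper name:

```python
import numpy as np

def count_unmatched_fps(matrix, nc, detection_classes, m1, n):
    """Record every detection without a GT match as a background FP,
    even when the batch has no matches at all (n is False)."""
    for i, dc in enumerate(detection_classes):
        if not n or not any(m1 == i):
            matrix[dc, nc] += 1  # predicted background (FP)
    return matrix

# The reporter's case: one class-3 detection, zero matches in the batch
nc = 4
matrix = np.zeros((nc + 1, nc + 1))
count_unmatched_fps(matrix, nc, detection_classes=[3], m1=np.zeros(0, dtype=int), n=False)
print(matrix[3, nc])  # 1.0 -> the detection is now counted as a false positive
```

With this condition the original behavior is preserved whenever matches exist (`n` is True short-circuits the first clause), and only the no-match batches change.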
