Dim selection error (maybe) when distilling #3

@jinyu121

Description

See https://github.com/kshmelkov/incremental_detectors/blob/master/network.py#L59-L87

Say self.num_classes = 15 and self.subnet.num_classes = 10, so for distillation we use logits = self.logits_for_distillation[:, :cached_classes+1].

But wait, [:, :cached_classes+1]? That slice covers 0-6 now, rather than 0-11. Should it be 0-11 here?

def compute_distillation_crossentropy_loss(self):
    cached_classes = self.subnet.num_classes
    ...

def compute_distillation_bbox_loss(self):
    cached_classes = self.subnet.num_classes
    ...
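A minimal sketch of what that slice keeps, assuming cached_classes = self.subnet.num_classes = 10 as in the two methods above; the array shapes and values here are made up for illustration, not taken from the repo:

```python
import numpy as np

# Hypothetical setup mirroring the numbers in the issue:
# 15 classes in the new network, 10 cached classes in the old one.
num_classes = 15
cached_classes = 10  # stands in for self.subnet.num_classes

# Dummy logits with one column per class plus background -> 16 columns.
logits_for_distillation = np.zeros((4, num_classes + 1))

# The slice under discussion keeps the first cached_classes + 1 columns,
# i.e. column indices 0 through cached_classes (here 0 through 10).
distill_logits = logits_for_distillation[:, :cached_classes + 1]
print(distill_logits.shape)  # (4, 11)
```

So with cached_classes taken from self.subnet.num_classes, the slice keeps 11 columns (indices 0-10); if cached_classes were computed as the class-count difference (15 - 10 = 5) somewhere else, it would keep only 6 (indices 0-5), which seems to be the discrepancy being asked about.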
