DETR XAI #4184
Conversation
…ng output with raw logits and saliency maps for better interpretability.
…dding postprocessing options for saliency maps and refining DETR model's explain mode implementation.
@eugene123tw thanks for the PR! Could you attach some explain results here for reference?
…ing a dedicated method for splitting and reshaping logits in explain mode.
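For context, a hypothetical sketch of what splitting and reshaping the decoder output for explain mode could look like; the shapes, the boxes-then-logits layout, and the function name are assumptions rather than the PR's actual code.

```python
import torch

def split_and_reshape_logits(decoder_out: torch.Tensor, num_classes: int):
    """Split a DETR-style decoder output of shape (batch, num_queries, 4 + num_classes)
    into box predictions and raw class logits kept for saliency generation."""
    boxes, cls_logits = decoder_out.split([4, num_classes], dim=-1)
    # Raw (pre-sigmoid) logits are returned so the XAI path can work on them;
    # detection post-processing applies the activation separately.
    return boxes, cls_logits
```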
As @sovrasov noted, DetClassProbabilityMap has some normalization issues: it is quite sensitive to the particular model and dataset. Galina tried to come up with a uniformly good solution, but that was not very successful. The results here are not perfect, but probably acceptable, assuming the model is well trained.
For the future, I would propose revisiting DetClassProbabilityMap. Basically, it is just a visualization of the classification head output; some modification of it, or other methods, might work better for explaining object detection models.
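For reference, a minimal sketch (not the OTX implementation) of the idea behind DetClassProbabilityMap: the saliency map is essentially the classification head output, activated and normalized per class, which is also where the normalization sensitivity comes from.

```python
import numpy as np

def class_probability_map(cls_logits: np.ndarray) -> np.ndarray:
    """cls_logits: (num_classes, H, W) raw logits from the classification head."""
    probs = 1.0 / (1.0 + np.exp(-cls_logits))  # sigmoid over the raw logits
    # Per-class min-max normalization -- the step that makes the result
    # sensitive to the particular model and dataset.
    mins = probs.min(axis=(1, 2), keepdims=True)
    maxs = probs.max(axis=(1, 2), keepdims=True)
    saliency = (probs - mins) / np.maximum(maxs - mins, 1e-6)
    return (saliency * 255).astype(np.uint8)  # one uint8 map per class
```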
Summary
Enhance the XAI capabilities of the RTDETR and D-Fine models by implementing support for saliency map generation and feature vector outputs, compatible with both the PyTorch and OpenVINO frameworks.
Person Saliency Map:
Car Saliency Map:
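As a rough way to confirm the OpenVINO path mentioned in the summary, the exported IR can be run directly; the output names "saliency_map" and "feature_vector" and the input resolution are assumptions here, not guaranteed by the export.

```python
import numpy as np
import openvino as ov

core = ov.Core()
compiled = core.compile_model("exported_model.xml", "CPU")

dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)  # placeholder input
result = compiled(dummy)

# Besides the detections, an XAI-enabled export is expected to return
# per-class saliency maps and a global feature vector.
print(result["saliency_map"].shape)    # e.g. (1, num_classes, H, W)
print(result["feature_vector"].shape)  # e.g. (1, C)
```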
How to test
otx explain --work_dir otx-workspace --dump True --explain_config.postprocess True
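To eyeball the dumped maps, a quick OpenCV overlay is enough; the file paths below are placeholders and do not reflect the exact dump layout produced by the command above.

```python
import cv2

saliency = cv2.imread("otx-workspace/saliency_maps/person.png", cv2.IMREAD_GRAYSCALE)
image = cv2.imread("sample.jpg")

# Resize the map to the image, colorize it, and blend the two for inspection.
saliency = cv2.resize(saliency, (image.shape[1], image.shape[0]))
heatmap = cv2.applyColorMap(saliency, cv2.COLORMAP_JET)
overlay = cv2.addWeighted(image, 0.5, heatmap, 0.5, 0)
cv2.imwrite("overlay.png", overlay)
```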
Checklist
License
Feel free to contact the maintainers if that's a concern.