
Multiple memory bank problem #29

Open
Masahiro1002 opened this issue Dec 26, 2023 · 4 comments

@Masahiro1002

Thank you for sharing your code. I have a question because there is a discrepancy between the results in Tables 1 and 2 of the paper and my own results.

For the 3D-only and RGB-only settings, I get results similar to those in the paper. However, for RGB+3D, the AUPRO score is significantly lower than reported, so the multiple memory banks don't seem to work well together. I suspect there might be an issue with how the function `add_sample_to_late_fusion_mem_bank` computes `s_map`, but I haven't been able to resolve it. Do you have any idea what the problem might be? (My understanding of the intended late-fusion scoring is sketched after the tables below.)

**Point_MAE**

| Metric | Bagel | Cable Gland | Carrot | Cookie | Dowel | Foam | Peach | Potato | Rope | Tire | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I-AUROC | 0.945 | 0.648 | 0.971 | 0.983 | 0.832 | 0.766 | 0.87 | 0.919 | 0.869 | 0.824 | 0.863 |
| P-AUROC | 0.981 | 0.949 | 0.996 | 0.934 | 0.959 | 0.927 | 0.988 | 0.994 | 0.994 | 0.983 | 0.971 |
| AUPRO | 0.948 | 0.821 | 0.977 | 0.883 | 0.881 | 0.746 | 0.954 | 0.973 | 0.948 | 0.936 | 0.907 |

**DINO**

| Metric | Bagel | Cable Gland | Carrot | Cookie | Dowel | Foam | Peach | Potato | Rope | Tire | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I-AUROC | 0.942 | 0.918 | 0.896 | 0.749 | 0.959 | 0.767 | 0.919 | 0.644 | 0.942 | 0.767 | 0.85 |
| P-AUROC | 0.992 | 0.993 | 0.994 | 0.977 | 0.983 | 0.955 | 0.994 | 0.99 | 0.995 | 0.994 | 0.987 |
| AUPRO | 0.951 | 0.972 | 0.973 | 0.891 | 0.932 | 0.843 | 0.97 | 0.956 | 0.968 | 0.966 | 0.942 |

**Fusion**

| Metric | Bagel | Cable Gland | Carrot | Cookie | Dowel | Foam | Peach | Potato | Rope | Tire | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I-AUROC | 0.975 | 0.899 | 0.915 | 0.937 | 0.949 | 0.882 | 0.979 | 0.688 | 0.957 | 0.834 | 0.902 |
| P-AUROC | 0.994 | 0.99 | 0.997 | 0.985 | 0.986 | 0.972 | 0.995 | 0.994 | 0.997 | 0.995 | 0.99 |
| AUPRO | 0.963 | 0.965 | 0.978 | 0.942 | 0.944 | 0.889 | 0.974 | 0.967 | 0.973 | 0.973 | 0.957 |

**DINO+Point_MAE+Fusion**

| Metric | Bagel | Cable Gland | Carrot | Cookie | Dowel | Foam | Peach | Potato | Rope | Tire | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| I-AUROC | 0.988 | 0.903 | 0.97 | 0.971 | 0.941 | 0.925 | 0.964 | 0.909 | 0.971 | 0.86 | 0.94 |
| P-AUROC | 0.698 | 0.683 | 0.729 | 0.651 | 0.663 | 0.638 | 0.684 | 0.696 | 0.701 | 0.708 | 0.685 |
| AUPRO | 0.287 | 0.27 | 0.37 | 0.212 | 0.31 | 0.213 | 0.351 | 0.589 | 0.314 | 0.497 | 0.341 |
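For context, here is a minimal sketch of how I understand the late-fusion scoring is supposed to behave (my own illustration, not the repo's actual code; the function names, the z-normalization, and the training statistics are all assumptions). The point it illustrates: each memory bank produces distances on its own scale, and if per-bank score maps are combined without rescaling, one modality can dominate the fused `s_map`, depressing the pixel-level metrics (P-AUROC, AUPRO) while image-level I-AUROC stays reasonable, which matches the pattern in the last table.

```python
# Hypothetical sketch of multi-memory-bank late fusion (not the M3DM code).
import numpy as np
from scipy.spatial.distance import cdist


def score_map(mem_bank: np.ndarray, patches: np.ndarray) -> np.ndarray:
    """Anomaly score per test patch: nearest-neighbor distance to the bank."""
    return cdist(patches, mem_bank).min(axis=1)  # shape (num_patches,)


def fused_s_map(banks, patch_sets, stats):
    """Combine per-bank score maps after normalizing with training statistics.

    stats[i] = (mean_i, std_i), estimated from nominal training samples of
    bank i. Skipping this rescaling is one way a fused s_map can go wrong
    while each single-bank result still looks fine.
    """
    fused = np.zeros(len(patch_sets[0]))
    for bank, patches, (mu, sigma) in zip(banks, patch_sets, stats):
        s = score_map(bank, patches)       # raw distances, bank-specific scale
        fused += (s - mu) / (sigma + 1e-8)  # z-normalize before summing
    return fused
```

If the repo instead fits a learned one-class SVM over the stacked per-bank scores, as the paper's decision-layer fusion suggests, badly scaled inputs collected by `add_sample_to_late_fusion_mem_bank` would hurt it in much the same way.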
@lijing0901


Hi, friends!
I have only tried the DINO method, but my results are much worse than yours. May I ask whether you made any improvements to the RGB-only method, or what hardware environment you used?

@BanquetLee

My DINO results are similar to those in the paper, but the Point_MAE method didn't perform well on some classes of MVTec 3D-AD.

@VegetableChicken504

Hi, when I ran the code, only the I-AUROC metric didn't match the paper: I got 0.936 (0.945 in the paper). I saw that your I-AUROC result was 0.94, so may I ask what your graphics card configuration is? The UFF training weights I used were the best ones provided by the authors (the UFF Module under checkpoints).

@Masahiro1002
Author

@VegetableChicken504
I used an NVIDIA V100 at the time. I think a result 0.009 lower is plausible due to random seeds and other environmental differences.
