
Missing evaluation scripts #5

Open
LucidStephen opened this issue Apr 25, 2024 · 3 comments

Comments

@LucidStephen

Thank you for this solid and interesting work. However, I found that the CLIP-score evaluation script for the 100-art-style experiment is not included in this repository. The paper's description of this experiment could be read as either of two evaluation methods: applying one prompt or five prompts to each art style.

@Shilin-LU
Owner

Hi, thanks for your interest.

The CLIP-score evaluation script is included in metrics/evaluate_clip_score.py; you can pass the prompt list prompts_csv/art_100_concepts.csv to evaluate the erasure of art styles.

We apply 5 prompts to evaluate each art style, and for each prompt we use 5 seeds.
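The aggregation described above (5 prompts per art style, 5 seeds per prompt, averaged into one score per style) could be sketched as follows. This is a minimal illustration, not the repo's actual code; the nested-dict layout and function name are assumptions.

```python
# Hypothetical sketch: average CLIP scores over prompts and seeds
# to get one score per art style. The data layout is an assumption,
# not the format used by metrics/evaluate_clip_score.py.

def aggregate_style_score(scores_per_prompt):
    """scores_per_prompt: {prompt_text: [score_for_each_seed, ...]}."""
    # Mean over seeds for each prompt, then mean over prompts.
    per_prompt_means = [
        sum(seed_scores) / len(seed_scores)
        for seed_scores in scores_per_prompt.values()
    ]
    return sum(per_prompt_means) / len(per_prompt_means)

# Toy example with 2 prompts x 2 seeds for brevity (the paper uses 5 x 5):
demo = {
    "a painting in the style of the artist": [0.30, 0.32],
    "an artwork by the artist": [0.28, 0.30],
}
print(round(aggregate_style_score(demo), 3))  # per-prompt means 0.31 and 0.29 -> 0.3
```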

@LucidStephen
Author

Thank you for your reply. I noticed that sample_images_from_csv.py generates two folders, "0" and "1", because of the type attribute in art_100_concepts.csv. This may require some modifications to evaluate_clip_score.py to accommodate the artistic-style experiments, such as adding a new argument, type.
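The suggested modification could look something like the sketch below: a `--type` argument selecting which of the "0"/"1" subfolders to evaluate. The flag names and directory layout here are assumptions for illustration, not the script's actual interface.

```python
# Hedged sketch of adding a `--type` argument so the evaluation script
# can pick the "0" or "1" subfolder produced by sample_images_from_csv.py.
# Argument names and the folder layout are illustrative assumptions.
import argparse
import os

def build_parser():
    parser = argparse.ArgumentParser(description="CLIP-score evaluation (sketch)")
    parser.add_argument("--image_dir", required=True,
                        help="root folder containing the sampled images")
    parser.add_argument("--type", choices=["0", "1"], default="0",
                        help="subfolder created by the `type` column in art_100_concepts.csv")
    return parser

if __name__ == "__main__":
    # Example invocation (normally parse_args() would read sys.argv):
    args = build_parser().parse_args(["--image_dir", "outputs", "--type", "1"])
    image_folder = os.path.join(args.image_dir, args.type)
    print(image_folder)
```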

@Shilin-LU
Owner

Yes, you are right. Small modifications should make it work.
