Hugging Face model #2
base: llm-pipeline
Conversation
Signed-off-by: Bepitic <[email protected]>
Thanks for adding this support. I've added a few comments.
src/anomalib/models/image/hugging_face_wrapper/lightning_model.py
self.max_new_tokens = max_new_tokens
self.pre_images: list[str] = []

self.model = LlavaForConditionalGeneration.from_pretrained(
Does this mean that models would only be Llava-based?
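(For illustration: one way to avoid tying the wrapper to a single architecture would be to load the checkpoint through the transformers Auto classes. This is only a sketch; the model_name default and the choice of AutoModelForVision2Seq/AutoProcessor are suggestions, not code from this PR.)

from anomalib.models.components import AnomalyModule
from transformers import AutoModelForVision2Seq, AutoProcessor


class HuggingFaceWrapper(AnomalyModule):
    """Zero-shot wrapper around a Hugging Face vision-language model."""

    def __init__(self, model_name: str = "llava-hf/llava-1.5-7b-hf", max_new_tokens: int = 100) -> None:
        super().__init__()
        self.max_new_tokens = max_new_tokens
        # The Auto classes resolve the concrete architecture from the checkpoint's
        # config, so any compatible image-to-text model could be used, not only Llava.
        self.processor = AutoProcessor.from_pretrained(model_name)
        self.model = AutoModelForVision2Seq.from_pretrained(model_name)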
@staticmethod
def configure_optimizers() -> None:
    """WinCLIP doesn't require optimization, therefore returns no optimizers."""
WinCLIP?
    return

def validation_step(self, batch: dict[str, str | torch.Tensor], *args, **kwargs) -> dict:
    """Validation Step of WinCLIP."""
Same here, Huggingface or Llava?
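(Both docstrings still refer to WinCLIP and look copied over from that model. A possible rewording, purely illustrative:)

@staticmethod
def configure_optimizers() -> None:
    """The Hugging Face wrapper performs zero-shot inference, so no optimizer is returned."""
    return

def validation_step(self, batch: dict[str, str | torch.Tensor], *args, **kwargs) -> dict:
    """Validation step of the Hugging Face wrapper."""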
Returns:
    ref_images (Tensor): A tensor containing the reference images.
"""
ref_images: list[str] = []
Just for consistency with the name of the method:
ref_images: list[str] = []
reference_images: list[str] = []
def _load_image(self, image_file: str) -> Image.Image:
    return Image.open(image_file).convert("RGB")

def _api_call_zero_shot(self, image_path: str) -> str:
Would it be possible for a user to configure the prompt if needed? Or does it have to be hardcoded?
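(As an illustration of the configurable-prompt idea: the text could be accepted as a constructor argument that falls back to the current hardcoded string. The prompt argument name, the default text, and the processor/generate calls below are assumptions for the sketch, not code from this PR.)

DEFAULT_PROMPT = "Describe any defect or anomaly visible in this image."  # placeholder default text

def __init__(self, model_name: str, prompt: str | None = None, max_new_tokens: int = 100) -> None:
    super().__init__()
    # A user-supplied prompt takes precedence over the built-in default.
    self.prompt = prompt or DEFAULT_PROMPT
    self.max_new_tokens = max_new_tokens

def _api_call_zero_shot(self, image_path: str) -> str:
    image = self._load_image(image_path)
    # Feed the configured prompt together with the image to the processor/model pair.
    # Depending on the checkpoint, the prompt may also need the model's chat template.
    inputs = self.processor(images=image, text=self.prompt, return_tensors="pt")
    output_ids = self.model.generate(**inputs, max_new_tokens=self.max_new_tokens)
    return self.processor.decode(output_ids[0], skip_special_tokens=True)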
Co-authored-by: Samet Akcay <[email protected]>
__all__ = ["HuggingFaceWrapper"]


class HuggingFaceWrapper(AnomalyModule):
Is this a Wrapper or a Model?
📝 Description
✨ Changes
Select what type of change your PR is:
✅ Checklist
Before you submit your pull request, please make sure you have completed the following steps:
For more information about code review checklists, see the Code Review Checklist.