Refactor lvms #1096
Conversation
Need to resolve the conflict.
video-llama has an issue with the latest transformers 4.48.0.
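A minimal sketch of one possible workaround, assuming the incompatibility is simply resolved by requiring an earlier transformers release; the version bound and the runtime check below are assumptions for illustration, not part of this PR:

```python
# Hypothetical guard: video-llama reportedly breaks with transformers 4.48.0,
# so one workaround is to refuse to run with that version or newer and pin
# transformers<4.48.0 in the lvm requirements instead.
from packaging import version
import transformers

if version.parse(transformers.__version__) >= version.parse("4.48.0"):
    raise RuntimeError(
        "video-llama is known to break with transformers >= 4.48.0; "
        "pin transformers<4.48.0 for the video-llama lvm dependency."
    )
```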
@@ -0,0 +1,60 @@
# Copyright (C) 2024 Intel Corporation
The script renaming and code structure update will be done in another PR.
Co-authored-by: ZePan110 <[email protected]>
Description
Refactor the lvm component.
Issues
opea-project/GenAIExamples#1333
Type of change
List the type of change as below. Please delete options that are not relevant.
Dependencies
N/A
Tests
UT (unit tests)