Integration of generic LLM API (e.g. Xinference) #24
Conversation
Thanks for the PR, @AndiMajore! @fengsh27, this is the local LLM application we briefly talked about; I think it would be helpful for you to see the implementation, and helpful for the PR if you could give your opinion / input on the code. In particular, I would like to reduce redundancies between the newly established generic OpenAI connectivity and the one that previously existed; it would be great if the existing and new implementations could be consolidated. I will be back to being able to code next week, so I'm happy to get practically involved as well. @AndiMajore, could you summarise again what kind of conflicts we have between the OpenAI API and the custom Xinference one? Ideally, we would end up with a generic implementation that covers both.
Maybe we can come up with more concise names for the classes. ;)
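A minimal sketch of how such a consolidation could look, assuming the openai>=1.0 Python client; the class name, parameters, and method here are illustrative, not the actual biochatter classes under discussion:

```python
# Hypothetical sketch: one OpenAI-style Conversation class that both the
# official OpenAI endpoint and OpenAI-compatible servers (e.g. Xinference)
# can share, so openai.py and generic_openai.py need not duplicate chat logic.
import openai


class OpenAIStyleConversation:
    """Chat wrapper for any endpoint that speaks the OpenAI chat-completions API."""

    def __init__(self, model_name: str, base_url: str | None = None, api_key: str = "none"):
        # base_url=None -> official OpenAI; otherwise a local or remote
        # OpenAI-compatible server such as an Xinference deployment.
        self.client = openai.OpenAI(api_key=api_key, base_url=base_url)
        self.model_name = model_name
        self.messages: list[dict] = []

    def query(self, text: str) -> str:
        self.messages.append({"role": "user", "content": text})
        response = self.client.chat.completions.create(
            model=self.model_name,
            messages=self.messages,
        )
        answer = response.choices[0].message.content
        self.messages.append({"role": "assistant", "content": answer})
        return answer
```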
Hey @slobentanzer,
…pr/AndiMajore/24
explicit (optional) dependency, because without it poetry would take forever to resolve `xinference` dependencies
instead of "document summarisation"
otherwise the default is always used, which prevents testing different embedders (see the sketch after this list)
mds (copied and adapted from biocypher repo)
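To illustrate the embedder point above, a hypothetical sketch (names are illustrative, not the actual biochatter API): accept the embedding function as a constructor argument instead of always building the default internally, so tests can inject alternative embedders.

```python
# Hypothetical sketch: injectable embedder with a default fallback.
from typing import Callable, Sequence

Embedder = Callable[[Sequence[str]], list[list[float]]]


def default_embedder(texts: Sequence[str]) -> list[list[float]]:
    # Stand-in for the project's default embedding model.
    return [[float(len(t))] for t in texts]


class DocumentEmbedder:
    def __init__(self, embedder: Embedder | None = None):
        # Fall back to the default only when no embedder is supplied.
        self.embedder = embedder or default_embedder

    def embed(self, texts: Sequence[str]) -> list[list[float]]:
        return self.embedder(texts)


# In a test, a cheap fake embedder can be swapped in:
fake = DocumentEmbedder(embedder=lambda texts: [[0.0, 1.0] for _ in texts])
assert fake.embed(["a", "b"]) == [[0.0, 1.0], [0.0, 1.0]]
```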
Thanks!
Adds a generic implementation of the Conversation class to allow the use of non-OpenAI but OpenAI-style endpoints.
Includes the addition of generic_openai.py, a customised copy of openai.py.
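To show what "non-OpenAI but OpenAI-style" means in practice, here is a minimal sketch pointing the standard openai client at an Xinference server's OpenAI-compatible endpoint; the URL, port, model name, and key are placeholders, not values from this PR:

```python
# Hypothetical sketch: the regular openai client talking to a local
# OpenAI-compatible server instead of api.openai.com.
import openai

client = openai.OpenAI(
    base_url="http://localhost:9997/v1",  # placeholder: Xinference's OpenAI-compatible endpoint
    api_key="none",                       # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="llama-2-chat",  # placeholder: whatever model is deployed on the server
    messages=[{"role": "user", "content": "What does BioChatter do?"}],
)
print(response.choices[0].message.content)
```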