-
Check https://github.com/opendevin/opendevin instead; that one has LiteLLM support.
-
LiteLLM is a great project that makes it easy to use 100+ LLMs through one common interface: you supply a base API URL, a model name, and an API key, then start a local proxy server that lets you talk to many different LLMs the same way. The project description mentions Ollama, which is fine, but people who don't want to run models locally and instead want to use a particular LLM that has an API but isn't natively supported by this project can't use it unless you integrate LiteLLM with this one.
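To illustrate the point above: the LiteLLM proxy exposes an OpenAI-compatible chat-completions endpoint, so one request shape works regardless of which backend provider the proxy routes to. This is a minimal sketch that only builds the request (no network call); the port, key, and model name are illustrative assumptions, not values from this thread.

```python
import json

# Illustrative assumptions: LiteLLM's proxy commonly listens on localhost:4000;
# the key and model name below are placeholders for whatever the proxy is
# configured to route.
base_url = "http://localhost:4000"
api_key = "sk-anything"   # the proxy can map this to the real provider key
model = "gpt-3.5-turbo"   # any model name the proxy knows how to route

# The proxy speaks the OpenAI chat-completions wire format, so this same
# body would work whether the backend is OpenAI, Anthropic, Ollama, etc.
payload = {
    "model": model,
    "messages": [{"role": "user", "content": "Hello"}],
}

request = {
    "url": f"{base_url}/chat/completions",
    "headers": {"Authorization": f"Bearer {api_key}"},
    "body": json.dumps(payload),
}
print(request["url"])
```

Because the request shape is provider-agnostic, integrating a project with LiteLLM usually means pointing its existing OpenAI client at the proxy's base URL rather than writing per-provider adapters.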