Replies: 2 comments
-
Hi @sashabal, nice to meet you! This feature is not part of the product roadmap, but I will do everything possible to work on it in the coming months. If you need it immediately, you could run LLaMA locally, or implement this new feature yourself; I'm happy to guide you through the beelzebub code base 😄 Cheers, Mario
-
OK, thank you.
-
Is your feature request related to a problem? Please describe.
There are services hosting LLM models that provide a starting credit for testing, for example together.ai.
Describe the solution you'd like
If possible, please implement integration with the together.ai LLM API (meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo).
Additional context
Example request in Python:
Example curl request:
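For reference, a minimal sketch of such a Python request, assuming together.ai exposes an OpenAI-compatible chat completions endpoint at https://api.together.xyz/v1/chat/completions and reads the API key from a `TOGETHER_API_KEY` environment variable (the endpoint, header names, and payload shape are assumptions, not taken from this thread):

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible chat completions endpoint for together.ai.
API_URL = "https://api.together.xyz/v1/chat/completions"

def build_payload(prompt: str) -> dict:
    """Build a chat-completion payload for the Llama 3.1 70B Turbo model."""
    return {
        "model": "meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
        "messages": [{"role": "user", "content": prompt}],
    }

def send_request(prompt: str) -> dict:
    """POST the payload with a bearer-token header and return the parsed JSON."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={
            # TOGETHER_API_KEY is a hypothetical env var name for the API key.
            "Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    reply = send_request("Hello")
    print(reply["choices"][0]["message"]["content"])
```

The equivalent curl request would POST the same JSON body with the same `Authorization: Bearer` header to the same URL.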