Hi @royerloic,
I'm curious whether you would be interested in a pull request that makes Omega work with the models on the GitHub Marketplace (GPT-4o, Llama 3.1 405B, Phi 3.5, Mistral, Command R+, ...). These can be used for free, but are quite limited: most models allow 8k input and 4k output tokens. Does that make sense, or is that context window too small anyway?
If you like the idea, I would modify this code to make it work in Omega and send a PR, roughly along the lines of the sketch below.
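A minimal sketch of what I have in mind, assuming Omega talks to an OpenAI-compatible chat-completions client; the GitHub Models endpoint URL, the GITHUB_TOKEN environment variable, and the exact catalog model names are assumptions taken from the GitHub Models docs, not from Omega's code:

```python
import os
from openai import OpenAI

# Point an OpenAI-compatible client at the GitHub Models inference endpoint
# and authenticate with a GitHub personal access token instead of an OpenAI key.
client = OpenAI(
    base_url="https://models.inference.ai.azure.com",
    api_key=os.environ["GITHUB_TOKEN"],  # GitHub PAT with access to GitHub Models
)

response = client.chat.completions.create(
    model="gpt-4o",  # e.g. "Meta-Llama-3.1-405B-Instruct", "Phi-3.5-mini-instruct", ...
    messages=[{"role": "user", "content": "Hello from Omega!"}],
    max_tokens=4000,  # marketplace models typically cap output around 4k tokens
)
print(response.choices[0].message.content)
```

In Omega this would essentially mean adding another provider option that swaps the base URL and API key while keeping the existing chat-completion code path unchanged.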
Cheers,
Robert