Bug: Ollama source stream issue #587
Comments
The exact same thing happens to me. @kwaroran, please fix this. (Attached: RisuAI.mov)
I checked whether it worked before, and the same problem occurs in version v123.0.0.
So it's not just me. See #582; it seems like a lot of the APIs are broken.
Same here with Ollama.
Can you please fix this?
A fix can be found at https://github.com/orsonteodoro/oiledmachine-overlay/blob/master/games-rpg/RisuAI/files/RisuAI-139.2.0-ollama-fix.patch. When I ran into the above issue, I also hit another one: the AI may enter an infinite loop and repeat the same phrases over and over, with no word limit or time limit.
When using Ollama (url http://localhost:11434), the response only shows the last streamed chunk/token/word. It looks like it's replacing the response message with the last streamed token/chunk instead of appending it to the previously streamed chunks/tokens.
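For context, here is a minimal sketch of how Ollama's NDJSON stream is typically consumed so that chunks accumulate rather than overwrite each other. This is illustrative only, not RisuAI's actual code or the linked patch; the model name and endpoint body are assumptions.

```ts
// Sketch: stream a completion from a local Ollama server and accumulate it.
// Ollama's /api/generate streams one JSON object per line, each carrying a
// partial "response" field and a "done" flag.
async function streamOllama(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // "llama3" is a placeholder model name for this example.
    body: JSON.stringify({ model: "llama3", prompt, stream: true }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let fullText = ""; // accumulate every chunk instead of overwriting

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each complete line is one JSON object; keep any trailing partial line.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      // Bug pattern described above: fullText = chunk.response (replaces
      // everything streamed so far). Fix: append the new token(s).
      fullText += chunk.response ?? "";
    }
  }
  return fullText;
}
```

The symptom in this issue (only the last token showing) matches the "replace" pattern in the comment above; appending each chunk to the accumulated text is the usual remedy.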
This is on the latest master (commit 952ee5f), checked out a few minutes ago.