"bee" able to change the model from llama3.1 by default for ollama #48

Closed
jjasghar opened this issue Nov 20, 2024 · 2 comments
Labels
question Further information is requested

Comments

@jjasghar

I was attempting to use the granite3.0-8b-dense model, and it seems there is no way to specify a custom model. When I attempted to use ollama, the error came back with:

bee-api-1  | {"level":"error","time":"2024-11-20T19:46:00.849Z","hostname":"8d4ea71dff49","name":"bee-api","queueName":"runs","job":{"id":"run_673e3c77d8ebcc81d5182868","name":"runs","failedReason":"model 'llama3.1' not found","data":{"runId":"run_673e3c77d8ebcc81d5182868"}},"err":{"type":"ResponseError","message":"model 'llama3.1' not found","stack":"ResponseError: model 'llama3.1' not found\n    at checkOk (file:///app/node_modules/.pnpm/[email protected]/node_modules/ollama/dist/shared/ollama.133b951a.mjs:70:9)\n    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n    at async post (file:///app/node_modules/.pnpm/[email protected]/node_modules/ollama/dist/shared/ollama.133b951a.mjs:118:3)\n    at async Ollama.show (file:///app/node_modules/.pnpm/[email protected]/node_modules/ollama/dist/shared/ollama.133b951a.mjs:383:22)\n    at OllamaChatLLM.meta (/app/node_modules/.pnpm/[email protected]_@[email protected]_@[email protected]_@googleapi_ia5npp4g5tjlrpr3a7futx3n24/node_modules/bee-agent-framework/src/adapters/ollama/chat.ts:148:19)\n    at TokenMemory.add (/app/node_modules/.pnpm/[email protected]_@[email protected]_@[email protected]_@googleapi_ia5npp4g5tjlrpr3a7futx3n24/node_modules/bee-agent-framework/src/memory/tokenMemory.ts:85:20)\n    at TokenMemory.addMany (/app/node_modules/.pnpm/[email protected]_@[email protected]_@[email protected]_@googleapi_ia5npp4g5tjlrpr3a7futx3n24/node_modules/bee-agent-framework/src/memory/base.ts:43:7)\n    at async executeRun (file:///app/dist/runs/execution/execute.js:75:3)\n    at async file:///app/dist/runs/jobs/runs.queue.js:49:7\n    at Worker.processJob (/app/node_modules/.pnpm/[email protected]/node_modules/bullmq/src/classes/worker.ts:786:22)","error":"model 'llama3.1' not found","status_code":404,"name":"ResponseError"},"msg":"Job failed"}

I have llama3 in ollama but not 3.1, which also seems weird: it's hard coded to a specific version of llama. This raises the broader issue of not being able to point to any of the models that my ollama instance actually has.

Or am I missing something?

@mmurad2 mmurad2 added the question Further information is requested label Nov 25, 2024
@mmurad2
Member

mmurad2 commented Nov 25, 2024

@Tomas2D do you have a view on this?

@Tomas2D
Contributor

Tomas2D commented Nov 26, 2024

It is mentioned in the README (https://github.com/i-am-bee/bee-stack?tab=readme-ov-file#custom-models)
Related issue: #36

TLDR: we would like to allow users to change the model within the UI
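In the meantime, a possible workaround (my suggestion, not an official fix): since the error shows the stack defaulting to the llama3.1 tag, pulling that tag into the local Ollama instance should resolve the 404 from Ollama's /api/show endpoint.

```shell
# Pull the tag the bee-api default expects; the 404 in the log above
# comes from Ollama not finding 'llama3.1' locally.
ollama pull llama3.1

# Confirm the tag is now available.
ollama list
```

This only papers over the hard-coded default; pointing the stack at a custom model still requires the configuration described in the README link above.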

@Tomas2D Tomas2D closed this as completed Dec 6, 2024