Ollama endpoint seems not to work #402

Closed
Vasilije1990 opened this issue Dec 31, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@Vasilije1990
Contributor

Hello Vasilije,

I just returned from my holiday and decided to give cognee a go. I am interested in testing the local model integration rather than relying on any third-party provider. As per the docs (https://docs.cognee.ai/core_concepts/local_models), I added the option to use the Ollama service hosted on the same dev machine:
cognee.config.llm_provider = 'ollama'
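
For completeness, the fuller local-only setup I'm aiming for looks roughly like the sketch below. Only llm_provider comes from the docs page above; the endpoint and model option names are my guesses and may not match the actual config attributes:

import cognee

# Point cognee at the locally hosted Ollama service instead of OpenAI.
cognee.config.llm_provider = 'ollama'
# Assumed attribute names below; 11434 is the default Ollama port.
cognee.config.llm_endpoint = 'http://localhost:11434/v1'
cognee.config.llm_model = 'llama3.1'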

With the rest of the code unchanged, I get the following exception when I run it:


Running cognify to create knowledge graph...

InvalidValueError: LLM API key is not set. (Status code: 422)
Coroutine task errored: extract_graph_from_data
('LLM API key is not set.', 'InvalidValueError')
Traceback (most recent call last):
  File "/Users/myacc/codes/python/cognee/lib/python3.11/site-packages/cognee/modules/pipelines/operations/run_tasks.py", line 116, in run_tasks_base
    task_result = await running_task.run(*args)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/myacc/codes/python/cognee/lib/python3.11/site-packages/cognee/tasks/graph/extract_graph_from_data.py", line 19, in extract_graph_from_data
    chunk_graphs = await asyncio.gather(
                   ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/myacc/codes/python/cognee/lib/python3.11/site-packages/cognee/modules/data/extraction/knowledge_graph/extract_content_graph.py", line 7, in extract_content_graph
    llm_client = get_llm_client()
                 ^^^^^^^^^^^^^^^^
  File "/Users/myacc/codes/python/cognee/lib/python3.11/site-packages/cognee/infrastructure/llm/get_llm_client.py", line 37, in get_llm_client
    raise InvalidValueError(message="LLM API key is not set.")
cognee.exceptions.exceptions.InvalidValueError: ('LLM API key is not set.', 'InvalidValueError')
Async generator task errored: extract_chunks_from_documents
('LLM API key is not set.', 'InvalidValueError')

Even though I am using a local LLM, it still asks for an LLM API key. When I simply set a random key, the client ends up reaching out to OpenAI and fails.
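
For reference, the workaround attempt looked roughly like this (the llm_api_key attribute name is my guess from the error message, not something I found in the docs):

cognee.config.llm_provider = 'ollama'
# Dummy value just to get past the "LLM API key is not set" check; attribute name assumed.
cognee.config.llm_api_key = 'sk-dummy'

With that in place the pipeline no longer raises InvalidValueError, but the requests go to the OpenAI API instead of the local Ollama endpoint and fail there.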

I searched your Discord community and found a post from some months ago describing a similar issue, but I don't see any specific suggestion on how to use just Ollama without any reliance on OpenAI.

If you have any suggestions, I would be keen to explore them.

Thanks,
Kennedy

Vasilije1990 added the bug label on Dec 31, 2024
@Vasilije1990
Contributor Author

The issue should be resolved now with the latest release
