Hello Vasilije,

I just returned from my holiday and decided to give cognee a go. I am interested in testing the local model integration rather than any third-party service. As per the docs (https://docs.cognee.ai/core_concepts/local_models), I added the option to use the Ollama service hosted on the same dev machine:
cognee.config.llm_provider = 'ollama'
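For completeness, this is the full local-only configuration I attempted. It is only a sketch: apart from llm_provider, which is in the docs, the llm_model, llm_endpoint, and llm_api_key attribute names are my own guesses at how the config object is meant to be used.

import cognee

# Point cognee at the local Ollama service instead of OpenAI.
# Only llm_provider comes from the docs; the remaining attribute names are my assumptions.
cognee.config.llm_provider = "ollama"
cognee.config.llm_model = "llama3.1"                      # a model already pulled with `ollama pull llama3.1`
cognee.config.llm_endpoint = "http://localhost:11434/v1"  # default Ollama endpoint on the dev machine
cognee.config.llm_api_key = "ollama"                      # placeholder, since Ollama needs no key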
Using the same code on the same machine, I get the following exception when I try to run it:
Running cognify to create knowledge graph...
InvalidValueError: LLM API key is not set. (Status code: 422)
Coroutine task errored: extract_graph_from_data
('LLM API key is not set.', 'InvalidValueError')
Traceback (most recent call last):
File "/Users/myacc/codes/python/cognee/lib/python3.11/site-packages/cognee/modules/pipelines/operations/run_tasks.py", line 116, in run_tasks_base
task_result = await running_task.run(*args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/myacc/codes/python/cognee/lib/python3.11/site-packages/cognee/tasks/graph/extract_graph_from_data.py", line 19, in extract_graph_from_data
chunk_graphs = await asyncio.gather(
^^^^^^^^^^^^^^^^^^^^^
File "/Users/myacc/codes/python/cognee/lib/python3.11/site-packages/cognee/modules/data/extraction/knowledge_graph/extract_content_graph.py", line 7, in extract_content_graph
llm_client = get_llm_client()
^^^^^^^^^^^^^^^^
File "/Users/myacc/codes/python/cognee/lib/python3.11/site-packages/cognee/infrastructure/llm/get_llm_client.py", line 37, in get_llm_client
raise InvalidValueError(message="LLM API key is not set.")
cognee.exceptions.exceptions.InvalidValueError: ('LLM API key is not set.', 'InvalidValueError')
Async generator task errored: extract_chunks_from_documents
('LLM API key is not set.', 'InvalidValueError')
Even though I am using a local LLM, it still asks for an LLM API key. When I simply added a random key, the client ends up reaching out to OpenAI and fails.
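For reference, this is the workaround I tried with the random key. The LLM_API_KEY environment variable name is just my guess at what get_llm_client reads, so it may not be the right knob.

import os
import cognee

# Random placeholder just to get past the missing-key check; not a real OpenAI key.
os.environ["LLM_API_KEY"] = "sk-not-a-real-key"
cognee.config.llm_provider = "ollama"
# With this in place the key check passes, but the client still reaches out to OpenAI and fails.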
I searched your Discord community and found a post from a few months ago describing a similar issue, but I didn't see any specific suggestion on how to use just Ollama without any reliance on OpenAI.
If you have any suggestions, I will be keen to explore them.
Thanks,
Kennedy