diff --git a/comps/integrations/langchain/README.md b/comps/integrations/langchain/README.md
index a1f58d309..34190295c 100644
--- a/comps/integrations/langchain/README.md
+++ b/comps/integrations/langchain/README.md
@@ -71,4 +71,4 @@ llm = OPEALLM(
 llm.invoke("The meaning of life is")
 ```
 
-Check out [Samples](/comps/integrations/langchain/samples/README.md) for more examples using the OPEA Langchain package.
+Check out [Samples](./samples/README.md) for more examples using the OPEA Langchain package.
diff --git a/comps/integrations/langchain/samples/README.md b/comps/integrations/langchain/samples/README.md
index 084414a6c..f263eaa72 100644
--- a/comps/integrations/langchain/samples/README.md
+++ b/comps/integrations/langchain/samples/README.md
@@ -75,6 +75,6 @@ Start Jupyter Notebook:
 jupyter notebook
 ```
 
-Open the [`summarize.ipynb`](/comps/integrations/langchain/samples/summarize.ipynb) notebook and run the cells to execute the summarization example.
+Open the [`summarize.ipynb`](./summarize.ipynb) notebook and run the cells to execute the summarization example.
 
-Open the [`qa_streaming.ipynb`](/comps/integrations/langchain/samples/qa_streaming.ipynb) notebook and run the cells to execute the QA chatbot with retrieval example.
+Open the [`qa_streaming.ipynb`](./qa_streaming.ipynb) notebook and run the cells to execute the QA chatbot with retrieval example.