Fixed embedding_extract errors
SWHL committed Aug 29, 2023
1 parent ef449e9 commit 0055867
Showing 4 changed files with 12 additions and 6 deletions.
3 changes: 3 additions & 0 deletions README.md
@@ -107,6 +107,9 @@
 <details>
 <summary>Click to expand</summary>
 
+- 2023-08-29 v0.0.8 update:
+  - Fixed missing `embedding_extract`
+  - Fixed default parameters of LLM
 - 2023-08-11 v0.0.7 update:
   - Optimize layout, remove the plugin option, and put the extract vector model option on the home page.
   - The tips are translated into English for easy communication.
3 changes: 3 additions & 0 deletions docs/README_zh.md
@@ -101,6 +101,9 @@
 ```
 
 #### Changelog
+- 2023-08-29 v0.0.8 update:
+  - Fixed missing `embedding_extract`
+  - Fixed the LLM default parameter issue
+- 2023-08-11 v0.0.7 update:
   - Optimized the layout, removed the plugin option, and moved the embedding model option to the home page
   - Translated the prompts into English for easier communication.
2 changes: 1 addition & 1 deletion knowledge_qa_llm/config.yaml
@@ -1,5 +1,5 @@
 title: 🧐 Knowledge QA LLM
-version: 0.0.7
+version: 0.0.8
 
 LLM_API:
   Qwen7B_Chat: your_api
10 changes: 5 additions & 5 deletions webui.py
@@ -44,7 +44,7 @@ def init_sidebar():
     "top_p",
     min_value=param_top.get("min_value"),
     max_value=param_top.get("max_value"),
-    value=param_top.get("value"),
+    value=param_top.get("default"),
     step=param_top.get("step"),
     help=param_top.get("tip"),
 )
@@ -55,7 +55,7 @@ def init_sidebar():
     "temperature",
     min_value=param_temp.get("min_value"),
     max_value=param_temp.get("max_value"),
-    value=param_temp.get("value"),
+    value=param_temp.get("default"),
     step=param_temp.get("step"),
     help=param_temp.get("tip"),
 )
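The two slider hunks above switch the lookup from a `value` key to a `default` key. A minimal sketch of why the old code failed, using a hypothetical parameter block whose key names are assumed from the diff (not copied from the repo's config.yaml):

```python
# Hypothetical parameter block shaped like the slider config in the diff;
# the key names (min_value, max_value, default, step, tip) are assumptions.
param_temp = {
    "min_value": 0.0,
    "max_value": 1.0,
    "default": 0.7,
    "step": 0.05,
    "tip": "Higher values give more random output.",
}

# Before the fix: the slider asked for a "value" key the config never
# defines, so dict.get() silently returned None instead of a number.
broken = param_temp.get("value")

# After the fix: read the "default" key that actually exists.
fixed = param_temp.get("default")

print(broken, fixed)
```

Because `dict.get` returns `None` rather than raising, the bug surfaced later inside `st.slider`, which is why it was easy to miss.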
@@ -213,9 +213,6 @@ def tips(txt: str, wait_time: int = 2, icon: str = "🎉"):
 db_path = config.get("vector_db_path")
 db_tools = DBUtils(db_path)
 
-init_sidebar()
-init_state()
-
 llm_module = importlib.import_module("knowledge_qa_llm.llm")
 MODEL_OPTIONS = {
     name: getattr(llm_module, name)(api)
@@ -243,6 +240,9 @@ def tips(txt: str, wait_time: int = 2, icon: str = "🎉"):

 embedding_extract = init_encoder(ENCODER_OPTIONS[select_encoder])
 
+init_sidebar()
+init_state()
+
 input_prompt_container = st.container()
 with input_prompt_container:
     with st.expander("💡Prompt", expanded=False):
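The last two hunks move `init_sidebar()` and `init_state()` below the line where `embedding_extract` is created. Assuming the sidebar or state setup references that module-level name (which the commit message "Fixed missing embedding_extract" suggests), calling them earlier raised a `NameError` at startup. A self-contained sketch of the same define-before-use bug, with a hypothetical stand-in encoder:

```python
def init_sidebar():
    # Stand-in for webui.py's sidebar setup: it touches the module-level
    # embedding_extract, so that name must be bound before this runs.
    return embedding_extract("hello")

def fake_encoder(text):
    # Hypothetical stand-in for the real embedding model.
    return [0.0, 0.0, 0.0]

# Before the fix: init_sidebar() ran before embedding_extract existed,
# so the global lookup inside it failed with NameError.
try:
    init_sidebar()
    order_ok = True
except NameError:
    order_ok = False

# After the fix: bind the encoder first, then build the sidebar.
embedding_extract = fake_encoder
vector = init_sidebar()
print(order_ok, vector)
```

Python resolves globals at call time, not at definition time, so simply reordering the module-level calls is enough to fix the error without touching `init_sidebar` itself.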
