diff --git a/CONTRIBUTING.md b/.github/CONTRIBUTING.md
similarity index 100%
rename from CONTRIBUTING.md
rename to .github/CONTRIBUTING.md
diff --git a/.github/workflows/docker-ci.yaml b/.github/workflows/docker-ci.yaml.hint
similarity index 100%
rename from .github/workflows/docker-ci.yaml
rename to .github/workflows/docker-ci.yaml.hint
diff --git a/README.md b/README.md
index 47bfb4aa1..93d10a523 100644
--- a/README.md
+++ b/README.md
@@ -32,7 +32,7 @@
     & 📖 English Readme &
-    🤝 贡献必看
+    🤝 贡献必看
diff --git a/README_EN.md b/README_EN.md
deleted file mode 100644
index a4611e6e3..000000000
--- a/README_EN.md
+++ /dev/null
@@ -1,221 +0,0 @@
-![cover](https://raw.githubusercontent.com/LlmKira/.github/main/llmbot/project_cover.png)
-
------------------------
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-    🚩 Deploy Docs
-    &
-    🔧 Dev Docs
-    &
-    📖 中文Readme
-    &
-    🤝 How to contribute
-
-> Having issues with deployment? Submit an Issue to help us fix the SLA.
-
-This project is an open-source attempt to reproduce a [ChatGPT](https://chatgpt.com)-like bot, built around
-`FunctionCall` and `ToolCall`, and supporting multiple messaging platforms.
-
-Built on a message queue, it handles function requests with ease and supports complex plugin and feature designs,
-with well-supported file access.
-
-It supports multiple model sources and cross-platform message forwarding.
-
-| Demo                              |
-|-----------------------------------|
-| ![sticker](./docs/chain_chat.gif) |
-
-Unlike earlier projects, this one tries to replicate ChatGPT's plugin system on top of messaging platforms,
-implementing the same features and more.
-
-> Because function calling is the core feature, only OpenAI-style APIs are supported; there are no plans to support
-> LLMs without function calling.
-
-## 📦 Feature
-
-- 💪 Complete plugin development ecosystem with a classic design; plugins can be used right after installation via `pip`
-- 🌐 Messaging system with no limits on time or sender; senders and receivers are defined explicitly, keeping the
-  logic completely decoupled
-- 🚅 Message routing: customize how messages are routed, and use routes to decide how to operate
-- 💬 Public open quota / private self-configured backend / proxy-token authentication, providing a flexible and
-  scalable authentication scheme
-- 👾 Middleware interception support: develop extensions that operate on data before and after processing
-- 💵 Refined statistics system that makes usage easy to track
-- 🔰 Plugin human-in-the-loop verification and authentication; plugin blacklists can be configured
-- 📦 Standard file-interaction support: upload and download files
-- 🔑 Individual configuration of environment keys, providing personal private environment variables for plugins
-- 🉑 Incremental support for large language models and multi-platform extension, adapted by inheriting the standard
-  classes
-- 🛠 Supports both `FunctionCall` and `ToolCall`, dynamically building the required function classes based on the model
-
-### 🔧 Introduction to the authentication system
-
-Our authentication system is called the `Service Provider`; its role is to assign an Endpoint/Key/Model to each sender
-for authentication, with a `token` as the bound OpenKey. The program calls the configured `Service Provider` to read
-the private Key/configuration from the Token and obtain the authentication information.
-
-![auth](./docs/SeriveProvider.svg)
-
-Both the authentication component and the backend need to be implemented by yourself.
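The `Service Provider` contract described above (and visible in the Python hunks later in this diff) boils down to two async methods: `authenticate(uid, token, status)` and `request_driver(uid, token)`. Below is a minimal sketch of a self-implemented provider — the class name, token store, and endpoint are hypothetical stand-ins, not the project's real API:

```python
import asyncio


class ProviderException(Exception):
    """Same shape as the project's ProviderException (message + provider name)."""

    def __init__(self, message: str, provider: str = None):
        super().__init__(message)
        self.message = message
        self.provider = provider


class MyProvider:
    """Hypothetical provider following the BaseProvider contract shown in this diff."""

    name = "my_provider"
    _tokens = {"uid-1": "sk-test"}  # stand-in token store, not real project code

    async def authenticate(self, uid, token, status) -> bool:
        # Reject unknown uid/token pairs the way the real providers raise.
        if self._tokens.get(uid) != token:
            raise ProviderException("Token rejected", provider=self.name)
        return True

    async def request_driver(self, uid, token):
        # The real implementation returns an endpoint `Driver`; a dict stands in here.
        return {"endpoint": "https://api.example.com/v1", "api_key": token}


ok = asyncio.run(MyProvider().authenticate("uid-1", "sk-test", status=None))
driver = asyncio.run(MyProvider().request_driver("uid-1", "sk-test"))
```

A real provider would also register itself (the diff shows a `resign_provider` decorator in `service_provider/`), but that wiring is omitted here.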
-
-### 🧀 Preview of some plugins
-
-| Sticker Converter                   | Timer Func                      | Translate                                    |
-|-------------------------------------|---------------------------------|----------------------------------------------|
-| ![sticker](./docs/sticker_func.gif) | ![timer](./docs/timer_func.gif) | ![translate](./docs/translate_file_func.gif) |
-
-### 💬 Platform support
-
-| Platform | Support | File System | Tip                            |
-|----------|---------|-------------|--------------------------------|
-| Telegram | ✅      | ✅          |                                |
-| Discord  | ✅      | ✅          |                                |
-| Kook     | ✅      | ✅          | No support for `Replies start` |
-| Slack    | ✅      | ✅          | No support for `Replies start` |
-| QQ       | ❌      |             |                                |
-| Wechat   | ❌      |             |                                |
-| Twitter  | ❌      |             |                                |
-| Matrix   | ❌      |             |                                |
-| IRC      | ❌      |             |                                |
-| ...      |         |             | Create an issue/PR             |
-
-## 📦 Quick Start
-
-Read the [🧀 Deployment Documentation](https://llmkira.github.io/Docs/) for more information.
-
-Please use `pdm run python3 start_sender.py` and `pdm run python3 start_receiver.py` to test whether it runs normally.
-
-#### Performance indicator test (as of 2023/11/1)
-
-Note: this does not include the memory usage of services such as pm2, redis, rabbitmq, mongodb, or docker.
-
-| Process    | Max Heap Size | Tester                                           | Client   |
-|------------|---------------|--------------------------------------------------|----------|
-| `receiver` | 120.847MB     | `python3 -m memray run --live start_receiver.py` | telegram |
-| `sender`   | 83.669MB      | `python3 -m memray run --live start_sender.py`   | telegram |
-
-### 🥣 Docker
-
-Build Hub: [sudoskys/llmbot](https://hub.docker.com/repository/docker/sudoskys/llmbot/general)
-
-#### Automatic Docker/Docker-compose installation
-
-If you are using a brand-new server, you can try the following shell script to install this project automatically.
-
-The script installs the required services and maps ports using Docker. If you have already deployed `redis`,
-`rabbitmq`, or `mongodb`, please modify the `docker-compose.yml` file yourself.
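The memray figures above can be sanity-checked without extra dependencies using the standard library's `tracemalloc`. This is a rough sketch only — `tracemalloc` traces Python-level allocations, so its numbers will differ from memray's process-wide view:

```python
import tracemalloc


def peak_memory_mb(func) -> float:
    """Run `func` and report the peak traced Python allocation in MB."""
    tracemalloc.start()
    try:
        func()
        # get_traced_memory() returns (current, peak) in bytes.
        _, peak = tracemalloc.get_traced_memory()
    finally:
        tracemalloc.stop()
    return peak / (1024 * 1024)


# Allocate roughly 8 MB so the peak is predictable.
mb = peak_memory_mb(lambda: bytearray(8 * 1024 * 1024))
```

For the real processes you would wrap the entry point (e.g. the receiver's main loop) instead of a toy allocation.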
-
-```shell
-curl -sSL https://raw.githubusercontent.com/LLMKira/Openaibot/main/deploy.sh | bash
-```
-
-#### Manual Docker-compose installation
-
-```shell
-git clone https://github.com/LlmKira/Openaibot.git
-cd Openaibot
-pip install pdm && pdm install -G bot
-docker-compose -f docker-compose.yml up -d
-```
-
-Update the image with `docker-compose pull`.
-
-To open a shell inside the container, use `docker exec -it llmbot /bin/bash`; type `exit` to exit.
-
-### 🚪 Shell
-
-To start manually with Pm2, you need to install `redis`, `rabbitmq`, and `mongodb` yourself.
-
-```shell
-git clone https://github.com/LlmKira/Openaibot.git
-cd Openaibot
-pip install pdm && pdm install -G bot
-apt install npm -y && npm install pm2 && pm2 start pm2.json
-pm2 monit
-```
-
-Use `pm2 restart pm2.json` to restart the program.
-
-> We recommend `pdm` for dependency management: since we use `pydantic^1.9.0`, `pdm` helps prevent version conflicts.
-
-## 🪄 Slash Command
-
-```shell
-clear - erase chat history
-help - show docs
-chat - chat
-task - chat with function_enable
-ask - chat with function_disable
-tool - list all functions
-set_endpoint - set private key and endpoint
-clear_endpoint - erase private key and endpoint
-auth - auth a function
-env - env for function
-token - bind token
-token_clear - clear token binding
-func_ban - ban a function
-func_unban - unban a function
-bind - Bind rss platforms
-unbind - Unbind rss platforms
-```
-
-## 💻 How to develop?
-
-For plugin development, please refer to the sample plugins in the `plugins` directory and the
-[🧀 Plugin Dev Docs](https://llmkira.github.io/Docs/en/dev/basic).
-
-## 🤝 We need your help!
-
-This is a long-term project, and we started developing this LLM app very early!
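The slash commands listed above are simple `name - description` pairs; a bot front-end only needs to split the command token from its arguments. A hedged, hypothetical helper for illustration — this is not the project's actual parser:

```python
def parse_slash_command(text: str):
    """Split a raw '/command args' string into (command, argument string).

    Returns (None, text) when the input is not a slash command.
    """
    if not text.startswith("/"):
        return None, text
    # Drop the leading '/', then split on the first space only.
    command, _, args = text[1:].partition(" ")
    return command, args


# e.g. authorizing a function by name, as the `auth` command above suggests.
cmd, args = parse_slash_command("/auth sticker_converter")
```

Platforms like Telegram deliver commands in exactly this `/name args` shape, which is why a single `partition` suffices for the dispatch step.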
-
-We applied a plugin-like system with online search before the GPT-3 OpenAI API (davinci-003) was released.
-
-After many iterations, we have worked hard to make this project more standardized, generic, and open.
-
-We can't do it all on our own at the moment:
-
-- [ ] We need help with the documentation
-- [ ] Web UI
-
-Feel free to submit a Pull Request or start a discussion; we'd love to receive your contribution!
-
-## 📄 Agreement
-
-> This project has no affiliation with OPENAI/ChatGPT.
-
-[![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2Fsudoskys%2FOpenaibot.svg?type=small)](https://app.fossa.com/projects/git%2Bgithub.com%2Fsudoskys%2FOpenaibot?ref=badge_small)
diff --git a/llmkira/extra/user/schema.py b/llmkira/extra/user/schema.py
index dc2174bbe..606f3684e 100644
--- a/llmkira/extra/user/schema.py
+++ b/llmkira/extra/user/schema.py
@@ -11,7 +11,7 @@
 from pydantic import field_validator, ConfigDict, BaseModel, Field
 from pydantic_settings import BaseSettings, SettingsConfigDict

-from ...sdk.endpoint import Driver
+from llmkira.sdk.endpoint.tee import Driver


 class UserDriverMode(Enum):
@@ -74,7 +74,7 @@ def create_from_function(
         return cls(
             request_id=request_id,
             uid=uid,
-            cost=cls.Cost.by_function(
+            cost=Cost.by_function(
                 function_name=cost_by,
                 token_usage=token_usage,
                 token_uuid=token_uuid,
diff --git a/llmkira/middleware/llm_provider.py b/llmkira/middleware/llm_provider.py
index e4f25bece..e6ba93848 100644
--- a/llmkira/middleware/llm_provider.py
+++ b/llmkira/middleware/llm_provider.py
@@ -5,9 +5,10 @@
 # @Software: PyCharm
 from loguru import logger

-from llmkira.extra.user import UserControl, UserConfig, UserDriverMode
+from llmkira.extra.user import UserControl, UserConfig
 from llmkira.middleware.service_provider.schema import ProviderSettingObj
 from .service_provider import loaded_provider, PublicProvider
+from ..extra.user.schema import UserDriverMode

 if not loaded_provider:
     raise Exception("⚠️ No Any Driver Provider Loaded, Even Public Provider")
@@ -34,27 +35,33 @@ async def get(self):
         :return: Driver
         """
         self.user = await UserControl.get_driver_config(uid=self.uid)
-        assert isinstance(self.user, UserConfig.LlmConfig), "UserConfig.LlmConfig is empty"
+        assert isinstance(
+            self.user, UserConfig.LlmConfig
+        ), "UserConfig.LlmConfig is empty"
         # The user has configured their own private provider
         if self.user.mode == UserDriverMode.private:
             return self.user.driver
         # Public Provider
         if ProviderSettingObj.is_open_everyone:
             provider = PublicProvider()
-            logger.debug(f"📦 Public Provider --name ({provider.name}) --mode ({self.user.mode}) --uid ({self.uid})")
+            logger.debug(
+                f"📦 Public Provider --name ({provider.name}) --mode ({self.user.mode}) --uid ({self.uid})"
+            )
             if await provider.authenticate(
-                    uid=self.uid,
-                    token=self.user.token, status=self.user.mode):
-                return await provider.request_driver(uid=self.uid, token=self.user.token)
+                uid=self.uid, token=self.user.token, status=self.user.mode
+            ):
+                return await provider.request_driver(
+                    uid=self.uid, token=self.user.token
+                )
         else:
             # The user needs to configure a Token specially
             provider = loaded_provider()
             if await provider.authenticate(
-                    uid=self.uid,
-                    token=self.user.token,
-                    status=self.user.mode
+                uid=self.uid, token=self.user.token, status=self.user.mode
             ):
-                return await provider.request_driver(uid=self.uid, token=self.user.token)
+                return await provider.request_driver(
+                    uid=self.uid, token=self.user.token
+                )
         """
         raise ProviderException(
             f"AuthChanged {self.user.provider} >>change>> {loaded_provider.name.upper()}"
diff --git a/llmkira/middleware/llm_task.py b/llmkira/middleware/llm_task.py
index 8588ede87..87367dca8 100644
--- a/llmkira/middleware/llm_task.py
+++ b/llmkira/middleware/llm_task.py
@@ -14,9 +14,9 @@
 from ..extra.user.schema import Cost
 from ..schema import RawMessage, Scraper
 from ..sdk.adapter import SCHEMA_GROUP, SingleModel
-from ..sdk.endpoint import Driver
 from ..sdk.endpoint.openai import Message, Openai, OpenaiResult
 from ..sdk.endpoint.schema import LlmRequest, LlmResult
+from ..sdk.endpoint.tee import Driver
 from ..sdk.memory.redis import RedisChatMessageHistory
 from ..sdk.schema import ToolCallCompletion, SystemMessage, Function, Tool
 from ..sdk.utils import sync
diff --git a/llmkira/middleware/service_provider/private.py b/llmkira/middleware/service_provider/private.py
index c00e25fcc..409766a7a 100644
--- a/llmkira/middleware/service_provider/private.py
+++ b/llmkira/middleware/service_provider/private.py
@@ -11,7 +11,7 @@
 from config import provider_settings
 from . import resign_provider
 from .schema import BaseProvider, ProviderException
-from ...sdk.endpoint import Driver
+from ...sdk.endpoint.tee import Driver

 WHITE_LIST = []
 if provider_settings.get("private", default=None) is not None:
@@ -40,13 +40,13 @@ async def authenticate(self, uid, token, status) -> bool:
         if not Driver.from_public_env().available:
             raise ProviderException(
                 "\nYou are using a public and free instance.\nThe current instance key is not configured.",
-                provider=self.name
+                provider=self.name,
             )
         raise ProviderException(
             "This is a private instance."
             "\nPlease contact the administrator to apply for a private instance."
             f"\n You id is {uid}",
-            provider=self.name
+            provider=self.name,
         )

     async def request_driver(self, uid, token) -> Driver:
diff --git a/llmkira/middleware/service_provider/public.py b/llmkira/middleware/service_provider/public.py
index 42186206a..79983f14f 100644
--- a/llmkira/middleware/service_provider/public.py
+++ b/llmkira/middleware/service_provider/public.py
@@ -12,14 +12,16 @@
 from ...sdk.cache import global_cache_runtime
 from . import resign_provider
 from .schema import BaseProvider, ProviderException
-from ...sdk.endpoint import Driver
+from ...sdk.endpoint.tee import Driver

 QUOTA = 24
 WHITE_LIST = []
 if provider_settings.get("public", default=None) is not None:
     QUOTA = provider_settings.public.get("public_quota", default=24)
     WHITE_LIST = provider_settings.public.get("public_white_list", default=[])
-    logger.debug(f"📦 Public Provider Config Loaded, QUOTA({QUOTA}) WHITE_LIST({WHITE_LIST})")
+    logger.debug(
+        f"📦 Public Provider Config Loaded, QUOTA({QUOTA}) WHITE_LIST({WHITE_LIST})"
+    )


 class UserToday(BaseModel):
@@ -42,12 +44,12 @@ async def authenticate(self, uid, token, status) -> bool:
         if not _pass:
             raise ProviderException(
                 "You are using a public instance. You triggered data flood protection today",
-                provider=self.name
+                provider=self.name,
             )
         if not Driver.from_public_env().available:
             raise ProviderException(
                 "You are using a public instance\nBut current instance apikey unavailable",
-                provider=self.name
+                provider=self.name,
             )
         return True
@@ -61,14 +63,18 @@ async def check_times(self, times: int, uid: str):
         if read:
             _data: UserToday = UserToday.model_validate(read)
             if str(_data.time) != str(date):
-                await cache.set_data(self.__database_key(uid=uid), value=UserToday().model_dump())
+                await cache.set_data(
+                    self.__database_key(uid=uid), value=UserToday().model_dump()
+                )
                 return True
             else:
                 if _data.count > times:
                     return False
                 if _data.count < times:
                     _data.count += 1
-                    await cache.set_data(self.__database_key(uid=uid), value=_data.model_dump())
+                    await cache.set_data(
+                        self.__database_key(uid=uid), value=_data.model_dump()
+                    )
                 return True
         else:
             _data = UserToday()
diff --git a/llmkira/middleware/service_provider/schema.py b/llmkira/middleware/service_provider/schema.py
index c3e795422..e0085efb0 100644
--- a/llmkira/middleware/service_provider/schema.py
+++ b/llmkira/middleware/service_provider/schema.py
@@ -7,9 +7,10 @@
 from pydantic import field_validator, Field

-from llmkira.sdk.endpoint import Driver
 from pydantic_settings import BaseSettings

+from llmkira.sdk.endpoint.tee import Driver
+

 class ProviderSetting(BaseSettings):
     provider: str = Field("PUBLIC", validation_alias="SERVICE_PROVIDER")
@@ -27,7 +28,6 @@ def provider_upper(cls, v):

 class ProviderException(Exception):
-
     def __init__(self, message: str, provider: str = None):
         self.message = message
         self.provider = provider
@@ -57,7 +57,9 @@ async def authenticate(self, uid, token, status) -> bool:
         """
         Authentication documentation must be provided
         """
-        raise ProviderException("Base Provider auth your token,refer docs", provider=self.name)
+        raise ProviderException(
+            "Base Provider auth your token,refer docs", provider=self.name
+        )

     @abstractmethod
     async def request_driver(self, uid, token) -> Driver:
diff --git a/qodana.yaml b/qodana.yaml
deleted file mode 100644
index 84e3e499b..000000000
--- a/qodana.yaml
+++ /dev/null
@@ -1,29 +0,0 @@
-#-------------------------------------------------------------------------------#
-#           Qodana analysis is configured by qodana.yaml file                    #
-#           https://www.jetbrains.com/help/qodana/qodana-yaml.html               #
-#-------------------------------------------------------------------------------#
-version: "1.0"
-
-#Specify inspection profile for code analysis
-profile:
-  name: qodana.starter
-
-#Enable inspections
-#include:
-#  - name:
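For context on the `check_times` hunk in `public.py` above: the public provider keeps a per-user daily counter (`UserToday`) and resets it when the date changes. Below is a simplified, cache-free sketch of that logic — the real code persists the counter through the cache layer as a pydantic model and uses slightly different boundary checks:

```python
from datetime import datetime

QUOTA = 24  # default `public_quota`, matching the diff


class UserToday:
    """Per-user daily usage record; mirrors the pydantic model in public.py."""

    def __init__(self, count: int = 0):
        self.count = count
        self.time = int(datetime.now().strftime("%Y%m%d"))  # today's date key


def check_times(store: dict, uid: str, times: int = QUOTA) -> bool:
    """Return True while `uid` is under today's quota, incrementing the counter."""
    today = int(datetime.now().strftime("%Y%m%d"))
    data = store.get(uid)
    if data is None or data.time != today:
        # First request today (or the date changed): reset the counter.
        store[uid] = UserToday()
        return True
    if data.count >= times:
        return False  # flood protection triggered
    data.count += 1
    return True


store: dict = {}
# With a quota of 2, the first three calls pass (reset + two increments),
# then flood protection kicks in.
results = [check_times(store, "uid-1", times=2) for _ in range(4)]
```

Keeping the date inside the stored record (rather than expiring the key) is what lets the real implementation reset lazily on the user's first request of a new day.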