ChatGLM3's requirements.txt pins "transformers==4.40.0" and requires "vllm>=0.4.2", but the latest vllm (0.5.3) requires "transformers>=4.42.4" in its own requirements file, so the two constraints conflict. I suggest either relaxing ChatGLM3's transformers pin or adding an upper bound on the vllm version.
In addition, each vllm release places stricter constraints on the matching PyTorch version (tied to the CUDA build it links against), so consider either dropping ChatGLM3's PyTorch pin or updating it to match the PyTorch version required by the chosen vllm.
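As a rough illustration, the adjustment to requirements.txt could look like the sketch below; the exact bounds are assumptions and would need to be verified against each vllm release's own requirements file before being committed.

```text
# Hypothetical adjustment to ChatGLM3's requirements.txt -- the version bounds
# are illustrative assumptions, not verified compatible ranges.
transformers>=4.40.0,<4.43.0   # relax the hard ==4.40.0 pin
vllm>=0.4.2,<0.5.4             # cap vllm so its own transformers requirement stays satisfiable
# torch: consider dropping the explicit pin and letting vllm pull in the
# CUDA build it was compiled against.
```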
This codebase has seen less maintenance since GLM-4 came out; I'll take a look and fix this when I get a chance.
How do I remove it? My current problem is that I can run GLM, but I can't run api_service.py, and if I upgrade transformers it breaks the other way around!
So how should this version incompatibility be resolved? It seems vllm versions below 0.6.0 can no longer be installed now.
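One possible workaround, sketched below, is to request a capped vllm range explicitly and loosen the transformers pin so pip can resolve a version that satisfies vllm's own requirements. Whether a pre-0.6.0 vllm wheel is still available for your Python/CUDA combination is an assumption here and may be the underlying reason the install fails.

```sh
# Hypothetical workaround -- assumes a pre-0.6.0 vllm wheel still exists
# for your Python and CUDA versions.
pip install "vllm>=0.4.2,<0.6.0" "transformers>=4.40.0"
```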