
vllm 0.7.3 requires torch==2.5.1, but you have torch 2.4.0+cu124 which is incompatible. #27

Open
zhufz opened this issue Feb 25, 2025 · 1 comment

Comments


zhufz commented Feb 25, 2025

The README says any torch version above 2.4.0 is fine, but vllm 0.7.3 requires torch==2.5.1, so there is a conflict.

RogersSteve commented

You can install it using the commands given later in the Docker section: install vllm first, which will automatically install PyTorch for you, then follow the Docker commands step by step.
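A minimal sketch of that order of operations, assuming a pip-based environment (the exact torch pin comes from the error message in this issue's title; verify against the project's README before relying on it):

```shell
# Install vLLM first; pip resolves its pinned dependencies,
# so the matching PyTorch (torch==2.5.1 for vllm 0.7.3) is
# installed automatically, replacing any conflicting torch.
pip install vllm==0.7.3

# Confirm the resolved torch version before continuing with
# the Docker steps from the README.
python -c "import torch; print(torch.__version__)"
```

Installing vllm before any manual torch install avoids the reported conflict, since pip overwrites an incompatible preexisting torch with the pinned version.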
