
Conversation

@yjmm10
Contributor

@yjmm10 yjmm10 commented Aug 11, 2025

Support installing vLLM with CUDA 11.8 and CUDA 12.8.

yjmm10 added 2 commits August 11, 2025 15:13
support install vllm with cuda 11.8 and cuda 12.8
support vllm=0.10.0, and test in RTX 5090
@yjmm10
Contributor Author

yjmm10 commented Aug 11, 2025

#71

@ygfrancois
Collaborator

Support installing vLLM with CUDA 11.8 and CUDA 12.8.

The official image here is built with vllm 0.9.1. Although the code on HF already supports vllm 0.10.0, I'm worried that using the original image may cause problems. Should we keep the original 0.9.1 setting as the default here, and also provide an option to choose based on the environment version?
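A minimal sketch of what such an environment-based option could look like. The helper name `vllm_install_cmd`, the exact pinned versions (0.9.1 as the default from the official image, 0.10.0 as the build this PR reports testing on an RTX 5090), and the `cu118`/`cu128` PyTorch extra-index URLs are illustrative assumptions; verify them against the vLLM installation docs before relying on this.

```shell
#!/bin/sh
# Hypothetical helper: map a detected CUDA version to a vLLM install command.
# Version pins and wheel index URLs below are illustrative, not authoritative.
vllm_install_cmd() {
  case "$1" in
    11.8*) # default setting, matching the official vllm 0.9.1 image
      echo "pip install vllm==0.9.1 --extra-index-url https://download.pytorch.org/whl/cu118" ;;
    12.8*) # newer build reported as tested on an RTX 5090 in this PR
      echo "pip install vllm==0.10.0 --extra-index-url https://download.pytorch.org/whl/cu128" ;;
    *)
      echo "unsupported CUDA version: $1" >&2
      return 1 ;;
  esac
}

# In a real setup script the version would come from the toolchain, e.g.:
#   cuda_ver=$(nvcc --version | sed -n 's/.*release \([0-9.]*\).*/\1/p')
vllm_install_cmd "12.8"
```

Printing the command (rather than running `pip` directly) keeps the selection logic testable and lets the user review it first; a wrapper could `eval` the result once the mapping is confirmed.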

@ygfrancois ygfrancois closed this Oct 31, 2025