Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/166146
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (1 Unrelated Failure)
As of commit 153e300 with merge base 8625ffb:
UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
```diff
@@ -296,7 +300,7 @@ ARG torch_cuda_arch_list='8.0;8.9;9.0a;10.0a;12.0'
 ENV TORCH_CUDA_ARCH_LIST=${torch_cuda_arch_list}

 ARG FLASHINFER_GIT_REPO="https://github.com/flashinfer-ai/flashinfer.git"
```
@huydhn
It seems like `uv pip install --system /wheels/vllm/*.whl` searches for flashinfer 0.4.0 automatically. I might move the flashinfer wheel build before the vllm wheel build.
I wonder if building flashinfer from source is even needed anymore? A little doubtful about it.
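The reordering suggested above could look roughly like this in the Dockerfile — a sketch only: the clone/build steps, wheel paths, and the `--no-deps` flag are illustrative assumptions, not the actual CI config.

```dockerfile
# Sketch: build the flashinfer wheel BEFORE installing the vllm wheel,
# so the dependency is satisfied from the local /wheels directory
# instead of pip resolving flashinfer 0.4.0 from the index.
ARG FLASHINFER_GIT_REPO="https://github.com/flashinfer-ai/flashinfer.git"
ARG FLASHINFER_GIT_REF="v0.2.14.post1"
RUN git clone --depth 1 --branch ${FLASHINFER_GIT_REF} ${FLASHINFER_GIT_REPO} \
    && cd flashinfer \
    && pip wheel --no-deps -w /wheels/flashinfer .

# Install the locally built flashinfer wheel together with the vllm
# wheel so the resolver picks up the local copy.
RUN uv pip install --system /wheels/flashinfer/*.whl /wheels/vllm/*.whl
```

Installing both wheels in one `uv pip install` invocation lets the resolver see the local flashinfer before it falls back to the index.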
.github/ci_configs/vllm/Dockerfile (outdated)

```diff
 ARG FLASHINFER_GIT_REPO="https://github.com/flashinfer-ai/flashinfer.git"
-ARG FLASHINFER_GIT_REF="v0.2.14.post1"
+ARG FLASHINFER_GIT_REF="v0.4.1"
```
This might not work IIRC; you can do it later in the context of #164562 if you only want to fix trunk. We won't need to build flashinfer after #165274 (comment) lands and can just `pip install` it from PyPI.
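Once the PyPI route mentioned above lands, the build-from-source stage could shrink to a single install step. This is a hypothetical sketch: the package name `flashinfer-python` and the version pin are assumptions on my part, not taken from this PR.

```shell
# Hypothetical replacement for the flashinfer build-from-source stage:
# install a pinned wheel straight from PyPI instead.
# Package name and pin are assumed, not confirmed by this PR.
uv pip install --system flashinfer-python==0.2.14.post1
```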
ah i see, sounds good!
huydhn left a comment
Let's keep v0.2.14.post1 in this PR https://github.com/pytorch/pytorch/pull/166146/files#r2457590754
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Fix the vllm test build; it's broken due to the flashinfer dependency.