vllm test build #166146

Closed

yangw-dev wants to merge 6 commits into main from fixvllmoct

Conversation

@yangw-dev (Contributor) commented Oct 23, 2025

Fix the vllm test build; it's broken due to the flashinfer dependency.

@pytorch-bot bot commented Oct 23, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/166146

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 153e300 with merge base 8625ffb:

UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the topic: not user facing topic category label Oct 23, 2025
@@ -296,7 +300,7 @@ ARG torch_cuda_arch_list='8.0;8.9;9.0a;10.0a;12.0'
ENV TORCH_CUDA_ARCH_LIST=${torch_cuda_arch_list}

ARG FLASHINFER_GIT_REPO="https://github.com/flashinfer-ai/flashinfer.git"
@yangw-dev (Contributor, Author) commented Oct 23, 2025
@huydhn
It seems like `uv pip install --system /wheels/vllm/*.whl` searches for flashinfer 0.4.0 automatically. I might move the flashinfer wheel build before the vllm wheel build.
I wonder if building flashinfer from source is no longer necessary? I have some doubt about it.
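The reordering discussed above could look roughly like the following Dockerfile sketch. This is an illustration only, not the PR's actual change: the stage layout, wheel paths, and build command are assumptions.

```dockerfile
# Sketch: build the flashinfer wheel *before* installing the vllm wheel,
# so the dependency resolves from the local /wheels directory.
ARG FLASHINFER_GIT_REPO="https://github.com/flashinfer-ai/flashinfer.git"
ARG FLASHINFER_GIT_REF="v0.4.1"

RUN git clone --recursive --depth 1 --branch ${FLASHINFER_GIT_REF} \
        ${FLASHINFER_GIT_REPO} flashinfer \
    && cd flashinfer \
    && python -m build --wheel --outdir /wheels/flashinfer .

# Install the locally built flashinfer wheel first; then the vllm wheel's
# flashinfer requirement is already satisfied and is not fetched from PyPI.
RUN uv pip install --system /wheels/flashinfer/*.whl \
    && uv pip install --system /wheels/vllm/*.whl
```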


ARG FLASHINFER_GIT_REPO="https://github.com/flashinfer-ai/flashinfer.git"
ARG FLASHINFER_GIT_REF="v0.2.14.post1"
ARG FLASHINFER_GIT_REF="v0.4.1"
@huydhn (Contributor) commented Oct 23, 2025

This might not work, IIRC; you can do it later in the context of #164562 if you only want to fix trunk. We won't need to build flashinfer after #165274 (comment) lands and can just pip install it from PyPI.
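The suggestion of skipping the source build entirely might look like this hypothetical Dockerfile fragment; the PyPI package name and version pin are assumptions, not taken from this PR:

```dockerfile
# Sketch: instead of cloning and compiling flashinfer from source,
# install a prebuilt wheel from PyPI (package name assumed to be
# flashinfer-python; verify against the actual PyPI listing).
ARG FLASHINFER_VERSION="0.4.1"
RUN uv pip install --system "flashinfer-python==${FLASHINFER_VERSION}"
```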

@yangw-dev (Contributor, Author) replied:

Ah, I see, sounds good!

@huydhn (Contributor) left a review.
@yangw-dev yangw-dev requested a review from huydhn October 23, 2025 22:21
@yangw-dev yangw-dev marked this pull request as ready for review October 24, 2025 17:03
@yangw-dev yangw-dev requested a review from a team as a code owner October 24, 2025 17:03
@yangw-dev (Contributor, Author) commented:

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Oct 24, 2025
@pytorchmergebot (Collaborator) commented:

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.


Labels

ciflow/trunk (Trigger trunk jobs on your pull request), ciflow/vllm, topic: not user facing (status: Merged)

3 participants