
Support SymInt placeholder in wrapper fxir #167757

Closed

nandesuka wants to merge 1 commit into pytorch:main from nandesuka:export-D86984100

Conversation

@nandesuka (Contributor) commented Nov 13, 2025

Summary:
Add support for SymInt placeholders in wrapper FX IR.

Added two test cases with dynamic reshape (a standalone sketch follows below):

  • dynamic info coming from tensor metadata (tmd) on the placeholders
  • dynamic info coming from the placeholders themselves (SymInts)

Test Plan:
test_reshape_dynamic_ph
test_reshape_dynamic_tmd

Differential Revision: D86984100
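
A minimal, self-contained sketch of the dynamic-reshape pattern these tests exercise (illustrative only, not the PR's test code; the module name and dim name are made up, assuming torch.export's Dim API):

import torch

class Reshape(torch.nn.Module):
    def forward(self, x):
        # x.shape[0] is a SymInt once dim 0 is marked dynamic, so the
        # reshape target size flows through the graph symbolically
        return x.reshape(x.shape[0], -1)

ep = torch.export.export(
    Reshape(),
    (torch.randn(4, 8, 2),),
    dynamic_shapes={"x": {0: torch.export.Dim("batch")}},
)
print(ep.graph)  # placeholder nodes carry symbolic sizes in their tensor metadata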

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov @coconutruben


meta-codesync bot commented Nov 13, 2025

@nandesuka has exported this pull request. If you are a Meta employee, you can view the originating Diff in D86984100.


pytorch-bot bot commented Nov 13, 2025

The label module: inductor is only applicable to issues and has been removed. Please only use this label on issues.


pytorch-bot bot commented Nov 13, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/167757

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (2 Unrelated Failures)

As of commit 0badd5d with merge base 9ff95f6:

BROKEN TRUNK - The following job failed but was present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.


nandesuka added a commit to nandesuka/pytorch that referenced this pull request Nov 14, 2025

nandesuka added a commit to nandesuka/pytorch that referenced this pull request Nov 14, 2025

nandesuka added a commit to nandesuka/pytorch that referenced this pull request Nov 14, 2025
@blaine-rister (Contributor) left a comment

LGTM, but I left a few style comments. I think there are a few linter issues as well.

if not config.fx_wrapper:
    # Replace non-tensor inputs with Nones
    # constant scalars are not being used anyways by the graph
    # symbolic scalar (symint/symfloat) are not supported in non-fx_wrapper mode

typo

Suggested change
- # symbolic scalar (symint/symfloat) are not supported in non-fx_wrapper mode
+ # symbolic scalars (symint/symfloat) are not supported in non-fx_wrapper mode

Comment on lines 2553 to 2554

if hasattr(fi, "device") and hasattr(i, "device"):
    if fi.device != i.device:
@blaine-rister (Contributor) commented Nov 14, 2025

In most pytorch code, I'm used to seeing isinstance instead of hasattr. That allows mypy to do better type checking. It seems like we could change the existing assert to a conditional.

Suggested change
- if hasattr(fi, "device") and hasattr(i, "device"):
-     if fi.device != i.device:
+ if isinstance(i, torch.Tensor) and fi.device != i.device:
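
A hypothetical, standalone illustration of the type-narrowing point (the helper name is made up; only the isinstance pattern is from the suggestion above):

import torch

def devices_match(fi: object, i: object) -> bool:
    # isinstance narrows fi and i to torch.Tensor, so mypy accepts .device;
    # hasattr(fi, "device") would leave both typed as object and fail the check
    if isinstance(fi, torch.Tensor) and isinstance(i, torch.Tensor):
        return fi.device == i.device
    return True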

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Nov 14, 2025
nandesuka added a commit to nandesuka/pytorch that referenced this pull request Nov 14, 2025
Summary:

add support for symint placeholders

added two test cases with dynamic reshape
- dynamic info coming from tmd on placeholders
- dynamic info coming from placeholders (symints)

Test Plan:
test_reshape_dynamic_ph
test_reshape_dynamic_tmd

Reviewed By: blaine-rister

Differential Revision: D86984100
Comment on lines 2549 to 2553

if fi is not None:
    assert isinstance(i, torch.Tensor)
    if fi.device != i.device:
        raise ValueError(
            f"Device mismatch between fake input and example input at position #{idx}: "
            f"{fi.device} vs {i.device}. If the model was exported via torch.export(), "
            "make sure torch.export() and torch.aot_compile() run on the same device."
        )
if isinstance(fi, torch.Tensor) and isinstance(i, torch.Tensor):

nit:

if fi is not None and isinstance(fi, torch.Tensor):
    assert isinstance(i, torch.Tensor)
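
Since isinstance(fi, torch.Tensor) is already False for None, the explicit None check is implied; a compressed sketch of the resulting flow (validate_input is a hypothetical wrapper, not PR code):

import torch

def validate_input(fi: object, i: object, idx: int) -> None:
    # isinstance rules out None on its own, so one check covers both
    # conditions from the nit above
    if isinstance(fi, torch.Tensor):
        assert isinstance(i, torch.Tensor)
        if fi.device != i.device:
            raise ValueError(
                f"Device mismatch between fake input and example input at "
                f"position #{idx}: {fi.device} vs {i.device}."
            )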

@facebook-github-bot (Contributor) commented:
@pytorchbot merge

(Initiating merge automatically since Phabricator Diff has merged)

@pytorchmergebot (Collaborator) commented:
Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here
