Support SymInt placeholder in wrapper fxir #167757
nandesuka wants to merge 1 commit into pytorch:main
Conversation
@nandesuka has exported this pull request. If you are a Meta employee, you can view the originating Diff in D86984100.
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/167757
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (2 Unrelated Failures)
As of commit 0badd5d with merge base 9ff95f6:
BROKEN TRUNK - The following job failed but was present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Force-pushed from f86b6b9 to ac61dcd
Summary:
add support for symint placeholders

added two test cases with dynamic reshape
- dynamic info coming from tmd on placeholders
- dynamic info coming from placeholders (symints)

Test Plan:
test_reshape_dynamic_ph
test_reshape_dynamic_tmd

Differential Revision: D86984100
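For context, a minimal sketch (assumed module and dim names, not the PR's actual test code) of how a dynamic reshape picks up SymInt dynamic info after torch.export:

```python
import torch
from torch.export import Dim, export

class Reshape(torch.nn.Module):
    def forward(self, x):
        # output shape depends on the symbolic batch dimension
        return x.reshape(x.shape[0] * 2, -1)

x = torch.randn(4, 6)
# Marking dim 0 as dynamic makes the exported graph carry a SymInt for it,
# surfacing either in the placeholder's tensor metadata (tmd) or as an
# explicit SymInt placeholder input.
ep = export(Reshape(), (x,), dynamic_shapes={"x": {0: Dim("batch")}})
print(ep.graph_module.graph)
```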
Force-pushed from ac61dcd to 3d62352
Force-pushed from 3d62352 to 4f7f7e8
torch/_inductor/compile_fx.py
Outdated
```python
if not config.fx_wrapper:
    # Replace non-tensor inputs with Nones
    # constant scalars are not being used anyways by the graph
    # symbolic scalar (symint/symfloat) are not supported in non-fx_wrapper mode
```
typo
Suggested change:
```diff
-# symbolic scalar (symint/symfloat) are not supported in non-fx_wrapper mode
+# symbolic scalars (symint/symfloat) are not supported in non-fx_wrapper mode
```
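A sketch of what that block does, inferred from its comments (illustrative, not the exact source; the helper name is an assumption):

```python
import torch

def sanitize_inputs(example_inputs):
    # In non-fx_wrapper mode, non-tensor example inputs are nulled out:
    # constant scalars are unused by the graph, and symbolic scalars
    # (SymInt/SymFloat) are only handled when config.fx_wrapper is set.
    return [inp if isinstance(inp, torch.Tensor) else None for inp in example_inputs]
```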
torch/_inductor/compile_fx.py
Outdated
```python
if hasattr(fi, "device") and hasattr(i, "device"):
    if fi.device != i.device:
```
In most PyTorch code, I'm used to seeing isinstance instead of hasattr, which allows mypy to do better type checking. It seems like we could change the existing assert to a conditional.
Suggested change:
```diff
-if hasattr(fi, "device") and hasattr(i, "device"):
-    if fi.device != i.device:
+if isinstance(i, torch.Tensor) and fi.device != i.device:
```
Force-pushed from 4f7f7e8 to 49e98a5
Summary:
add support for symint placeholders

added two test cases with dynamic reshape
- dynamic info coming from tmd on placeholders
- dynamic info coming from placeholders (symints)

Test Plan:
test_reshape_dynamic_ph
test_reshape_dynamic_tmd

Reviewed By: blaine-rister

Differential Revision: D86984100
torch/_inductor/compile_fx.py
Outdated
```diff
-if fi is not None:
-    assert isinstance(i, torch.Tensor)
+if isinstance(fi, torch.Tensor) and isinstance(i, torch.Tensor):
     if fi.device != i.device:
         raise ValueError(
             f"Device mismatch between fake input and example input at position #{idx}: "
             f"{fi.device} vs {i.device}. If the model was exported via torch.export(), "
             "make sure torch.export() and torch.aot_compile() run on the same device."
         )
```
nit:
```python
if fi is not None and isinstance(fi, torch.Tensor):
    assert isinstance(i, torch.Tensor)
```
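Why this guard matters for the PR, as a hedged sketch (the function and argument names are assumptions): once SymInt placeholders are supported, some flat inputs are ints/SymInts rather than Tensors, so an unconditional assert would trip.

```python
import torch

def check_devices(flat_fake_inputs, flat_example_inputs):
    for idx, (fi, i) in enumerate(zip(flat_fake_inputs, flat_example_inputs)):
        # Skip non-tensor inputs (e.g. SymInt placeholders): only tensors
        # carry a device to compare.
        if isinstance(fi, torch.Tensor) and isinstance(i, torch.Tensor):
            if fi.device != i.device:
                raise ValueError(
                    f"Device mismatch between fake input and example input "
                    f"at position #{idx}: {fi.device} vs {i.device}."
                )
```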
Force-pushed from 49e98a5 to 0badd5d
@pytorchbot merge (Initiating merge automatically since Phabricator Diff has merged)
Merge started. Your change will be merged once all checks pass (ETA 0-4 Hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Summary:
add support for symint placeholders
added two test cases with dynamic reshape
- dynamic info coming from tmd on placeholders
- dynamic info coming from placeholders (symints)
Test Plan:
test_reshape_dynamic_ph
test_reshape_dynamic_tmd
Differential Revision: D86984100
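A sketch contrasting the two sources of dynamic shape info the test names refer to (the function signatures below are assumptions for illustration, not the PR's tests):

```python
import torch

# 1) "tmd": the dynamic size lives in the tensor placeholder's metadata;
#    the graph only takes tensors, and the SymInt is read off x.shape[0].
def reshape_dynamic_tmd(x: torch.Tensor) -> torch.Tensor:
    return x.reshape(x.shape[0], -1)

# 2) "ph": the dynamic size enters the graph as an explicit SymInt placeholder.
def reshape_dynamic_ph(x: torch.Tensor, s0: torch.SymInt) -> torch.Tensor:
    return x.reshape(s0, -1)
```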
cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @chenyang78 @kadeng @muchulee8 @amjames @chauhang @aakhundov @coconutruben