harden backed_size_oblivious and broadcast_shapes #167232
laithsakka wants to merge 7 commits into gh/laithsakka/324/base
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/167232
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 5559787 with merge base d144382.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
We probably need something similar for expand cc ezyang EikanWang jgong5 wenzhe-nrv [ghstack-poisoned]
torch/_refs/__init__.py
Outdated
```
if a == 1 and b != 1:
    torch._check(shape[idx] == 1)
if b == 1 and a != 1:
    torch._check(common_shape[idx] == 1)
```
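For context on what these checks do: torch._check asserts a condition at runtime (and records it as a runtime assert when tracing), so the broadcast decision taken at trace time is pinned. A minimal standalone sketch with toy values, not the actual broadcast_shapes code:

```
import torch

def pin_broadcast_decision(dim_size: int) -> None:
    # If we decided at trace time that this dimension broadcasts (size 1),
    # torch._check makes that a hard requirement: it raises RuntimeError
    # when the condition is false, and records a runtime assert under tracing.
    torch._check(dim_size == 1, lambda: f"expected size 1, got {dim_size}")

pin_broadcast_decision(1)    # passes silently
# pin_broadcast_decision(4)  # would raise RuntimeError
```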
Just throwing a possible scenario at this. Suppose you have this simple model with backed_size_oblivious:
```
def forward(user, items):
    broadcasted = user.expand_as(items)
    ...
```
Now, take these three cases. If we trace with one of the 3 cases, will the compiled model still work on all 3 cases at runtime?
```
# case 1
user:  [1, 1024]
items: [32, 1024]

# case 2
user:  [32, 1024]
items: [1, 1024]

# case 3
user:  [32, 1024]
items: [32, 1024]
```
I would say no: those require three different compilations, unless we were able to make #165108 work, at least at the export layer. I don't know if I trust Inductor, though.
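For concreteness, a minimal eager-mode sketch of the three cases above (plain PyTorch, no compilation; shapes taken from the cases listed): the broadcasting decision differs per case, and case 2 does not even run in eager, which is why one trace cannot cover all three.

```
import torch

def forward(user, items):
    return user.expand_as(items)

cases = {
    "case 1": (torch.randn(1, 1024), torch.randn(32, 1024)),
    "case 2": (torch.randn(32, 1024), torch.randn(1, 1024)),
    "case 3": (torch.randn(32, 1024), torch.randn(32, 1024)),
}
for name, (user, items) in cases.items():
    try:
        print(name, "->", tuple(forward(user, items).shape))
    except RuntimeError as e:
        # expand cannot shrink a non-singleton dimension (32 -> 1)
        print(name, "-> RuntimeError:", str(e).splitlines()[0])
```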
Okay, from our offline discussion: it's possible to get through export, but Inductor will face this same scenario and must know whether to broadcast or not.
I want to clarify a meta-point here. The semantics change from unbacked IS INTENTIONAL. If I have a program x + y where x: f32[u0] and y: f32[4], the idea behind unbacked is that the only valid runtime value for u0 here is u0 = 4. In eager, u0 = 1 would lead to broadcasting and would not raise an error, but this is kind of an insane thing to do, and we would rather just immediately error at compile time (and force the user to do something like torch.cond to indicate whether they want to broadcast or not).
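A sketch of the torch.cond pattern mentioned above, with made-up shapes and function names; it runs in eager, though whether a particular export/compile path accepts it exactly as written is a separate question.

```
import torch

def add_with_explicit_intent(x, y):
    # Make the broadcast-vs-not decision explicit instead of letting the
    # framework infer it from a possibly-unbacked size. Branch functions
    # take the operands explicitly and must be side-effect free.
    def broadcast_branch(x, y):
        return x.expand_as(y) + y

    def no_broadcast_branch(x, y):
        return x + y

    return torch.cond(x.shape[0] == 1, broadcast_branch, no_broadcast_branch, (x, y))

print(add_with_explicit_intent(torch.randn(1), torch.randn(4)).shape)  # torch.Size([4])
print(add_with_explicit_intent(torch.randn(4), torch.randn(4)).shape)  # torch.Size([4])
```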
torch/_refs/__init__.py
Outdated
```
a = size_hint(shape[idx], allow_none=True)
b = size_hint(common_shape[idx], allow_none=True)
```
nit: pull this down underneath the backed_size_oblivious block, with a comment too
@ColinPeppler addressed all comments
test/test_dynamic_shapes.py
Outdated
```
        self.assertEqual(cnt.frame_count, 2)


instantiate_parametrized_tests(TestUnbacked)
```
Dunno if it belongs at file scope; otherwise the indentation could throw things off.
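For reference, the usual pattern in PyTorch test files is to call instantiate_parametrized_tests once at module scope, unindented, after the class body. A minimal sketch; the class body below is invented and is not the real test_dynamic_shapes.py:

```
from torch.testing._internal.common_utils import (
    TestCase,
    instantiate_parametrized_tests,
    parametrize,
    run_tests,
)

class TestUnbacked(TestCase):
    @parametrize("backed_size_oblivious", [False, True])
    def test_broadcast_shapes(self, backed_size_oblivious):
        # placeholder body; the real assertions live in test/test_dynamic_shapes.py
        self.assertIsInstance(backed_size_oblivious, bool)

# called once at module scope, after the class definition
instantiate_parametrized_tests(TestUnbacked)

if __name__ == "__main__":
    run_tests()
```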
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: 1 mandatory check(s) failed. Dig deeper by viewing the failures on hud.
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Summary: Support these semantics for expand when using backed_size_oblivious, similar to #167232.

We see errors in a model exported with dynamic shapes, like

```
RuntimeError: non-broadcasting semantics require s67 == 41

While executing %expand : [num_users=1] = call_method[target=expand](args = (%reshape_5, -1, -1, %getitem_9), kwargs = {})
```

Test Plan: test_dynamic_shapes:

```
test_backed_size_oblivious_expand (test_dynamic_shapes.TestUbackedOps) ... I1112 14:07:54.724596 1386932 Logger.cpp:995] Dropping logs in unit tests.
ok
```

Differential Revision: D86902546

Pull Request resolved: #167689
Approved by: https://github.com/laithsakka
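For readers unfamiliar with the pattern in the failing %expand node: -1 in expand means "keep this dimension", and a size-1 dimension may be expanded to the requested size. A tiny eager sketch with made-up shapes, unrelated to the actual model:

```
import torch

x = torch.randn(2, 3, 1)
n = 41  # stands in for the data-dependent %getitem_9 value
y = x.expand(-1, -1, n)  # -1 keeps existing sizes; the trailing size-1 dim broadcasts to n
print(y.shape)  # torch.Size([2, 3, 41])
```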
We probably need something similar for expand

Pull Request resolved: pytorch#167232
Approved by: https://github.com/ColinPeppler
Stack from ghstack (oldest at bottom):
We probably need something similar for expand
cc @ezyang @EikanWang @jgong5 @wenzhe-nrv