
Conversation

@wanchaol (Collaborator) commented Dec 13, 2022

This PR moves OpSchema and its related types into a separate file to better resolve circular dependencies. It is part of a refactor of the dispatching logic to enable more complicated features.

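For context, moving shared types into a leaf module is the standard way to break an import cycle in Python: the extracted module imports nothing else from the package, so every other module can depend on it freely. A minimal sketch of the pattern; the file names, the OpSchema fields, and the dispatch/rules modules below are illustrative assumptions, not the actual torch.distributed._tensor layout:

# op_schema.py -- leaf module: it imports nothing else from the
# package, so both the dispatch logic and the sharding rules can
# import it without re-creating the cycle.
from dataclasses import dataclass
from typing import Any, Dict, Tuple

@dataclass
class OpSchema:
    """Illustrative stand-in for the operator call schema."""
    func_name: str
    args_schema: Tuple[Any, ...]
    kwargs_schema: Dict[str, Any]

# dispatch.py and the op-rules module previously imported from each
# other; after the move, each depends only on the leaf module:
#   from op_schema import OpSchema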

pytorch-bot bot commented Dec 13, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/90732

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 Failure

As of commit 679cff5:

NEW FAILURES - The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@wanchaol added the release notes: distributed (dtensor) label on Dec 20, 2022
@wanchaol changed the title from "[dtensor][2/N] move OpSchema and types to a separate file" to "[dtensor][3/N] move OpSchema and types to a separate file" on Jan 6, 2023
@wanchaol requested review from XilunWu, aazzolini, and fduwjj and removed the review request for pritamdamania87 on January 6, 2023

@XilunWu (Contributor) left a comment

LGTM

@wanchaol added the ciflow/trunk label on Jan 18, 2023

@wanchaol (Collaborator, Author)

@pytorchbot merge

@pytorchmergebot (Collaborator)

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@pytorchmergebot (Collaborator)

Merge failed

Reason: Command git -C /home/runner/work/pytorch/pytorch cherry-pick -x 23a70e17aa1fbadf74e911db65bf9ace038d5315 returned non-zero exit code 1

Auto-merging test/distributed/_tensor/test_device_mesh.py
Auto-merging torch/distributed/_tensor/device_mesh.py
CONFLICT (content): Merge conflict in torch/distributed/_tensor/device_mesh.py
error: could not apply 23a70e17aa... [dtensor][1/N] add __hash__ to device_mesh and dtensor_spec
hint: After resolving the conflicts, mark them with
hint: "git add/rm <pathspec>", then run
hint: "git cherry-pick --continue".
hint: You can instead skip this commit with "git cherry-pick --skip".
hint: To abort and get back to the state before "git cherry-pick",
hint: run "git cherry-pick --abort".
Details for Dev Infra team: raised by workflow job.

@wanchaol (Collaborator, Author)

@pytorchbot merge -f "failure not related"

@pytorchmergebot (Collaborator)

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@facebook-github-bot deleted the gh/wanchaol/236/head branch on June 8, 2023

Labels: ciflow/trunk · Merged · release notes: distributed (dtensor)

4 participants