
Conversation

Collaborator

@wanchaol wanchaol commented Nov 1, 2022

This PR moves the core DTensor abstraction to the torch.distributed._tensor
folder, which includes the following (a brief usage sketch follows after this list):
1. DeviceMesh
2. Placement types
3. DTensor class
4. dispatching logic
5. redistribute logic

[ghstack-poisoned]
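For context, a minimal sketch of how the relocated pieces fit together after this move. It assumes torch.distributed is already initialized (e.g. launched with torchrun) and CUDA devices are available; the names used are the ones exported from torch.distributed._tensor in this PR, though exact signatures may shift in later releases.

```python
import torch
import torch.distributed as dist
from torch.distributed._tensor import (
    DeviceMesh,
    Replicate,
    Shard,
    distribute_tensor,
)

# Build a 1-D device mesh over all ranks (assumes init_process_group has
# already run, e.g. via torchrun).
mesh = DeviceMesh("cuda", list(range(dist.get_world_size())))

# Placement types describe how a tensor lives on the mesh:
# Shard(0) splits along dim 0, Replicate() keeps a full copy on every rank.
local = torch.randn(8, 16)
dtensor = distribute_tensor(local, device_mesh=mesh, placements=[Shard(0)])

# Ordinary torch ops on a DTensor go through the sharding-aware dispatching
# logic; redistribute changes the placement (here: sharded -> replicated).
doubled = dtensor + dtensor
replicated = doubled.redistribute(mesh, [Replicate()])
```

The last two lines are the dispatching and redistribute logic listed above doing their work.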

pytorch-bot bot commented Nov 1, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/88176

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit dabbdfc:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

…e distributed"

This PR moves the core DTensor abstraction and high level APIs to
torch.distributed._tensor folder, which includes the following:
1. DeviceMesh
2. Placement types
3. DTensor class
4. high level APIs (distribute_tensor/module)
5. dispatching logic
6. redistribute logic

[ghstack-poisoned]
@wanchaol wanchaol changed the title from "[dtensor] PART 1: move core DTensor abstraction to core distributed" to "[dtensor] PART 1: move DTensor abstraction and APIs to core distributed" on Nov 2, 2022
…e distributed"

This PR moves the core DTensor abstraction and high level APIs to
torch.distributed._tensor folder, which includes the following:
1. DTensor class
2. high level APIs (distribute_tensor/module)
3. dispatching logic
4. redistribute logic

[ghstack-poisoned]
@wanchaol wanchaol changed the title from "[dtensor] PART 1: move DTensor abstraction and APIs to core distributed" to "[dtensor] PART 2: move DTensor abstraction and APIs to core distributed" on Nov 6, 2022
@wanchaol wanchaol added the release notes: distributed (dtensor) label on Nov 6, 2022
Contributor

@fduwjj fduwjj left a comment


LGTM

…e distributed"


This PR moves the core DTensor abstraction and high level APIs to
the torch.distributed._tensor folder, which includes the following (a distribute_module sketch follows after this list):
1. DTensor class
2. high level APIs (distribute_tensor/module)
3. dispatching logic
4. redistribute logic

part of #88838

[ghstack-poisoned]
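Since the high level APIs now also cover distribute_module, here is a hedged sketch of module-level usage. shard_params is a hypothetical partition function written for illustration, not part of the API, and the 2-GPU mesh is an assumption.

```python
import torch.nn as nn
from torch.distributed._tensor import (
    DeviceMesh,
    Shard,
    distribute_module,
    distribute_tensor,
)

def shard_params(name, module, mesh):
    # Hypothetical partition_fn: shard every parameter of this submodule
    # along dim 0 of the device mesh.
    for pname, param in module.named_parameters(recurse=False):
        sharded = nn.Parameter(distribute_tensor(param, mesh, [Shard(0)]))
        module.register_parameter(pname, sharded)

mesh = DeviceMesh("cuda", [0, 1])  # assumes two GPUs
model = nn.Linear(16, 16)
# distribute_module walks the module tree and applies partition_fn to each
# submodule, so the resulting model holds DTensor parameters.
sharded_model = distribute_module(model, mesh, partition_fn=shard_params)
```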
kulinseth pushed a commit to kulinseth/pytorch that referenced this pull request Dec 10, 2022
…ed (pytorch#88176)

This PR moves the core DTensor abstraction and high level APIs to
torch.distributed._tensor folder, which includes the following:
1. DTensor class
2. high level APIs (distribute_tensor/module)
3. dispatching logic
4. redistribute logic

part of pytorch#88838
Pull Request resolved: pytorch#88176
Approved by: https://github.com/fduwjj
@facebook-github-bot facebook-github-bot deleted the gh/wanchaol/204/head branch June 8, 2023 19:05