
[torch.onnx.export] Fix onnx export on big endian machines #167816

Closed
tungld wants to merge 2 commits into pytorch:main from tungld:fix-onnx-export-big-endian

Conversation

tungld (Contributor) commented Nov 14, 2025

On big-endian machines, constant values in the exported ONNX model are still stored in big-endian byte order; they need to be converted to little-endian to comply with the ONNX specification.

This PR fixes the issue by calling the superclass methods of ir.Tensor, which already handle endianness correctly.
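The general idea behind the fix can be sketched with NumPy (this is an illustrative sketch of the endianness normalization, not the actual PR code — the PR delegates to onnx_ir instead):

```python
import sys
import numpy as np

def tensor_bytes_le(arr: np.ndarray) -> bytes:
    """Serialize an array's raw data in little-endian byte order,
    regardless of the host machine's native endianness."""
    if sys.byteorder == "big":
        # byteswap() swaps the bytes of each element; view the result
        # with an explicit little-endian dtype so the dtype metadata
        # matches the new byte layout.
        arr = arr.byteswap().view(arr.dtype.newbyteorder("<"))
    return arr.tobytes()

# float32(1.0) has bit pattern 0x3F800000; its little-endian raw
# bytes are 00 00 80 3F on any host.
print(tensor_bytes_le(np.array([1.0], dtype=np.float32)).hex())  # 0000803f
```

Without the byte swap, a big-endian host would emit `3f800000` instead, which ONNX consumers would misinterpret.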

pytorch-bot bot commented Nov 14, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/167816

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit b65f821 with merge base 8f16199:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the release notes: onnx torch.onnx related changes that should show up in the release notes label Nov 14, 2025
The review comment below is attached to this hunk of the new tobytes() override:

        ).from_address(tensor.data_ptr())

    def tobytes(self) -> bytes:
        # On big-endian machines, call the super's tobytes(), which returns a little-endian result.
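The pattern under discussion is a subclass delegating to its parent's serialization on big-endian hosts. A minimal self-contained sketch of that delegation (the class names and the parent's behavior are illustrative assumptions, not the actual onnx_ir API):

```python
import sys

class BaseTensor:
    """Stand-in for ir.Tensor: its tobytes() is assumed to already
    emit little-endian bytes on any host."""
    def __init__(self, le_data: bytes):
        self._le_data = le_data

    def tobytes(self) -> bytes:
        return self._le_data

class TorchTensor(BaseTensor):
    def tobytes(self) -> bytes:
        if sys.byteorder == "big":
            # On big-endian machines, fall back to the parent's
            # implementation, which handles the byte swap.
            return super().tobytes()
        # On little-endian machines, a zero-copy fast path could read
        # the tensor's memory directly (omitted in this sketch).
        return self._le_data

print(TorchTensor(b"\x00\x00\x80\x3f").tobytes().hex())  # 0000803f
```

The point of delegating rather than reimplementing the swap is that the parent class already covers the per-dtype details.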
justinchuby (Collaborator):

Does this work for types like bfloat16?

tungld (Contributor, Author):

@justinchuby I just tested with the following simple example, and it works for bfloat16.

import torch
import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.tensor([5.0], dtype=torch.bfloat16))

    def forward(self, x):
        return x * self.weight
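The expected bytes for that bfloat16 weight can be checked by hand, independent of torch: bfloat16 keeps the top 16 bits of the float32 bit pattern, and the ONNX little-endian encoding is fully determined by those bits. A small pure-Python sketch (truncation is used here for simplicity; actual bfloat16 casts may round to nearest):

```python
import struct

def float_to_bfloat16_le(x: float) -> bytes:
    """Truncate a float to bfloat16 (the top 16 bits of its float32
    representation) and return the two bytes in little-endian order."""
    (bits,) = struct.unpack(">I", struct.pack(">f", x))  # float32 bit pattern
    bf16 = bits >> 16  # bfloat16 keeps the high 16 bits
    return struct.pack("<H", bf16)

# float32(5.0) = 0x40A00000, so bfloat16(5.0) = 0x40A0,
# stored little-endian as bytes a0 40.
print(float_to_bfloat16_le(5.0).hex())  # a040
```

So for the model above, the exported initializer's raw data should contain `a0 40`; on a big-endian host without the fix it would come out as `40 a0`.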

titaiwangms (Collaborator):

Could you fix the lint?

@mikaylagawarecki mikaylagawarecki added the triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) label Nov 19, 2025
tungld (Contributor, Author) commented Nov 19, 2025

> Could you fix the lint?

@titaiwangms I didn't touch the file torch/_inductor/scheduler.py, and I don't know why my change is related to that scheduler. Do you have any suggestions for fixing it?

titaiwangms (Collaborator):

> I didn't touch the file torch/_inductor/scheduler.py, and don't know why my change is relevant to that scheduler. Do you have any suggestion to fix it?

Try a rebase, maybe?

Signed-off-by: Tung D. Le <tung@jp.ibm.com>
Signed-off-by: Tung D. Le <tung@jp.ibm.com>
tungld force-pushed the fix-onnx-export-big-endian branch from 7cf68a8 to b65f821 on November 19, 2025 at 02:28
tungld (Contributor, Author) commented Nov 19, 2025

@titaiwangms Thanks for the suggestion! I rebased it. Hope it works.

titaiwangms (Collaborator):

@pytorchbot merge

pytorch-bot bot commented Nov 19, 2025

This PR needs to be approved by an authorized maintainer before merge.

titaiwangms (Collaborator):

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Nov 19, 2025
pytorchmergebot (Collaborator):

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status.


Labels

- ciflow/trunk: Trigger trunk jobs on your pull request
- Merged
- open source
- release notes: onnx: torch.onnx related changes that should show up in the release notes
- triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

6 participants