[Pytorch] add an option to disable TORCH_WARN and TORCH_WARN_ONCE log #87188
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/87188. Note: links to docs will display an error until the docs builds have been completed. ✅ No failures as of commit f6ce0cb. This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D40321550
Looks good to me, but there's a recent merge conflict.
Force-pushed 68ae27d to b51436f.
@kurtamohler mind taking another look? Thanks!
Looks good, thanks! Just to make sure you're aware, you can filter all warnings at the Python level like this:

```python
import warnings
warnings.filterwarnings('ignore')
```

But this also filters out all warnings in your project, not just the PyTorch ones, so I can see why this option is useful. There is a loose plan to add a warning filter to the C++ side of PyTorch (part of pytorch/rfcs#43). The details aren't decided yet, but we could potentially add a Python binding for it. If/when that exists, you could use that instead of having to build PyTorch yourself.
Thanks for the tip. We're consumers of the model and don't directly work on the Python code. We're working with our partner to fix/adjust the warnings, but it would be great if we also had a log/warning config on the C++ side.
Not sure if the workflow failure is caused by my change; is there anything I can do here?
Yeah these failures don't seem to be related. Try rebasing just to make sure |
Force-pushed b51436f to 42daaec.
Could anyone help me run the workflows?
Force-pushed 42daaec to 79b1d8c.
The two failing jobs actually do seem to be caused by your changes. I think I may have steered you wrong earlier; I had only seen the unrelated failures before. I guess the macros are maybe acting slightly differently on Windows builds? Not sure.
Force-pushed 79b1d8c to b22e888.
Force-pushed b22e888 to f0b7a8b.
…pytorch#87188)

Summary:
Pull Request resolved: pytorch#87188

Add an option to disable TORCH_WARN; some ops can trigger spammy TORCH_WARN logs, which is not desired in certain scenarios.

Test Plan:
Tested with -pt.disable_warn = 1 and -pt.disable_warn = 0; verified TORCH_WARN and TORCH_WARN_ONCE are properly handled.
Tested with -pt.strip_error_messages = 1, -pt.disable_warn = 0; verified the stripped error message is respected when the warning is printed.

Differential Revision: D40321550

fbshipit-source-id: 74e0997f7271336543bad2691e327006d6cda7f9
Force-pushed f0b7a8b to f6ce0cb.
@kurtamohler mind taking another look (and also triggering the workflow)? It indeed seems to be a subtle preprocessor bug in MSVC (gcc and clang work OK).
Gentle ping @kurtamohler ;)
@pytorchbot merge
Merge failed. Reason: approval needed from one of the following (Rule 'superuser'). Details for Dev Infra team: raised by workflow job.
Oh right, forgot we need @ezyang's approval before merging.
@pytorchbot merge |
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…pytorch#87188)

Summary: Add an option to disable TORCH_WARN; some ops can trigger spammy TORCH_WARN logs, which is not desired in certain scenarios.

Test Plan: Tested with -pt.disable_warn = 1 and -pt.disable_warn = 0; verified TORCH_WARN and TORCH_WARN_ONCE are properly handled. Tested with -pt.strip_error_messages = 1, -pt.disable_warn = 0; verified the stripped error message is respected when the warning is printed.

Differential Revision: D40321550
Pull Request resolved: pytorch#87188
Approved by: https://github.com/kurtamohler, https://github.com/ezyang