Conversation

@anderspapitto
Contributor

No description provided.

@anderspapitto anderspapitto changed the title reenable some more Werror usage [WIP] reenable some more Werror usage Jul 19, 2018
@colesbury
Member

Wmaybe-uninitialized is pretty flaky, especially since we use a variety of GCC versions.

@ezyang
Contributor

ezyang commented Jul 20, 2018

Maybe we should run Werror only on a single compiler configuration; either the latest (less likely to have bugs) or the compiler that all the devs use (so it's easy to repro locally).

@anderspapitto
Contributor Author

@goldsborough thoughts on the comments of @colesbury and @ezyang ?

@anderspapitto anderspapitto changed the title [WIP] reenable some more Werror usage [WIP] remove unnecessary -Wno= flags Jul 20, 2018


@goldsborough
Copy link
Contributor

@colesbury do you have examples of the flakiness? I quenched this warning in all of ATen/PyTorch a couple of months ago and only had to add (void)x in a few places. The primary issue was when code was #ifdef-ed out. I think quenching the edge cases with (void)x would be fine. I remember places in the autograd where integers were not initialized to zero (usually member variables), and we're just kind of lucky they always ended up being zero.

I personally think keeping warnings on in all CI is better, because otherwise they'll just stay silent until someone builds with a particular configuration (e.g. on the FAIR cluster or on macOS) on their local machine, and then we'll have to fix the warning spew on those configurations anyway.

@anderspapitto anderspapitto changed the title [WIP] remove unnecessary -Wno= flags remove unnecessary -Wno= flags Jul 20, 2018
@anderspapitto
Contributor Author

I consider this diff ready to merge

@colesbury
Member

It doesn't work well with tagged enums like in Scalar.h. The current code doesn't issue a warning for very subtle reasons. Slight changes to the code (like moving a method inline) will break the build.

It also fails with things like

size[Dim - 1] *= t->size(i);

I'm not sure what (void)x has to do with -Wmaybe-uninitialized. maybe-uninitialized doesn't check for unused variables.

The heuristics used depend on compiler version. For example, locally with GCC 4.8.1 I see warnings about that BatchNormalization line, but later versions of GCC are slightly smarter. The reverse is often true too.

We work really hard to avoid flakiness in unit tests. We should take the same approach when dealing with compiler warnings, particularly when they are treated as errors.

The -Wuninitialized warning is much better. We have that on and should keep that on.

@goldsborough
Contributor

Ok, sounds like good reasons to disable the warning. My brain also confused maybe-uninitialized with maybe-unused, the (void)x argument doesn't apply -- my bad.

@anderspapitto please disable -Wmaybe-uninitialized

@anderspapitto
Contributor Author

reintroduced -Wno-maybe-uninitialized


@facebook-github-bot facebook-github-bot left a comment


@anderspapitto has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@ezyang
Contributor

ezyang commented Jul 23, 2018

@goldsborough @colesbury maybe-uninitialized is triggering on @resistor's PR at #9640, so it sounds like it is not currently disabled for all code.

@anderspapitto
Contributor Author

@ezyang looks like that's in ATen, which has a separate cmake file, so it's not related to this PR.

@anderspapitto anderspapitto force-pushed the cleanup-warnings branch 3 times, most recently from 83f5627 to ab62d38 Compare July 24, 2018 02:30
target_compile_options(torch PRIVATE
  # Considered to be flaky. See the discussion at
  # https://github.com/pytorch/pytorch/pull/9608
  -Wno-maybe-uninitialized)



@facebook-github-bot facebook-github-bot left a comment


@anderspapitto is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@anderspapitto anderspapitto deleted the cleanup-warnings branch July 26, 2018 18:10
jramseyer pushed a commit to jramseyer/pytorch that referenced this pull request Jul 30, 2018
Summary: Pull Request resolved: pytorch#9608

Differential Revision: D8946664

Pulled By: anderspapitto

fbshipit-source-id: b05f10af58da25b2a2588f7153f393bb3637f29a
goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018