Conversation

@supriyar (Contributor) commented on Oct 13, 2020

Stack from ghstack:

Summary:
Previously, the type of GetAttr nodes was getting set incorrectly and did not match the module type.

Test Plan:
Existing quantization tests

Reviewers:

Subscribers:

Tasks:

Tags:

Differential Revision: D24279872
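
For context, here is a minimal sketch (not part of this PR; the Parent/Child modules are hypothetical, and it assumes the public torch.jit graph-inspection API) of how the type carried by prim::GetAttr nodes can be checked against the actual submodule type:

```python
import torch
import torch.nn as nn

class Child(nn.Module):
    def forward(self, x):
        return x + 1

class Parent(nn.Module):
    def __init__(self):
        super().__init__()
        self.child = Child()

    def forward(self, x):
        return self.child(x)

scripted = torch.jit.script(Parent())
for node in scripted.graph.nodes():
    if node.kind() == "prim::GetAttr":
        # node.s("name") is the attribute being fetched; node.output().type()
        # is the type recorded on the GetAttr node, which should match the
        # type of the corresponding submodule.
        print(node.s("name"), node.output().type())
```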

@supriyar requested a review from apaszke as a code owner on October 13, 2020 at 16:26
supriyar added a commit that referenced this pull request Oct 13, 2020
ghstack-source-id: c9f5919
Pull Request resolved: #46250
@facebook-github-bot added the oncall: jit label (Add this issue/PR to JIT oncall triage queue) on Oct 13, 2020
@jerryzh168 (Contributor) left a comment:

does it work?

@supriyar (Contributor, Author) replied:

does it work?

Yes, this fixed the bug where the InterfaceType wasn't getting set correctly. Thanks!
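
For readers who hit the same issue: below is a hypothetical sketch (the Processor/Doubler/Wrapper names are illustrative, not taken from this PR) of the module-interface pattern where the GetAttr node for the attribute should carry the declared InterfaceType rather than the concrete module type:

```python
import torch
import torch.nn as nn

@torch.jit.interface
class Processor(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        pass

class Doubler(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2

class Wrapper(nn.Module):
    proc: Processor  # declared as the interface, not the concrete Doubler

    def __init__(self):
        super().__init__()
        self.proc = Doubler()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proc(x)

scripted = torch.jit.script(Wrapper())
for node in scripted.graph.nodes():
    if node.kind() == "prim::GetAttr" and node.s("name") == "proc":
        # With the fix, this should report the InterfaceType declared above.
        print(node.output().type())
```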


dr-ci bot commented Oct 13, 2020

💊 CI failures summary and remediations

As of commit f56b463 (more details on the Dr. CI page):


  • 2/2 failures introduced in this PR

🕵️ 1 new failure recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_linux_bionic_py3_8_gcc9_coverage_test (1/1)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Oct 13 18:29:52 [E request_callback_no_python.cpp:581] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future
Oct 13 18:29:52 At: 
Oct 13 18:29:52   /opt/conda/lib/python3.8/site-packages/torch/distributed/rpc/internal.py(94): serialize 
Oct 13 18:29:52   /opt/conda/lib/python3.8/site-packages/torch/distributed/rpc/internal.py(146): serialize 
Oct 13 18:29:52  
Oct 13 18:29:52 [E request_callback_no_python.cpp:581] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Oct 13 18:29:52  
Oct 13 18:29:52 At: 
Oct 13 18:29:52   /opt/conda/lib/python3.8/site-packages/torch/distributed/rpc/internal.py(94): serialize 
Oct 13 18:29:52   /opt/conda/lib/python3.8/site-packages/torch/distributed/rpc/internal.py(146): serialize 
Oct 13 18:29:52  
Oct 13 18:29:52 [E request_callback_no_python.cpp:581] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Oct 13 18:29:52  
Oct 13 18:29:52 At: 
Oct 13 18:29:52   /opt/conda/lib/python3.8/site-packages/torch/distributed/rpc/internal.py(94): serialize 
Oct 13 18:29:52   /opt/conda/lib/python3.8/site-packages/torch/distributed/rpc/internal.py(146): serialize 
Oct 13 18:29:52  
Oct 13 18:29:52 [W tensorpipe_agent.cpp:504] RPC agent for worker3 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 
Oct 13 18:29:52 [W tensorpipe_agent.cpp:504] RPC agent for worker2 encountered error when reading incoming request from worker1: EOF: end of file (this is expected to happen during shutdown) 
Oct 13 18:29:52 [W tensorpipe_agent.cpp:504] RPC agent for worker0 encountered error when reading incoming request from worker3: EOF: end of file (this is expected to happen during shutdown) 
Oct 13 18:29:52 ok (1.541s) 
Oct 13 18:29:53   test_return_future_remote (__main__.TensorPipeRpcTestWithSpawn) ... [W tensorpipe_agent.cpp:504] RPC agent for worker3 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 

1 failure not recognized by patterns:

Job: CircleCI pytorch_cpp_doc_build | Step: Doc Build and Push


@facebook-github-bot (Contributor)

This pull request has been merged in 95ccf34.

facebook-github-bot pushed a commit that referenced this pull request Oct 15, 2020
Summary:
Fixes #45902 by reverting #42457

The test case introduced by #42457 was fixed by #46250, which I'm assuming is the real source of the bug.

In the future it would be good to provide repros for freezing issues without including a quantization dependency; there was another issue in freezing (see: #46054) whose root cause was the same quantization issue, #46250.

Pull Request resolved: #46285

Reviewed By: bdhirsh

Differential Revision: D24288739

Pulled By: eellison

fbshipit-source-id: b69ee8c713f749cd93d5eba370c3eafed86568bb
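
As an aside on the commit message above: a minimal, quantization-free freezing repro might look like the sketch below (hypothetical module; it assumes torch.jit.freeze applied to an eval-mode scripted module):

```python
import torch
import torch.nn as nn

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)

    def forward(self, x):
        return self.linear(x)

scripted = torch.jit.script(M()).eval()  # freezing requires eval mode
frozen = torch.jit.freeze(scripted)      # inlines submodules and bakes in attributes
print(frozen.graph)
```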
@facebook-github-bot deleted the gh/supriyar/198/head branch on October 17, 2020 at 14:15

Labels

Merged, oncall: jit (Add this issue/PR to JIT oncall triage queue)
