
Conversation

@zasdfgbnm
Collaborator

I also opened a PR on cmake upstream: https://gitlab.kitware.com/cmake/cmake/-/merge_requests/5292

@dr-ci

dr-ci bot commented Sep 29, 2020

💊 CI failures summary and remediations

As of commit f63bd02 (more details on the Dr. CI page):


  • 2/2 failures possibly* introduced in this PR
    • 1/2 non-CircleCI failure(s)

🕵️ 1 new failure recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_windows_vs2019_py36_cuda10.1_test2 (1/1)

Step: "Test" (full log | diagnosis details | 🔁 rerun)

RuntimeError: test_jit_cuda_fuser failed!
  test_const (__main__.TestCudaFuser) ... ok (0.173s) 
  test_dynamic_size (__main__.TestCudaFuser) ... ok (0.188s) 
  test_half (__main__.TestCudaFuser) ... ok (0.216s) 
  test_pw_single_reduction_partition (__main__.TestCudaFuser) ... ok (0.218s) 
  test_random_topo (__main__.TestCudaFuser) ... skip (0.010s) 
  test_reduction (__main__.TestCudaFuser) ... Traceback (most recent call last): 
  File "run_test.py", line 746, in <module> 
    main() 
  File "run_test.py", line 729, in main 
    raise RuntimeError(err_message) 
RuntimeError: test_jit_cuda_fuser failed! 
 
(base) circleci@PACKER-5F5FCBA1 C:\Users\circleci\project\test>if ERRORLEVEL 1 exit /b 1  
+ cleanup
+ retcode=1
+ set +x

ci.pytorch.org: 1 failed


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions on the GitHub issue tracker or post in the (internal) Dr. CI Users group.


This comment has been revised 3 times.

@facebook-github-bot
Contributor

facebook-github-bot left a comment

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@ezyang
Contributor

ezyang commented Sep 29, 2020

Do you think the JIT CUDA fuser problem is related to the diff? Seems suspicious...

@zasdfgbnm
Collaborator Author

zasdfgbnm commented Sep 29, 2020

@ezyang I cannot reproduce this locally, so I don't think it's related.
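
(For reference, a rough sketch of how one might try to reproduce this locally, assuming a CUDA-enabled PyTorch source build; the run_test.py -i selector and the direct unittest invocation below are inferred from the log above, not commands taken from this thread:

    # Re-run the whole CUDA fuser JIT test file through the PyTorch test runner
    python test/run_test.py -i test_jit_cuda_fuser

    # Or target only the case that failed in CI
    python test/test_jit_cuda_fuser.py TestCudaFuser.test_reduction

Either invocation should surface the same RuntimeError seen in the CircleCI log if the failure is reproducible outside CI.)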

@facebook-github-bot
Contributor

@ezyang merged this pull request in 0a15646.

