Add inline call to profiling executor pipeline #43636
Conversation
💊 CI failures summary and remediations
As of commit f1e53ba (more details on the Dr. CI page): ✅ None of the CI failures appear to be your fault 💚
❄️ 1 failure tentatively classified as flaky, but reruns have not yet been triggered to confirm.
I'm not sure I fully understand the motivation for adding these passes (even after re-reading the description in the commit message). Could you please elaborate a bit more on why this wasn't needed before and why we need it now?
    void ProfilingGraphExecutorImpl::runProfilingInsensitiveOptimizations(
        std::shared_ptr<Graph>& graph) {
      Inline(*graph);
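For context, torch::jit::Inline (declared in torch/csrc/jit/passes/inliner.h) replaces prim::CallFunction and prim::CallMethod nodes with the bodies of their callees. A minimal sketch of the new pipeline shape, with the other passes elided and the function renamed to mark it as an illustration rather than the PR code:

    // Sketch only: the profiling-insensitive pipeline with the new Inline
    // call at the top. Inline flattens function/method calls so the passes
    // below see one flat graph instead of opaque prim::CallFunction nodes.
    #include <memory>
    #include <torch/csrc/jit/ir/ir.h>
    #include <torch/csrc/jit/passes/dead_code_elimination.h>
    #include <torch/csrc/jit/passes/inliner.h>

    void runProfilingInsensitiveOptimizationsSketch(
        std::shared_ptr<torch::jit::Graph>& graph) {
      torch::jit::Inline(*graph);           // new: inline calls first
      // ... other profiling-insensitive passes elided ...
      torch::jit::EliminateDeadCode(graph); // drop nodes orphaned by inlining
    }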
Nit: might be worth adding a comment on why we need it. I assume it's because we're now planning to generate function calls for fallbacks (if I didn't know that, I'd assume this pass was useless)?
We weren't running inlining in the forward graph of differentiable subgraphs at all.
Previously we would have:

    %514 : Function = prim::Constant[name="AD_mm_backward_self"]()
    grad_self.54 : Tensor = prim::CallFunction(%514, %868, %17) # <string>:204:28
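After Inline, the prim::CallFunction node is replaced by the callee's body, so later passes see the underlying aten ops instead of an opaque call. Roughly, as a sketch (the body of AD_mm_backward_self is assumed here to be grad.mm(mat2.t()), which is what its name suggests; value names are illustrative):

    %mat2_t : Tensor = aten::t(%17)
    grad_self.54 : Tensor = aten::mm(%868, %mat2_t)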
      runNoGradOptimizations(copy);
    }
    EliminateDeadCode(copy);
    ClearProfilingInformation(copy);
Why do we now need this?
Also, please add a GRAPH_DUMP statement before it.
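A sketch of the requested change (an illustration, not the PR code; GRAPH_DUMP is the logging macro from torch/csrc/jit/jit_log.h, gated by the PYTORCH_JIT_LOG_LEVEL environment variable, so it costs nothing unless JIT logging is enabled):

    // Log the graph before stripping prim::profile nodes, so the effect
    // of the pass is visible in the JIT logs.
    GRAPH_DUMP("Before ClearProfilingInformation", copy);
    ClearProfilingInformation(copy);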
Yeah, I think this was added before the rebase; it's not needed, since the TensorExpr fuser will remove the profiles.
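The profiles in question are the prim::profile observer nodes that the profiling executor inserts to record observed tensor types during profiling runs. A rough IR sketch of their removal (value names illustrative, attribute syntax omitted):

    %x.1 : Tensor = prim::profile(%x)
    %y : Tensor = aten::relu(%x.1)

becomes

    %y : Tensor = aten::relu(%x)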
We weren't running inlining in the forward graph of differentiable subgraphs, and we weren't getting rid of all profiles as part of optimization. Differential Revision: [D23358804](https://our.internmc.facebook.com/intern/diff/D23358804)
ZolotukhinM
left a comment
✌️
Krovatkin
left a comment
Stack from ghstack:
We weren't running inlining in the forward graph of differentiable subgraphs.
Differential Revision: D23358804