Commit a49367e

mfkasim1 authored and facebook-github-bot committed
Update the docs of torch.eig about derivative (#47598)
Summary: Related: #33090. I just realized that I hadn't updated the docs of `torch.eig` when implementing the backward. This PR updates the docs about the grad of `torch.eig`. cc albanD

Pull Request resolved: #47598
Reviewed By: heitorschueroff
Differential Revision: D24829373
Pulled By: albanD
fbshipit-source-id: 89963ce66b2933e6c34e2efc93ad0f2c3dd28c68
1 parent 4159191 commit a49367e

File tree

1 file changed: +2 additions, -2 deletions


torch/_torch_docs.py

Lines changed: 2 additions & 2 deletions
@@ -2640,7 +2640,7 @@ def merge_dicts(*dicts):
 differently than dot(a, b). If the first argument is complex, the complex conjugate of the
 first argument is used for the calculation of the dot product.
 
-.. note::
+.. note::
 
 Unlike NumPy's vdot, torch.vdot intentionally only supports computing the dot product
 of two 1D tensors with the same number of elements.
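The context lines above describe `torch.vdot`'s conjugation of its first argument. A minimal sketch of that semantics, using NumPy's `np.vdot` (which conjugates the first argument the same way; NumPy is used here only to keep the sketch dependency-light, not because the PR touches NumPy):

```python
import numpy as np

# vdot conjugates its FIRST argument before taking the dot product:
# conj(1 + 2j) * (3 + 4j) = (1 - 2j) * (3 + 4j) = 11 - 2j
a = np.array([1 + 2j])
b = np.array([3 + 4j])

print(np.vdot(a, b))  # (11-2j)
print(np.dot(a, b))   # (-5+10j) -- plain dot, no conjugation
```

Swapping the arguments of `vdot` conjugates the other operand, so `np.vdot(b, a)` gives the complex conjugate of the result above.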
@@ -2672,7 +2672,7 @@ def merge_dicts(*dicts):
 
 .. note::
     Since eigenvalues and eigenvectors might be complex, backward pass is supported only
-    for :func:`torch.symeig`
+    if eigenvalues and eigenvectors are all real valued.
 
 Args:
     input (Tensor): the square matrix of shape :math:`(n \times n)` for which the eigenvalues and eigenvectors
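The second hunk restricts `torch.eig`'s backward to the all-real case. The linear-algebra fact behind that wording, sketched with NumPy as a stand-in (assumption: NumPy is used only for illustration; the PR itself concerns PyTorch): a real non-symmetric matrix can have complex eigenvalues, while a real symmetric matrix always has real ones, which is the situation the updated note says the gradient supports.

```python
import numpy as np

# A real but non-symmetric matrix (a 90-degree rotation) has
# purely imaginary eigenvalues +1j and -1j ...
rot = np.array([[0.0, -1.0],
                [1.0,  0.0]])
w_rot = np.linalg.eigvals(rot)

# ... while a real symmetric matrix always has real eigenvalues
# (here 1.0 and 3.0), the case the updated note allows backward for.
sym = np.array([[2.0, 1.0],
                [1.0, 2.0]])
w_sym = np.linalg.eigvals(sym)

print(sorted(w_rot.imag))      # imaginary parts -1.0 and 1.0
print(np.iscomplexobj(w_sym))  # False: all eigenvalues are real
```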
