1 parent e665e9b commit 2d44031
1 file changed
beginner_source/former_torchies/autograd_tutorial.py
@@ -14,7 +14,7 @@
 In autograd, if any input ``Tensor`` of an operation has ``requires_grad=True``,
 the computation will be tracked. After computing the backward pass, a gradient
-w.r.t. this variable is accumulated into ``.grad`` attribute.
+w.r.t. this tensor is accumulated into ``.grad`` attribute.
 
 There’s one more class which is very important for autograd
 implementation - a ``Function``. ``Tensor`` and ``Function`` are
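
The behavior described in the changed lines — a gradient w.r.t. a tracked tensor accumulating into its ``.grad`` attribute — can be sketched with a short example (not part of the original diff; assumes a standard PyTorch install):

```python
import torch

# A tensor with requires_grad=True is tracked by autograd.
x = torch.ones(2, 2, requires_grad=True)

# Each backward pass adds the new gradient into x.grad
# rather than overwriting it.
y = (x * 3).sum()
y.backward()
print(x.grad)  # every element is 3.0

z = (x * 3).sum()
z.backward()
print(x.grad)  # gradients accumulated: every element is now 6.0
```

This accumulation is why training loops typically call something like ``optimizer.zero_grad()`` (or ``x.grad.zero_()``) before each backward pass.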