Merge commit 196cc74 (2 parents: 38e6ef6 + be788b8)
1 file changed: beginner_source/examples_autograd/two_layer_net_autograd.py
@@ -46,7 +46,7 @@
 
     # Compute and print loss using operations on Tensors.
     # Now loss is a Tensor of shape (1,)
-    # loss.item() gets the a scalar value held in the loss.
+    # loss.item() gets the scalar value held in the loss.
    loss = (y_pred - y).pow(2).sum()
    print(t, loss.item())
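The comment being corrected concerns `loss.item()`, which extracts the Python scalar held in a loss Tensor. A minimal standalone sketch of the pattern the diff touches (hypothetical tensor values, assuming PyTorch is installed):

```python
import torch

# Hypothetical predictions and targets standing in for y_pred and y.
y_pred = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([1.0, 2.0, 5.0])

# Sum-of-squared-errors loss, as in the tutorial line being diffed.
loss = (y_pred - y).pow(2).sum()

# loss.item() gets the scalar value held in the loss.
print(loss.item())  # → 4.0
```

Calling `.item()` (rather than printing the Tensor itself) is what lets the tutorial log a plain Python number each iteration.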