Commit a5a3c24

Fix gap
1 parent da91094 commit a5a3c24

2 files changed: 6 additions & 9 deletions

index.rst

Lines changed: 1 addition & 4 deletions
@@ -670,11 +670,8 @@ Additional Resources
    :hidden:
    :caption: Extending PyTorch

-<<<<<<< HEAD
-   intermediate/custom_function_conv_bn_tutorial
-=======
    intermediate/custom_function_double_backward
->>>>>>> 4749c7db6... Add images
+   intermediate/custom_function_conv_bn_tutorial
    advanced/cpp_extension
    advanced/torch_script_custom_ops
    advanced/torch_script_custom_classes

intermediate_source/custom_function_double_backward_tutorial.py

Lines changed: 5 additions & 5 deletions
@@ -31,7 +31,7 @@
 # we can explore a couple examples:

 ######################################################################
-# Saving the inputs
+# Saving the Inputs
 # -------------------------------------------------------------------
 # Consider this simple squaring function. It saves an input tensor
 # for backward. Double backward works automatically when autograd
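For context, a minimal runnable sketch of the kind of squaring function this hunk documents; the body is illustrative rather than the tutorial's verbatim code. Because backward is built only from differentiable torch operations on saved tensors, autograd can trace it, so double backward works with no extra effort:

import torch

class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Save the input tensor; backward needs x, not the output.
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_out):
        x, = ctx.saved_tensors
        # Composed of differentiable torch ops, so autograd can
        # differentiate this expression a second time.
        return grad_out * 2 * x

# Numerically verify first and second derivatives
# (gradcheck requires double-precision inputs):
x = torch.rand(3, dtype=torch.double, requires_grad=True)
torch.autograd.gradcheck(Square.apply, x)
torch.autograd.gradgradcheck(Square.apply, x)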
@@ -168,7 +168,7 @@ def sinh(x):

 ######################################################################
 # Use torchviz to visualize the graph:
-
+#
 # .. code-block:: python
 #
 #    out = sinh(x)
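The RST code-block above only renders in the built tutorial; as a self-contained sketch of that visualization step (assuming the third-party torchviz package is installed, with an illustrative Sinh standing in for the tutorial's sinh):

import torch
from torchviz import make_dot  # third-party: pip install torchviz

class Sinh(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        expx, expnegx = torch.exp(x), torch.exp(-x)
        ctx.save_for_backward(expx, expnegx)
        return (expx - expnegx) / 2

    @staticmethod
    def backward(ctx, grad_out):
        expx, expnegx = ctx.saved_tensors
        return grad_out * (expx + expnegx) / 2

x = torch.rand(3, requires_grad=True)
out = Sinh.apply(x)
# create_graph=True keeps the backward pass itself in the graph,
# which is what makes it visible to torchviz (and differentiable).
grad_x, = torch.autograd.grad(out.sum(), x, create_graph=True)
make_dot((grad_x, out), params={"x": x, "grad_x": grad_x, "out": out})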
@@ -205,7 +205,7 @@ def backward(ctx, grad_out):
 ######################################################################
 # Use torchviz to visualize the graph. Notice that `grad_x` is not
 # part of the graph!
-
+#
 # .. code-block:: python
 #
 #    out = SinhBad.apply(x)
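For context on the `grad_x` remark, a sketch of the failure mode this hunk's section describes (body illustrative): the gradient is precomputed in forward, where autograd is not recording, so backward returns a value with no history and double backward is silently wrong:

import torch

class SinhBad(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        expx, expnegx = torch.exp(x), torch.exp(-x)
        # Bug: forward runs with autograd recording disabled, so this
        # precomputed gradient carries no graph history.
        ctx.grad_x = (expx + expnegx) / 2
        return (expx - expnegx) / 2

    @staticmethod
    def backward(ctx, grad_out):
        # ctx.grad_x is a constant to autograd: the returned gradient
        # is disconnected from x, so double backward gives wrong results.
        return grad_out * ctx.grad_x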
@@ -216,7 +216,7 @@ def backward(ctx, grad_out):
 #    :width: 232

 ######################################################################
-# When Backward is not able to be Tracked by Autograd
+# When Backward is not Tracked
 # -------------------------------------------------------------------
 # Finally, let's consider an example when it may not be possible for
 # autograd to track gradients for a functions backward at all.
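When backward truly cannot be traced, e.g. it leaves torch for numpy or a C extension, the usual safeguard is torch.autograd.function.once_differentiable, which makes any attempt at double backward raise instead of silently producing wrong gradients. A sketch under that assumption (body illustrative, CPU tensors assumed):

import torch
from torch.autograd.function import once_differentiable

class Cube(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 3

    @staticmethod
    @once_differentiable
    def backward(ctx, grad_out):
        x, = ctx.saved_tensors
        # Stand-in for a computation autograd cannot see (numpy here);
        # the decorator ensures a second differentiation errors out
        # rather than returning wrong gradients.
        return torch.as_tensor(
            grad_out.detach().numpy() * 3 * x.detach().numpy() ** 2)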
@@ -269,7 +269,7 @@ def backward(ctx, grad_out):

 ######################################################################
 # Use torchviz to visualize the graph:
-
+#
 # .. code-block:: python
 #
 #    out = Cube.apply(x)
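The same visualization step, reusing the Cube and make_dot sketches above: the first derivative still computes, but because backward is once_differentiable, nothing differentiable produced grad_x, so a second backward through it would raise:

x = torch.rand(3, requires_grad=True)
out = Cube.apply(x)
grad_x, = torch.autograd.grad(out.sum(), x, create_graph=True)
# grad_x was produced inside a once_differentiable backward, so
# autograd cannot differentiate through it again.
make_dot((grad_x, out), params={"x": x, "grad_x": grad_x, "out": out})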
