@@ -251,7 +251,7 @@ differentiation. This is something the PyTorch team is working on, but it is
 not available yet. As such, we have to also implement the backward pass of our
 LLTM, which computes the derivative of the loss with respect to each input of
 the forward pass. Ultimately, we will plop both the forward and backward
-function into a :class:`torch.nn.Function` to create a nice Python binding. The
+function into a :class:`torch.autograd.Function` to create a nice Python binding. The
 backward function is slightly more involved, so we'll not dig deeper into the
 code (if you are interested, `Alex Graves' thesis
 <http://www.cs.toronto.edu/~graves/phd.pdf>`_ is a good read for more
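For orientation, independent of the LLTM itself, a custom :class:`torch.autograd.Function` pairs a ``forward`` with a ``backward`` static method, roughly as in this minimal generic sketch (an illustration only, not part of the diffed file)::

    import torch

    class MyExp(torch.autograd.Function):
        @staticmethod
        def forward(ctx, input):
            # compute the result and stash what backward will need
            result = input.exp()
            ctx.save_for_backward(result)
            return result

        @staticmethod
        def backward(ctx, grad_output):
            # receives dL/d(output), returns dL/d(input) for each forward input
            result, = ctx.saved_tensors
            return grad_output * result

    x = torch.randn(3, requires_grad=True)
    y = MyExp.apply(x)   # call through .apply so autograd records the custom backward
    y.sum().backward()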
@@ -415,7 +415,7 @@ matches our C++ code::
     LLTM forward

 Since we are now able to call our C++ functions from Python, we can wrap them
-with :class:`torch.nn.Function` and :class:`torch.nn.Module` to make them first
+with :class:`torch.autograd.Function` and :class:`torch.nn.Module` to make them first
 class citizens of PyTorch::

     import math
@@ -424,7 +424,7 @@ class citizens of PyTorch::
     # Our module!
     import lltm

-    class LLTMFunction(torch.nn.Function):
+    class LLTMFunction(torch.autograd.Function):
         @staticmethod
         def forward(ctx, input, weights, bias, old_h, old_cell):
             outputs = lltm.forward(input, weights, bias, old_h, old_cell)
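The hunk above ends at the first line of ``forward``; for context, a wrapper of this kind is typically completed roughly as sketched below. The save/return layout and the arguments of ``lltm.backward`` here are assumptions for illustration, not the diffed file's actual code::

    import math
    import torch
    import lltm  # the compiled C++ extension

    class LLTMFunction(torch.autograd.Function):
        @staticmethod
        def forward(ctx, input, weights, bias, old_h, old_cell):
            outputs = lltm.forward(input, weights, bias, old_h, old_cell)
            new_h, new_cell = outputs[:2]
            # stash whatever the C++ backward kernel will need (assumed layout)
            ctx.save_for_backward(*outputs[1:], weights)
            return new_h, new_cell

        @staticmethod
        def backward(ctx, grad_h, grad_cell):
            # assumed signature and return order of the C++ backward function
            d_input, d_weights, d_bias, d_old_h, d_old_cell = lltm.backward(
                grad_h.contiguous(), grad_cell.contiguous(), *ctx.saved_tensors)
            # one gradient per forward input, in the same order
            return d_input, d_weights, d_bias, d_old_h, d_old_cell

    class LLTM(torch.nn.Module):
        def __init__(self, input_features, state_size):
            super().__init__()
            self.weights = torch.nn.Parameter(
                torch.empty(3 * state_size, input_features + state_size))
            self.bias = torch.nn.Parameter(torch.empty(3 * state_size))
            self.reset_parameters(state_size)

        def reset_parameters(self, state_size):
            stdv = 1.0 / math.sqrt(state_size)
            for weight in self.parameters():
                weight.data.uniform_(-stdv, +stdv)

        def forward(self, input, state):
            # state is the (old_h, old_cell) pair from the previous step
            return LLTMFunction.apply(input, self.weights, self.bias, *state)

With a wrapper along these lines, the extension behaves like any other ``torch.nn.Module``: calling ``backward()`` on a loss routes gradients through the C++ kernels via the custom :class:`torch.autograd.Function`.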