
Commit cf012b5
Fix broken link to Autograd tutorial (#3757)
This PR fixes a broken link in the Optimization tutorial that pointed to a non-existent Autograd page. cc @svekars @sekyondaMeta @AlannaBurke

Co-authored-by: sekyondaMeta <[email protected]>
1 parent 6e608db commit cf012b5

1 file changed: 1 addition & 1 deletion

beginner_source/basics/optimization_tutorial.py
@@ -15,7 +15,7 @@
 Now that we have a model and data it's time to train, validate and test our model by optimizing its parameters on
 our data. Training a model is an iterative process; in each iteration the model makes a guess about the output, calculates
 the error in its guess (*loss*), collects the derivatives of the error with respect to its parameters (as we saw in
-the `previous section <autograd_tutorial.html>`_), and **optimizes** these parameters using gradient descent. For a more
+the `previous section <autogradqs_tutorial.html>`_), and **optimizes** these parameters using gradient descent. For a more
 detailed walkthrough of this process, check out this video on `backpropagation from 3Blue1Brown <https://www.youtube.com/watch?v=tIeHLnjs5U8>`__.

 Prerequisite Code
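The iterative process the tutorial text describes (guess, loss, derivative, parameter update) can be sketched in plain Python as gradient descent on a one-parameter model. This is only an illustration of the loop's structure, not the tutorial's actual code, which uses ``torch.optim`` and autograd; the model ``y = w * x``, the learning rate, and the data point here are all made up for the example.

```python
# Gradient descent on a 1-parameter model y = w * x with squared-error loss.
# Illustrative only: the real tutorial computes gradients with autograd and
# updates parameters via torch.optim rather than by hand.

def train_step(w, x, y_true, lr):
    y_pred = w * x                     # the model makes a guess about the output
    loss = (y_pred - y_true) ** 2      # the error in its guess (the *loss*)
    grad = 2 * (y_pred - y_true) * x   # derivative of the loss w.r.t. w
    return w - lr * grad, loss         # gradient descent parameter update

w = 0.0
for _ in range(100):
    w, loss = train_step(w, x=2.0, y_true=6.0, lr=0.05)
# w converges toward 3.0, since 3.0 * 2.0 == 6.0
```

Each iteration shrinks the gap between the guess and the target; with more parameters, autograd computes all the derivatives in one backward pass instead of by hand.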

0 commit comments
