Commit 031a5a4

Merge pull request tensorflow#5283 from nightscape/patch-1
Fix several Markdown links in slim README
2 parents 0a16112 + db223fc commit 031a5a4

File tree: 1 file changed (+19, −29 lines)


research/inception/inception/slim/README.md

Lines changed: 19 additions & 29 deletions
@@ -9,8 +9,7 @@ keeping a model's architecture transparent and its hyperparameters explicit.
 ## Teaser
 
 As a demonstration of the simplicity of using TF-Slim, compare the simplicity of
-the code necessary for defining the entire [VGG]
-(http://www.robots.ox.ac.uk/~vgg/research/very_deep/) network using TF-Slim to
+the code necessary for defining the entire [VGG](http://www.robots.ox.ac.uk/~vgg/research/very_deep/) network using TF-Slim to
 the lengthy and verbose nature of defining just the first three layers (out of
 16) using native tensorflow:
 
@@ -61,14 +60,12 @@ def vgg16(inputs):
 TF-Slim offers several advantages over just the built-in tensorflow libraries:
 
 * Allows one to define models much more compactly by eliminating boilerplate
-code. This is accomplished through the use of [argument scoping](scopes.py)
-and numerous high level [operations](ops.py). These tools increase
+code. This is accomplished through the use of [argument scoping](./scopes.py)
+and numerous high level [operations](./ops.py). These tools increase
 readability and maintainability, reduce the likelihood of an error from
 copy-and-pasting hyperparameter values and simplifies hyperparameter tuning.
-* Makes developing models simple by providing commonly used [loss functions]
-(losses.py)
-* Provides a concise [definition](inception_model.py) of [Inception v3]
-(http://arxiv.org/abs/1512.00567) network architecture ready to be used
+* Makes developing models simple by providing commonly used [loss functions](./losses.py)
+* Provides a concise [definition](./inception_model.py) of [Inception v3](http://arxiv.org/abs/1512.00567) network architecture ready to be used
 out-of-the-box or subsumed into new models.
 
 Additionally TF-Slim was designed with several principles in mind:
@@ -192,19 +189,19 @@ roughly correspond to such layers. These include:
 
 Layer | TF-Slim Op
 --------------------- | ------------------------
-Convolutional Layer | [ops.conv2d](ops.py)
-Fully Connected Layer | [ops.fc](ops.py)
-BatchNorm layer | [ops.batch_norm](ops.py)
-Max Pooling Layer | [ops.max_pool](ops.py)
-Avg Pooling Layer | [ops.avg_pool](ops.py)
-Dropout Layer | [ops.dropout](ops.py)
+Convolutional Layer | [ops.conv2d](./ops.py)
+Fully Connected Layer | [ops.fc](./ops.py)
+BatchNorm layer | [ops.batch_norm](./ops.py)
+Max Pooling Layer | [ops.max_pool](./ops.py)
+Avg Pooling Layer | [ops.avg_pool](./ops.py)
+Dropout Layer | [ops.dropout](./ops.py)
 
 [ops.py](./ops.py) also includes operations that are not really "layers" per se,
 but are often used to manipulate hidden unit representations during inference:
 
 Operation | TF-Slim Op
 --------- | ---------------------
-Flatten | [ops.flatten](ops.py)
+Flatten | [ops.flatten](./ops.py)
 
 TF-Slim also provides a meta-operation called `repeat_op` that allows one to
 repeatedly perform the same operation. Consider the following snippet from the
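The idea behind a repeat-style meta-operation can be sketched without TensorFlow at all. The snippet below is a conceptual illustration only: the real signature lives in [ops.py](./ops.py), and `fake_conv` is a made-up stand-in for `ops.conv2d`, used just to show how each call's output feeds the next call.

```python
# Conceptual sketch of a repeat-style meta-operation; not the TF-Slim API.
def repeat_op(times, inputs, op, *args, **kwargs):
    """Apply `op` to `inputs` repeatedly, feeding each output back in."""
    outputs = inputs
    for _ in range(times):
        outputs = op(outputs, *args, **kwargs)
    return outputs

# `fake_conv` is a hypothetical stand-in for ops.conv2d: it just records
# the nesting so the repetition is visible in the result string.
def fake_conv(net, num_filters):
    return f"conv({net}, {num_filters})"

result = repeat_op(3, "net", fake_conv, 64)
# Three nested applications, mirroring three stacked conv layers.
```

The point of the real op is the same: it removes the copy-and-paste of identical layer calls and keeps the repeat count as a single explicit number.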
@@ -244,12 +241,9 @@ number. More concretely, the scopes in the example above would be 'conv3_1',
 
 ### Scopes
 
-In addition to the types of scope mechanisms in TensorFlow ([name_scope]
-(https://www.tensorflow.org/api_docs/python/framework.html#name_scope),
-[variable_scope]
-(https://www.tensorflow.org/api_docs/python/state_ops.html#variable_scope),
-TF-Slim adds a new scoping mechanism called "argument scope" or [arg_scope]
-(scopes.py). This new scope allows a user to specify one or more operations and
+In addition to the types of scope mechanisms in TensorFlow ([name_scope](https://www.tensorflow.org/api_docs/python/framework.html#name_scope),
+[variable_scope](https://www.tensorflow.org/api_docs/python/state_ops.html#variable_scope),
+TF-Slim adds a new scoping mechanism called "argument scope" or [arg_scope](./scopes.py). This new scope allows a user to specify one or more operations and
 a set of arguments which will be passed to each of the operations defined in the
 `arg_scope`. This functionality is best illustrated by example. Consider the
 following code snippet:
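The argument-scope idea can also be sketched as plain Python, independent of TensorFlow. The snippet below is a minimal, hedged reimplementation of the concept; the names `arg_scope`, `add_arg_scope`, and the toy `conv2d` are illustrative, not the actual TF-Slim implementation in [scopes.py](./scopes.py).

```python
# Illustrative sketch of "argument scope"; not the real TF-Slim code.
import contextlib
import functools

_current_defaults = {}  # maps decorated op -> default kwargs in effect


@contextlib.contextmanager
def arg_scope(ops, **defaults):
    """Make `defaults` the implicit kwargs for every op in `ops`."""
    saved = {op: _current_defaults.get(op) for op in ops}
    for op in ops:
        merged = dict(saved[op] or {})  # allow nesting: outer scopes persist
        merged.update(defaults)
        _current_defaults[op] = merged
    try:
        yield
    finally:  # restore whatever was in effect before the scope
        for op, old in saved.items():
            if old is None:
                _current_defaults.pop(op, None)
            else:
                _current_defaults[op] = old


def add_arg_scope(fn):
    """Let `fn` pick up defaults from any enclosing arg_scope."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        merged = dict(_current_defaults.get(wrapper, {}))
        merged.update(kwargs)  # call-site kwargs win over scope defaults
        return fn(*args, **merged)
    return wrapper


@add_arg_scope
def conv2d(inputs, num_filters, stddev=0.01, padding='VALID'):
    # Toy op: returns its effective arguments so the scoping is observable.
    return (inputs, num_filters, stddev, padding)


with arg_scope([conv2d], stddev=0.1, padding='SAME'):
    a = conv2d('net', 64)                   # picks up both scope defaults
    b = conv2d('net', 32, padding='VALID')  # call site overrides padding
```

The design point this models is the one the README describes: hyperparameters shared by many calls are stated once at the scope, while any individual call can still override them explicitly.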
@@ -439,8 +433,7 @@ let TF-Slim know about the additional loss and let TF-Slim handle the losses.
 ## Putting the Pieces Together
 
 By combining TF-Slim Variables, Operations and scopes, we can write a normally
-very complex network with very few lines of code. For example, the entire [VGG]
-(https://www.robots.ox.ac.uk/~vgg/research/very_deep/) architecture can be
+very complex network with very few lines of code. For example, the entire [VGG](https://www.robots.ox.ac.uk/~vgg/research/very_deep/) architecture can be
 defined with just the following snippet:
 
 ```python
@@ -494,12 +487,9 @@ with tf.Session() as sess:
 ...
 ```
 
-See [Restoring Variables]
-(https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html#restoring-variables)
-and [Choosing which Variables to Save and Restore]
-(https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html#choosing-which-variables-to-save-and-restore)
-sections of the [Variables]
-(https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html) page for
+See [Restoring Variables](https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html#restoring-variables)
+and [Choosing which Variables to Save and Restore](https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html#choosing-which-variables-to-save-and-restore)
+sections of the [Variables](https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html) page for
 more details.
 
 ### Using slim.variables to Track which Variables need to be Restored
