For example, we do not provide a layer for local response normalization; instead, we suggest applying ``tf.nn.lrn`` to ``Layer.outputs``.
More functions can be found in the `TensorFlow API <https://www.tensorflow.org/versions/master/api_docs/index.html>`_.

Understand layer
-----------------

All TensorLayer layers have a number of properties in common:

- ``layer.outputs`` : a Tensor, the outputs of the current layer.
- ``layer.all_params`` : a list of Tensors, all network variables in order.
- ``layer.all_layers`` : a list of Tensors, all network outputs in order.
- ``layer.all_drop`` : a dictionary of {placeholder : float}, the keeping probabilities of all noise (e.g. dropout) layers.

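These four properties form a running record of everything below the current layer. As a minimal pure-Python sketch of this bookkeeping pattern (``MockLayer`` is a hypothetical stand-in for illustration only, not TensorLayer's implementation):

```python
class MockLayer:
    """Illustrative sketch only -- not TensorLayer's real implementation."""
    def __init__(self, prev=None, outputs=None, params=(), drop=()):
        self.outputs = outputs
        if prev is None:            # input layer: bookkeeping starts empty
            self.all_layers, self.all_params, self.all_drop = [], [], {}
        else:                       # copy the previous bookkeeping, then extend it
            self.all_layers = prev.all_layers + [outputs]
            self.all_params = prev.all_params + list(params)
            self.all_drop = {**prev.all_drop, **dict(drop)}

net = MockLayer(outputs="x(?, 784)")
net = MockLayer(net, outputs="dropout(?, 784)", drop={"keep1": 0.8})
net = MockLayer(net, outputs="relu(?, 800)", params=["W1", "b1"])
net = MockLayer(net, outputs="identity(?, 10)", params=["W_out", "b_out"])

print(net.all_layers)   # ['dropout(?, 784)', 'relu(?, 800)', 'identity(?, 10)']
print(net.all_params)   # ['W1', 'b1', 'W_out', 'b_out']
print(net.all_drop)     # {'keep1': 0.8}
```

Because each layer copies and extends the previous layer's lists, the final layer's properties describe the whole stack in order.
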
All TensorLayer layers have a number of methods in common:

- ``layer.print_params()`` : print the network variable information in order (call it after ``sess.run(tf.initialize_all_variables())``). Alternatively, print all variables with ``tl.layers.print_all_variables()``.
- ``layer.print_layers()`` : print the network layer information in order.
- ``layer.count_params()`` : count the number of parameters in the network.

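As a sanity check on ``count_params()``, the parameter count of the 784 → 800 → 800 → 10 network defined below can be computed by hand (a back-of-the-envelope sketch; the layer dimensions are taken from the example that follows):

```python
# Each dense layer 'n_in -> n_out' contributes a weight matrix W of shape
# (n_in, n_out) plus a bias vector b of shape (n_out,).
dims = [784, 800, 800, 10]
n_params = sum(n_in * n_out + n_out
               for n_in, n_out in zip(dims, dims[1:]))
print(n_params)  # 1276810
```

This is the number ``network.count_params()`` should report for that network (dropout layers add no parameters).
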
The initialization of a network starts from an input layer, onto which further layers are stacked as shown below; the resulting network is itself a ``Layer`` instance.
The most important properties of a network are ``network.all_params``, ``network.all_layers`` and ``network.all_drop``.
``all_params`` is a list which stores pointers to all network parameters in order;
the following script defines a 3-layer network, so ``all_params = [W1, b1, W2, b2, W_out, b_out]``.
``all_layers`` is a list which stores pointers to the outputs of all layers;
for the following network, ``all_layers = [dropout(?, 784), relu(?, 800), dropout(?, 800), relu(?, 800), dropout(?, 800), identity(?, 10)]``,
where ``?`` stands for any batch size. You can print the layer and parameter information
using ``network.print_layers()`` and ``network.print_params()``.
To count the number of parameters in a network, run ``network.count_params()``.

.. code-block:: python

  sess = tf.InteractiveSession()

  x = tf.placeholder(tf.float32, shape=[None, 784], name='x')