Commit 82862cb

update readme
1 parent 832461a commit 82862cb

1 file changed: +5 −5 lines changed


README.md

Lines changed: 5 additions & 5 deletions
@@ -73,13 +73,13 @@ class ModelB(BaseModel):
         self.endnode = tg.EndNode(prev=[hn])
     ```

-creating a layer only created all the `Variables`. To connect the `Variables` into a graph, you can do a `_train_fprop(X)` or `_test_fprop(X)` to create the tensorflow graph. By abstracting `Variable` creation away from linking the `Variable` nodes into graph prevent the problem of certain tensorflow layers that always reinitialise its weights when it's called, example the [`tf.nn.batch_normalization`](https://www.tensorflow.org/api_docs/python/tf/nn/batch_normalization) layer. Also having a separate channel for training and testing is to cater to layers with different training and testing behaviours such as batchnorm and dropout.
+Creating a layer only creates the `Variables`. To connect the `Variables` into a graph, call `train_fprop(X)` or `test_fprop(X)` to build the tensorflow graph. Separating `Variable` creation from linking the `Variable` nodes into a graph avoids the problem of certain tensorflow layers that reinitialise their weights every time they are called, for example the [`tf.nn.batch_normalization`](https://www.tensorflow.org/api_docs/python/tf/nn/batch_normalization) layer. Having separate channels for training and testing also caters to layers with different training and testing behaviours, such as batchnorm and dropout.

 ```python
 modelb = ModelB()
 X_ph = tf.placeholder('float32')
-y_train = modelb._train_fprop(X_ph)
-y_test = modelb._test_fprop(X_ph)
+y_train = modelb.train_fprop(X_ph)
+y_test = modelb.test_fprop(X_ph)
 ```

 checkout some well known models in TensorGraph
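The two-channel fprop idea in this hunk can be sketched without tensorflow at all: a layer creates its state once at construction time, then exposes separate train and test forward paths for layers whose behaviour differs between the two. The `Dropout` class below is an illustrative stand-in written in plain Python, not TensorGraph's actual layer.

```python
import random

class Dropout:
    """Sketch of a layer with distinct train/test forward passes."""

    def __init__(self, dropout_rate=0.5):
        # State is created once here; the fprop calls only apply it,
        # mirroring the Variable-creation-vs-graph-linking split above.
        self.dropout_rate = dropout_rate

    def train_fprop(self, X):
        # Randomly zero units, rescaling survivors by 1/keep
        # (inverted dropout) so expectations match test time.
        keep = 1.0 - self.dropout_rate
        return [x / keep if random.random() < keep else 0.0 for x in X]

    def test_fprop(self, X):
        # At test time dropout is the identity.
        return X

layer = Dropout(dropout_rate=0.5)
print(layer.test_fprop([1.0, 2.0, 3.0]))  # → [1.0, 2.0, 3.0]
```

Batchnorm fits the same shape: `train_fprop` would update running statistics while `test_fprop` would only read them.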
@@ -102,8 +102,8 @@ hvd.init()
 modelb = ModelB()
 X_ph = tf.placeholder('float32')
 y_ph = tf.placeholder('float32')
-y_train = modelb._train_fprop(X_ph)
-y_test = modelb._test_fprop(X_ph)
+y_train = modelb.train_fprop(X_ph)
+y_test = modelb.test_fprop(X_ph)

 train_cost = mse(y_train, y_ph)
 test_cost = mse(y_test, y_ph)
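The `mse` in this hunk is TensorGraph's own cost function operating on tensorflow tensors; as a sanity check on the arithmetic it computes, a plain-Python version of the same idea (mean of squared differences) might look like:

```python
def mse(y_pred, y_true):
    """Mean squared error over two equal-length sequences."""
    assert len(y_pred) == len(y_true), "sequences must be the same length"
    # Sum the squared elementwise differences, then average.
    return sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / len(y_pred)

# Only the last element differs, by 2, so the mean is 4/3.
print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # → 1.3333333333333333
```

Because both `train_cost` and `test_cost` above are built from the same `mse`, the only difference between them is which fprop channel produced the predictions.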
