
Conversation

@weiyangfb (Contributor) commented Aug 10, 2018

@weiyangfb added the ready for review (this tag is deprecated) label Aug 14, 2018


 class Dropout2d(_DropoutNd):
-    r"""Randomly zeroes whole channels of the input tensor.
+    r"""Randomly zeroes whole channels (a channel is a (N, C) pair) of the input tensor.

@li-roy (Contributor) left a comment

looks good, couple of nits. only commented on dropout3d but applies to all of them

Args:
    p: probability of an element to be zeroed. Default: 0.5
    training: apply dropout if is True. Default: True

-def dropout3d(input, p=0.5, training=False, inplace=False):
+def dropout3d(input, p=0.5, training=True, inplace=False):
     r"""
     Randomly zeroes whole channels (a channel is a 3D slice of dimensions D, H, W) of the input tensor.
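
As an illustrative sketch (not from the PR; assumes torch.nn.functional) of the training= default that the signature change above touches:

import torch
import torch.nn.functional as F

x = torch.ones(2, 4, 5, 6, 7)            # (N, C, D, H, W)

# With training=True, whole channels, i.e. 3D slices of shape (D, H, W),
# are zeroed with probability p and the survivors are rescaled by 1 / (1 - p).
y_train = F.dropout3d(x, p=0.5, training=True)

# With training=False, dropout is a no-op and the input passes through.
y_eval = F.dropout3d(x, p=0.5, training=False)
assert torch.equal(y_eval, x)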

 class Dropout2d(_DropoutNd):
-    r"""Randomly zeroes whole channels of the input tensor.
-    The channels to zero-out are randomized on every forward call.
+    r"""Randomly zero out entire channels (a channel is a 2D feature map of

r"""During training, randomly zeroes some of the elements of the input
tensor with probability :attr:`p` using samples from a Bernoulli
distribution. The elements to zero are randomized on every forward call.
distribution. ach channel will be zero-out indipendently on every forward
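
To make the Bernoulli wording concrete, here is a hypothetical helper (illustrative only, not the PR's code) that reproduces element-wise dropout by hand:

import torch

def manual_dropout(x, p=0.5):
    # Keep each element independently with probability 1 - p (one Bernoulli
    # sample per element) and rescale survivors by 1 / (1 - p) so the
    # expected value of the output matches the input.
    mask = torch.bernoulli(torch.full_like(x, 1 - p))
    return x * mask / (1 - p)

x = torch.ones(3, 5)
print(manual_dropout(x, p=0.5))   # entries are either 0.0 or 2.0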

@facebook-github-bot (Contributor) left a comment

weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@weiyangfb (Contributor, Author) commented

@ssnl is this good to go?

r"""Randomly zeroes whole channels of the input tensor.
The channels to zero are randomized on every forward call.
r"""Randomly zero-out entire channels (a channel is a 3D feature map of
dimensions D, H, W) of the input tensor. Each channel will be zero-out

-    The channels to zero-out are randomized on every forward call.
+    r"""Randomly zero out entire channels (a channel is a 2D feature map of
+    dimensions H, W) of the input tensor. Each channel will be zeroed out
+    independently on every forward call with probability :attr:`p` using
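
A small sketch (assumes a standard torch install; not part of the PR) of the "independently on every forward call" behaviour described above:

import torch
import torch.nn as nn

m = nn.Dropout2d(p=0.5)
x = torch.ones(1, 6, 2, 2)

# The set of dropped channels is re-sampled on each forward call, so two
# calls on the same input generally zero different channels.
dropped_first = m(x)[0, :, 0, 0].eq(0)
dropped_second = m(x)[0, :, 0, 0].eq(0)
print(dropped_first, dropped_second)   # boolean masks, typically different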

 class Dropout2d(_DropoutNd):
-    r"""Randomly zeroes whole channels of the input tensor.
-    The channels to zero-out are randomized on every forward call.
+    r"""Randomly zero out entire channels (a channel is a 2D feature map of

@facebook-github-bot (Contributor) left a comment

weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@ssnl (Collaborator) left a comment

LGTM! Thanks!

-    The channels to zero-out are randomized on every forward call.
+    r"""Randomly zero out entire channels (a channel is a 2D feature map,
+    e.g., the :math:`j`-th channel of the :math:`i`-th sample in the
+    batched input is a 2D tensor :math:`input[i, j]`) of the input tensor.
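
As a reading aid for the :math:`input[i, j]` notation in the approved wording (sketch only, not part of the PR):

import torch
import torch.nn as nn

x = torch.randn(2, 3, 4, 4)      # batched input of shape (N, C, H, W)

# The j-th channel of the i-th sample is the 2D feature map x[i, j].
print(x[1, 2].shape)             # torch.Size([4, 4])

# Dropout2d zeroes such 2D maps as a whole:
y = nn.Dropout2d(p=0.5)(x)
# y[1, 2] is either all zeros or x[1, 2] scaled by 1 / (1 - 0.5)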

@facebook-github-bot (Contributor) left a comment

weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

petrex pushed a commit to petrex/pytorch that referenced this pull request Sep 6, 2018
* upstream/master: (26 commits)
  cudnn 7 upgrade with spatialBN fix (pytorch#11291)
  Ignore FuseGraph Call on Windows (pytorch#11015)
  defer resolution of mkl to a cmake wrapper library (pytorch#11298)
  Cleanup dependency of distributed flags (pytorch#11221)
  Move minimal wrapdim functionality to core, remove THTensor include i… (pytorch#11283)
  Change includes from ATen/Storage.h to ATen/core/Storage.h (pytorch#11217)
  Fix scalar tensor assert in fusion compiler (pytorch#10952)
  Add dead code elimination pass (pytorch#10101)
  Distributed Data Parallel CPU module for C10D (pytorch#11168)
  Back out "[pt1][tensor] Add strides to caffe2::Tensor"
  Fix conv gradient conversion (pytorch#11312)
  Bag of clang tidy fixes for torch/csrc/ and torch/csrc/autograd (pytorch#11050)
  Sparse tensor printing; add NotImplemented autograd fn (pytorch#10181)
  Add convertToCaffe2Proto to python API
  fix doc for functional.dropout* (pytorch#10417)
  typo fix Tranpose2D -> Transpose2D (pytorch#11281)
  Remove THFinalizer
  Forward declarations of needed curand functions (pytorch#10911)
  nomnigraph - simplify core graph API and test (pytorch#11256)
  Small fixes to cppdocs for sync script (pytorch#11300)
  ...

PenghuiCheng pushed a commit to PenghuiCheng/pytorch that referenced this pull request Sep 11, 2018
Summary:
- fixes pytorch#4177
Pull Request resolved: pytorch#10417

Differential Revision: D9542876

Pulled By: weiyangfb

fbshipit-source-id: 480ed973d1fe0364f4acb5cd596c2031895b82df
@ezyang added the merged label Jun 26, 2019

Labels

ready for review (this tag is deprecated)

Development

Successfully merging this pull request may close these issues.

Better functional.dropout{,2d,3d} docs

5 participants