
Conversation

alvgaona (Contributor) commented Jul 16, 2020

This PR implements a feature extension discussed in #41516.

I followed PR #22245 when adding this module. While I was at it, I also added the missing extra_repr() method to Flatten.

I see there are no unit tests for these modules. Should I add those too? If so, where is the best place to put them?

@alvgaona alvgaona requested a review from apaszke as a code owner July 16, 2020 21:57
alvgaona (Contributor, Author) commented Jul 16, 2020

@albanD @ezyang would you add yourselves as reviewers? I can't edit those.

@albanD albanD self-requested a review July 17, 2020 13:10
@albanD albanD added the triaged label (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) Jul 17, 2020
albanD (Collaborator) left a comment

Hi,

Thanks for the PR.

I actually wasn't aware that Tensor.unflatten only works with named dimensions. My bad.
This is a bit unsatisfying, as the nn.Unflatten() layer wouldn't be able to do the opposite of nn.Flatten() in many cases (only NCHW inputs would work).

I wonder if we want to let the user provide dim and unflattened_size as ints, so that we get the same general API as unflatten without having to recreate named Tensors in the middle.

One thing I could see as well is to accept either of these as input:

  • dim as a string, with the second argument a namedshape.
  • dim as an int, with the second argument a tuple of ints or a torch.Size.
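As a rough illustration of the two accepted forms (a hypothetical standalone checker, not the actual PyTorch code), the argument validation could be sketched like this:

```python
from typing import Union


def check_unflatten_args(dim: Union[int, str], unflattened_size) -> bool:
    """Return True if (dim, unflattened_size) matches one of the two accepted forms."""
    if isinstance(dim, int):
        # int dim: unflattened_size must be a tuple (or torch.Size) of ints
        return all(isinstance(s, int) for s in unflattened_size)
    if isinstance(dim, str):
        # string dim: unflattened_size must be a namedshape,
        # i.e. a tuple of (name, size) pairs
        return all(
            isinstance(pair, tuple) and len(pair) == 2
            and isinstance(pair[0], str) and isinstance(pair[1], int)
            for pair in unflattened_size
        )
    return False


# Both accepted forms pass:
assert check_unflatten_args(1, (3, 128, 128))
assert check_unflatten_args('features', (('C', 3), ('H', 128), ('W', 128)))
# A mixed form is rejected:
assert not check_unflatten_args(1, (('C', 3),))
```

The helper name and exact rules are illustrative; the real module would raise TypeError instead of returning a bool.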

alvgaona (Contributor, Author) commented Jul 17, 2020

Thanks for your feedback. No worries about that; what you've said is very reasonable.

So you're saying that you'd like something like this?

    def __init__(self, dim: int, unflattened_size: Size) -> None:
        super(Unflatten, self).__init__()
        self.dim = dim
        self.unflattened_size = unflattened_size

    def forward(self, input: Tensor) -> Tensor:
        return input.unflatten(
            self.dim,
            (('C', self.unflattened_size[0]), ('H', self.unflattened_size[1]), ('W', self.unflattened_size[2]))
        )

Here dim is the dimension to unflatten, and unflattened_size is the new shape of that dimension.

>>> s = torch.nn.Sequential(torch.nn.Unflatten(1, torch.Size([3, 128, 128])))
>>> t = torch.randn(32, 49152)
>>> s(t).size()
torch.Size([32, 3, 128, 128])

albanD (Collaborator) commented Jul 17, 2020

So you're saying that you'd like something like this?

For the arguments, yes. But your forward here would only support specific inputs.
It also wouldn't let users specify their own named dimensions if they want to use them.

What about something like:

class Unflatten(nn.Module):
    def __init__(self, dim: Union[int, str], unflattened_size: Union[tuple, Size]) -> None:
        super(Unflatten, self).__init__()
        if isinstance(dim, int):
            self.named = False
            # Make sure the unflattened_size is a tuple of ints
        else:
            self.named = True
            # Make sure the unflattened_size is a tuple of (name, size) tuples

        self.dim = dim
        self.unflattened_size = unflattened_size

    def forward(self, input: Tensor) -> Tensor:
        if self.named:
            return input.unflatten(self.dim, self.unflattened_size)
        else:
            dim = self.dim
            if dim < 0:
                dim += input.dim()  # normalize negative dims
            inp_size = list(input.size())
            new_size = inp_size[:dim] + list(self.unflattened_size) + inp_size[dim + 1:]
            return input.view(new_size)
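The shape arithmetic in the int branch can be checked without torch; here is a minimal pure-Python sketch (the helper name is hypothetical) of the view-size computation used in forward():

```python
from typing import List, Sequence


def unflattened_view_size(inp_size: Sequence[int], dim: int,
                          unflattened_size: Sequence[int]) -> List[int]:
    """Compute the target size for input.view() when unflattening dimension `dim`."""
    if dim < 0:
        dim += len(inp_size)  # normalize negative dims, as in forward()
    # Splice the new sizes in place of the flattened dimension.
    return list(inp_size[:dim]) + list(unflattened_size) + list(inp_size[dim + 1:])


# Matches the earlier Sequential example: (32, 49152) -> (32, 3, 128, 128)
assert unflattened_view_size((32, 49152), 1, (3, 128, 128)) == [32, 3, 128, 128]
assert unflattened_view_size((32, 49152), -1, (3, 128, 128)) == [32, 3, 128, 128]
```

A full implementation would also verify that the product of unflattened_size equals the size of the dimension being unflattened, so that view() cannot fail.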

Also can you add a simple test in test/test_nn.py that makes sure that we get errors when wrong inputs are provided and that it does what we expect.

@ezyang ezyang requested a review from zou3519 July 17, 2020 22:09
dr-ci bot commented Jul 17, 2020

💊 CI failures summary and remediations

As of commit 3854ba7 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


@alvgaona alvgaona requested a review from albanD July 18, 2020 16:05
albanD (Collaborator) left a comment

This looks good. Just small comments for the doc/test.

alvgaona (Contributor, Author) replied:

> I think you need to update this part of the doc to mention the two ways to call this.
> The two examples below are very good!

Oh right, yes.

> nit: could you make the input Tensor smaller here? Just to make sure we don't slow down the tests for no reason.

Of course. Absolutely.


    class Unflatten(Module):
        r"""
        Unflattens a tensor into another tensor of a desired shape. For use with :class:`~nn.Sequential`.

alvgaona (Contributor, Author) commented Jul 20, 2020:

@albanD I think this should cover all possible ways to call this module. 👍

albanD (Collaborator) left a comment
Perfect! Thanks for making the change.

facebook-github-bot (Contributor) left a comment
@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

alvgaona (Contributor, Author) commented:

What should we do about the check that failed, @albanD?

albanD (Collaborator) commented Jul 21, 2020

Those are just flaky internal tests :'( I'll take care of it!

facebook-github-bot (Contributor) commented:

@albanD merged this pull request in c89c294.

facebook-github-bot pushed a commit that referenced this pull request Jul 23, 2020
Summary:
I'd like to amend the docstring introduced in #41564. It's not rendering correctly on the web, and this should fix it.

cc albanD

Pull Request resolved: #41835

Reviewed By: izdeby

Differential Revision: D22672368

Pulled By: albanD

fbshipit-source-id: f0b03c2b2a4c79b790d54f7c8f2ae28ef9d76a75
@alvgaona alvgaona deleted the unflatten-module branch July 23, 2020 17:40