Conversation

@ssnl ssnl (Collaborator) commented Apr 11, 2018

Addresses #6516.

@ssnl ssnl force-pushed the fft_multidim_batch branch from 2ee44ab to b0bd00b on April 11, 2018 at 21:48
A review comment suggested tightening the error message:

-    ss << "Expected an input tensor of floating types, but got input "
+    ss << "Expected an input tensor of floating types, but got input="
        << self.type() << self.sizes();
     throw std::runtime_error(ss.str());

@zou3519 zou3519 (Contributor) left a comment

lgtm. Is it ever useful to support broadcasting of these batch dimensions?

@soumith soumith merged commit 8aa0ae3 into pytorch:master Apr 12, 2018
@ssnl ssnl deleted the fft_multidim_batch branch April 12, 2018 19:03
@ssnl ssnl (Collaborator, Author) commented Apr 12, 2018

Input is just a single tensor, so I don't think that broadcasting is involved here :)
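For context, a minimal sketch of why broadcasting doesn't come up here (using NumPy's `np.fft.fft` as a stand-in, since its batched semantics are analogous to a batched FFT over one tensor): the transform runs along one signal dimension, and every leading dimension is simply a batch dimension of that single input, so there is no second operand to broadcast against.

```python
import numpy as np

# One tensor with two batch dims (2, 3) and signal length 8.
x = np.random.randn(2, 3, 8)

# One call transforms every signal in the batch along the last axis.
batched = np.fft.fft(x, axis=-1)

# Equivalent to looping over each batch index and transforming individually.
looped = np.empty_like(batched)
for i in range(2):
    for j in range(3):
        looped[i, j] = np.fft.fft(x[i, j])

assert np.allclose(batched, looped)
```

The same reasoning applies to the batched FFT in this PR: batch dimensions are iterated, not broadcast.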

4 participants