I believe it would be a good addition to introduce a new factory function, torch.astensor, which is equivalent to torch.tensor but which doesn't perform a copy when one can be avoided. Passing a torch.Tensor would return a view of the same tensor, and passing a NumPy array would behave similarly to torch.from_numpy.
This means we could call torch.astensor at the beginning of every function, as a way of supporting data types other than torch tensors in torch operations.
For example:

```python
# in the torch namespace
def exp(x):
    x = torch.astensor(x)
    return x.exp()

# `exp` can now be called on floats, lists, and NumPy arrays
torch.exp(1.0)
torch.exp([1.0, 2.0])
torch.exp(np.array([1.0, 2.0]))
```

Given that torch.tensor infers the dtype from the content of the data we pass to it, we could then probably deprecate torch.from_numpy in favor of this unified constructor.
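The proposed semantics closely mirror NumPy's np.asarray, which returns its input unchanged when it is already an ndarray and only converts otherwise. A minimal sketch of the requested no-copy behavior, using NumPy as the analogue (torch.astensor is the proposed name, not an existing API):

```python
import numpy as np

# np.asarray is the NumPy analogue of the proposed torch.astensor:
# it returns the input itself when it is already an ndarray of a
# compatible dtype, and only copies/converts otherwise.
a = np.array([1.0, 2.0])

same = np.asarray(a)    # no copy: same underlying object
copied = np.array(a)    # np.array copies by default

assert same is a
assert copied is not a

# Non-array inputs are converted, just as the proposed torch.astensor
# would convert lists or scalars into tensors.
from_list = np.asarray([1.0, 2.0])
assert isinstance(from_list, np.ndarray)
```

The same dispatch pattern (return the input untouched when it already has the target type, otherwise convert) is what would let torch.astensor sit cheaply at the top of every torch operation.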
What do you think?