[PyTorch] random_ on floating type with negative values is broken on both CPU and GPU #6338

@ssnl

Description

Reproduced on master; also broken on 0.3.1. With a negative lower bound, random_ on a floating-point tensor intermittently returns values far outside the requested range, on both CPU and CUDA:

>>> torch.randn(3,3).random_(-1,10).max().item()
9.0
>>> torch.randn(3,3).random_(-1,10).max().item()
1.8446744073709552e+19
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
4294967296.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
8.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
4294967296.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
4294967296.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
9.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
9.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
8.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
8.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
8.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
4294967296.0
>>> torch.randn(3,3).cuda().random_(-1,10).max().item()
4294967296.0

Reported in #6136
