Error loading gzipped weights #6002

@MortenHannemose

Description

I'm trying to compress the weights of a network using gzip. Here is an MWE:

import torch, shutil, gzip
import torchvision.models as models

# Save a freshly initialized ResNet-18 state dict, then gzip the file
resnet18 = models.resnet18()
torch.save(resnet18.state_dict(), 'test.pt')
with open('test.pt', 'rb') as f_in, gzip.open('test.pt.gz', 'wb') as f_out:
    shutil.copyfileobj(f_in, f_out)
    # f_out.write(f_in.read())  # alternative; same result

# Load the state dict back from the gzipped file
with gzip.open('test.pt.gz', 'rb') as f:
    state_dict = torch.load(f)

When loading the compressed file, I get this error:

  File "/home/user/.virtualenvs/py3/lib/python3.5/site-packages/torch/serialization.py", line 267, in load
    return _load(f, map_location, pickle_module)
  File "/home/user/.virtualenvs/py3/lib/python3.5/site-packages/torch/serialization.py", line 428, in _load
    deserialized_objects[key]._set_from_file(f, offset)
RuntimeError: storage has wrong size: expected -772918636240159923 got 64

Even if I create the file using f_out.write(f_in.read()) I still get the same error. The decompressed contents of test.pt.gz are byte-for-byte identical to test.pt when I compare them using f.read().
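
In case it's useful, this is roughly how I compared them (a minimal sketch using the filenames from the MWE above; the assertion passes, so the gzip round trip itself isn't corrupting anything):

import gzip

# The decompressed gzip stream should match the original file byte-for-byte
with open('test.pt', 'rb') as f1, gzip.open('test.pt.gz', 'rb') as f2:
    assert f1.read() == f2.read()

Loading works fine if I open the uncompressed file directly: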

with open('test.pt', 'rb') as f:
    state_dict = torch.load(f)

The expected size is sometimes negative, which leads me to believe it could be some kind of integer underflow, but it also changes on every run, so it must be related to the randomly initialized weights as well.
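
If it helps anyone hitting the same thing: reading the decompressed bytes into an in-memory buffer first seems to avoid the error (a sketch, assuming torch.load only needs an ordinary seekable file-like object, so the GzipFile is never handed to it directly):

import gzip, io
import torch

# Decompress everything into memory, then load from a plain BytesIO
# instead of passing the GzipFile object to torch.load
with gzip.open('test.pt.gz', 'rb') as f:
    buffer = io.BytesIO(f.read())
state_dict = torch.load(buffer)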

I'm using:

  • OS: Ubuntu 16.04
  • PyTorch version: 0.3.1
  • Installed via: conda
  • Python version: 3.5.2
