Conversation

@nikithauc
Contributor

fixes #452

Problem -

The `StreamUpload.sliceFile` function reads the next n bytes from the stream.

  • If an upload fails after bytes 0 - n have been read from the stream, then on upload resume the `sliceFile` function reads from byte n+1 onward, and bytes 0 - n are never uploaded.

Solution -

  • Added a `ChunkRecord` interface that keeps track of the previously read slice and its byte range.
  • Compare the new upload range with the previous range and read the bytes from the cached slice or from the stream accordingly.
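The idea above can be sketched as follows. This is a minimal, hypothetical model of the fix, not the SDK's actual code: the `ChunkRecord` name and the `sliceFile` shape mirror the PR description, while the `StreamSlicer` class and the `Uint8Array`-backed byte source are invented stand-ins for a forward-only stream.

```typescript
interface Range {
  minValue: number;
  maxValue: number;
}

// Record of the most recently read slice, as described in the PR.
interface ChunkRecord {
  chunk: Uint8Array; // bytes previously read from the stream
  range: Range;      // the byte range those bytes cover (inclusive)
}

// Hypothetical slicer over a forward-only byte source.
class StreamSlicer {
  private pos = 0; // next unread offset in the "stream"
  private previous?: ChunkRecord;

  constructor(private source: Uint8Array) {}

  // Models reading n bytes from a stream that cannot seek backwards.
  private readNext(n: number): Uint8Array {
    const out = this.source.subarray(this.pos, this.pos + n);
    this.pos += out.length;
    return out;
  }

  // Returns bytes [minValue, maxValue]. If the caller re-requests a range
  // that was already consumed (an upload retry), serve it from the cached
  // ChunkRecord instead of reading further into the stream.
  sliceFile(range: Range): Uint8Array {
    if (
      this.previous &&
      range.minValue >= this.previous.range.minValue &&
      range.maxValue <= this.previous.range.maxValue
    ) {
      const offset = range.minValue - this.previous.range.minValue;
      const length = range.maxValue - range.minValue + 1;
      return this.previous.chunk.subarray(offset, offset + length);
    }
    const chunk = this.readNext(range.maxValue - range.minValue + 1);
    this.previous = { chunk, range };
    return chunk;
  }
}
```

With this cache, a retry of a failed chunk re-reads the same bytes instead of silently skipping past them, which is the behavior reported in the linked issue.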

@nikithauc nikithauc merged commit 710de2d into dev Jun 4, 2021
@nikithauc nikithauc mentioned this pull request Jun 8, 2021
@nikithauc nikithauc deleted the bugfix/stream-resume branch September 28, 2022 23:48


Successfully merging this pull request may close these issues.

Large File Upload - retry doesn't work for StreamUpload