Unpacking huge file contents from zip/tar archives and system memory use #202
Labels: bug (Something isn't working), enhancement (New feature or request), wishlist (Tasks or features of lowest priority)
The helper function that unpacks zip/tar archives makes an unnecessary in-memory copy of each individual archive member's contents in
https://github.com/ucphhpc/migrid-sync/blob/edge/mig/shared/archives.py#L132
and may therefore hit an out-of-memory (OOM) condition during unpack if any single file in the archive exceeds the available system memory.
The helper takes this step-wise approach to simplify error handling and out-of-bounds detection, but it should be optimized to stream reads and writes directly to the destination file instead of loading the complete file contents into memory first.
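A minimal sketch of the streaming approach, using the standard library's `zipfile` and `shutil.copyfileobj` to copy each member in fixed-size chunks. The function name, chunk size, and error handling here are hypothetical and differ from the real helper in `mig/shared/archives.py`; the path check stands in for the existing out-of-bounds detection:

```python
import os
import shutil
import zipfile

CHUNK_SIZE = 1024 * 1024  # 1 MiB per read keeps memory use bounded

def unpack_zip_streaming(archive_path, dst_dir):
    """Extract every member of archive_path under dst_dir by streaming.

    Hypothetical sketch: instead of member_content = zf.read(name),
    which buffers the whole file in memory, copy in CHUNK_SIZE pieces
    so memory use stays constant regardless of member size.
    """
    with zipfile.ZipFile(archive_path) as zf:
        for info in zf.infolist():
            target = os.path.join(dst_dir, info.filename)
            # Out-of-bounds check: reject members escaping dst_dir
            if not os.path.realpath(target).startswith(
                    os.path.realpath(dst_dir)):
                raise ValueError("illegal path in archive: %s"
                                 % info.filename)
            if info.filename.endswith('/'):
                os.makedirs(target, exist_ok=True)
                continue
            os.makedirs(os.path.dirname(target) or '.', exist_ok=True)
            with zf.open(info) as src, open(target, 'wb') as dst:
                # copyfileobj loops read/write with CHUNK_SIZE buffers
                shutil.copyfileobj(src, dst, CHUNK_SIZE)
```

The same pattern applies to tar archives via `tarfile.TarFile.extractfile`, which likewise returns a file-like object that can be streamed to disk.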
As a workaround, users will for the time being have to upload such huge files directly over SFTP or similar.