Downloading a large set of files causes iOS memory warning

iOS 8, latest JUCE

My app downloads a large set of files: over 30 of them, totalling 843 MB. In a for loop, each file is downloaded, unzipped and saved to disk before moving on to the next one. But as the download nears the end, I start getting "Received Memory Warning" with each file, until the app eventually crashes without finishing the download.

The MemoryBlock goes out of scope each time through the for loop, so I don't know why the memory allocation is persisting. I even tried calling reset() on the MemoryBlock. Here's my code (adapted from the Introjucer module download code):


for (/* loop through files */)
{
    URL url ("www.somepath.com");

    String destPath = /* path */;

    in = url.createInputStream (false, nullptr, nullptr, String::empty, 10000);

    if (in != nullptr)
    {
        MemoryBlock downloaded;
        in->readIntoMemoryBlock (downloaded);

        MemoryInputStream input (downloaded, false);
        ZipFile zip (input);

        File destFile (destPath);

        Result result = zip.uncompressTo (destFile, true);

        if (result.failed())
            DBG ("Uncompress failed: " + result.getErrorMessage());

        downloaded.reset();  // recently tried this, doesn't help
        deleteAndZero (in);
    }
}

Any ideas?

So it turns out the memory warnings were coming from a particularly large file near the end of the batch. In this case I'm able to break that file up and avoid the memory overload. But what about large files, like movies, that can't be broken up? Is there a way to download and write parts of the file to disk as the stream comes in?

Well, don't try to load it all into memory! Stream it to a file instead!

So just go directly from InputStream to Zip?


in = url.createInputStream (false, nullptr, nullptr, String::empty, 10000);

if (in != nullptr)
{
    File destFile = /* somePath */;

    ZipFile zip (in, true);
    Result result = zip.uncompressTo (destFile, false);
}

Seems obvious in retrospect. :) I assumed the MemoryBlock was necessary because that's how the Introjucer handled it.

No - although that'll probably work, it'll be incredibly slow, because the zip code expects to be able to seek in the file, and a URL stream will be very, very slow to seek. I meant that you should read it into a temporary file, and then use that.
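
For reference, something along these lines should work (a rough sketch, not tested; it assumes JUCE's TemporaryFile, FileOutputStream and OutputStream::writeFromInputStream, and the url / destFile variables are the same placeholders as in the snippets above):

ScopedPointer<InputStream> in (url.createInputStream (false, nullptr, nullptr, String::empty, 10000));

if (in != nullptr)
{
    TemporaryFile temp (".zip");

    {
        // Stream the download straight to disk in chunks instead of
        // buffering the whole file in a MemoryBlock.
        FileOutputStream out (temp.getFile());

        if (out.openedOk())
            out.writeFromInputStream (*in, -1);  // -1 = keep reading until the stream is exhausted
    }   // FileOutputStream is flushed and closed here

    // The zip code can now seek freely in a local file rather than a URL stream.
    ZipFile zip (temp.getFile());
    Result result = zip.uncompressTo (destFile, true);

    if (result.failed())
        DBG ("Uncompress failed: " + result.getErrorMessage());
}   // TemporaryFile deletes its file when it goes out of scope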