Hi Jules,
When we use juce::File::loadFileAsData() to load large files, we noticed that it's much slower in version 3 than in 1.5. It turns out the issue is in juce::MemoryOutputStream:
juce::File::loadFileAsData() calls juce::InputStream::readIntoMemoryBlock() with numBytes set to -1:
...
    return in.openedOk() && getSize() == (int64) in.readIntoMemoryBlock (destBlock);
}
which calls juce::MemoryOutputStream::writeFromInputStream(), again with maxNumBytesToWrite set to -1:
...
    MemoryOutputStream mo (block, true);
    return (size_t) mo.writeFromInputStream (*this, numBytes);
}
which then determines the correct number of bytes to read, but never usefully pre-allocates them: with maxNumBytesToWrite still -1, the clamp below doesn't fire, and the (size_t) cast passes a wrapped-around value to preallocate():
int64 MemoryOutputStream::writeFromInputStream (InputStream& source, int64 maxNumBytesToWrite)
{
    // before writing from an input, see if we can preallocate to make it more efficient..
    int64 availableData = source.getTotalLength() - source.getPosition();

    if (availableData > 0)
    {
        if (maxNumBytesToWrite > availableData)
            maxNumBytesToWrite = availableData;

        if (blockToUse != nullptr)
            preallocate (blockToUse->getSize() + (size_t) maxNumBytesToWrite);
    }

    return OutputStream::writeFromInputStream (source, maxNumBytesToWrite);
}
Changing
if (maxNumBytesToWrite > availableData)
to
if ((maxNumBytesToWrite > availableData) || (maxNumBytesToWrite < 0))
solves the issue.
Cheers,
Marcus
