Problem writing large WAV file

While trying to write a large WAV file, I found a bug in WavAudioFormatWriter::write

I have seen the problem on both macOS and Windows.

Here is the input to the method: numChannels = 2, numSamples = 67110000, bitsPerSample = 32

The following statement gives the wrong answer because of an integer overflow:

const size_t bytes = numChannels * (unsigned int) numSamples * bitsPerSample / 8;
bytes = 9088

The solution I propose:

const size_t bytes = (size_t) numChannels * numSamples * bitsPerSample / 8;
bytes = 536880000

Is the solution valid?
Can this problem happen with other file formats?


WavAudioFormatWriter code:

bool write (const int** data, int numSamples) override
{
    jassert (numSamples >= 0);
    jassert (data != nullptr && *data != nullptr); // the input must contain at least one channel!

    if (writeFailed)
        return false;

    const size_t bytes = numChannels * (unsigned int) numSamples * bitsPerSample / 8;
    tempBlock.ensureSize (bytes, false);

Use a WAV RF64 file


Jon, thank you for the feedback. Your solution is valid, but I still think there is a bug in JUCE that should be corrected.

Let’s look at the overflow in detail and how to prevent it.

const size_t bytes = numChannels * (unsigned int) numSamples * bitsPerSample / 8;
= 2 * 67110000 * 32 / 8
= 4295040000 / 8
=> overflow, because the intermediate product 4295040000 is greater than 2^32 (4294967296), so it wraps around before the division by 8.

To correct it, we need to cast to size_t (a 64-bit unsigned integer on 64-bit platforms) so the whole product is evaluated in 64 bits:
const size_t bytes = (size_t) numChannels * numSamples * bitsPerSample / 8;


And where is this going to be saved inside the regular WAV RIFF format, which only gives you 4 bytes for the size field?


@Rail_Jon_Rogut The WAV format writer actually takes care of this for you - if you write more than 2^32 bytes it should automatically switch into RF64 mode without needing to do anything.

@flacours Sure, that expression would overflow if given a huge number, but numSamples isn’t the length of the whole file, it’s just the length of the single block of data that you’re appending to it… i.e. a block that will typically never exceed a few KB! I guess that if you were trying to write a huge block of data from a memory-mapped file then it could be a problem so I’ll tweak it to cope with that, but it’d be a very very obscure edge-case use of the class!

…actually, just looking at the method, it’d be insane to call that method with a huge block of data, because it has to be copied and re-structured in a temp memory block before being written, so if you called it with 2^32 samples, it’d be allocating and copying gigabytes each time!

@jules I agree with you, the way we use the method is inefficient. I will correct that.
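
For reference, the inefficiency Jules points out can be avoided by writing the data in fixed-size blocks, so any temporary buffer inside the writer stays a few KB regardless of the total length. This is a generic sketch of the pattern in plain C++ (writeInBlocks and its callback are hypothetical illustrations, not JUCE APIs):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Generic chunked-write pattern: the write callback only ever sees
// small blocks, so no single call has to allocate or copy a huge
// temporary buffer.
template <typename WriteFn>
bool writeInBlocks (const std::vector<int>& samples, size_t blockSize, WriteFn write)
{
    for (size_t start = 0; start < samples.size(); start += blockSize)
    {
        const size_t n = std::min (blockSize, samples.size() - start);
        if (! write (samples.data() + start, n)) // stop on the first failed write
            return false;
    }
    return true;
}
```

In JUCE terms, this corresponds to calling the writer repeatedly with modest numSamples values rather than once for the whole file.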

Thank you !