Weird error when loading 242 samples


I encountered a pretty weird bug on OS X El Capitan.

If I load > 242 samples at once, the audio format manager fails to create a memory mapped reader.

This happens only in standalone mode; as an AU / VST plugin it loads correctly. Also, if I attach Xcode to it, everything is fine.

The loading operation is happening in a ThreadWithProgressWindow (which is also the crashing thread according to the crash report)

I can think of two possible culprits:

1. Some weird file permission mixup (so that the standalone version does not have the necessary rights to load multiple files)

2. Some problems with the modality of the progress window.

Does anyone have experience with this kind of issue?


Uninitialized values? Do you have a stack trace? What kind of error is it?


The problem is not a crash: createReaderFor() returns a nullptr, and I can avoid crashing by checking the return value. But I can't imagine why it does this, because the file clearly exists and is a valid WAV file. I even have a fallback so that a standard AudioFormatReader is created in case the MemoryMappedAudioFormatReader doesn't work. It also doesn't matter which files I load: any 241 samples and everything is fine; 242 samples and createReaderFor() returns a nullptr and all hell breaks loose.

And I checked again for uninitialized variables, this isn't the case.
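For context, the "memory-mapped first, buffered reader as fallback" pattern can be sketched outside JUCE with plain POSIX calls; `loadFile` and the paths below are illustrative names, not the actual project code:

```cpp
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>
#include <vector>

// Try to memory-map a file; on failure, fall back to an ordinary buffered
// read - mirroring the MemoryMappedAudioFormatReader -> AudioFormatReader
// fallback described above.
std::vector<char> loadFile (const char* path)
{
    std::vector<char> data;
    int fd = open (path, O_RDONLY);
    if (fd < 0)
        return data;                        // like createReaderFor() returning nullptr

    struct stat st {};
    if (fstat (fd, &st) == 0 && st.st_size > 0)
    {
        void* mapped = mmap (nullptr, (size_t) st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (mapped != MAP_FAILED)           // memory-mapped path succeeded
        {
            data.assign ((char*) mapped, (char*) mapped + st.st_size);
            munmap (mapped, (size_t) st.st_size);
        }
        else                                // fallback: plain buffered read
        {
            data.resize ((size_t) st.st_size);
            if (read (fd, data.data(), data.size()) != (ssize_t) data.size())
                data.clear();
        }
    }
    close (fd);                             // each open file costs one descriptor
    return data;
}
```

Note that both paths consume one file descriptor while the file is open, which turns out to matter below.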



Surely you can step into the code and see how far it gets in trying to create the reader?


It throws an assertion in FileInputStream::read():


The file path is valid, but the fileHandle is a nullptr.

Also this happens only if I start the Debug build and attach Xcode to the process. If I run the Debug build from within Xcode, everything is fine.

The result member of FileInputStream sheds some light on the darkness: it contains an error message ("Too many open files"). But why is this a problem? I am used to having thousands of samples loaded.


Your OS has a limit called the file descriptor limit. It is there for security reasons and to make sure no user blocks all resources; you can find a lot of information by searching the net for that term. Each open file uses one file descriptor, and the limit can be set per user, per process, or system-wide.

The problem will go away if you close your open files from time to time. It is not possible to have an unlimited amount of files open at a time. You can set the limits on your machine arbitrarily high (probably at least 65535, haven't checked), but that should not be the solution to a design problem...
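Querying the current limit programmatically is straightforward with POSIX `getrlimit`; a minimal sketch (the function name is illustrative):

```cpp
#include <sys/resource.h>
#include <utility>

// Returns the current {soft, hard} RLIMIT_NOFILE pair, or {0, 0} on error.
// rlim_cur is the enforced soft limit; rlim_max is the hard ceiling that an
// unprivileged process may raise its soft limit up to.
std::pair<unsigned long long, unsigned long long> fileDescriptorLimits()
{
    struct rlimit lim {};
    if (getrlimit (RLIMIT_NOFILE, &lim) != 0)
        return { 0, 0 };

    return { (unsigned long long) lim.rlim_cur,
             (unsigned long long) lim.rlim_max };
}
```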

launchctl limit maxfiles

yields 256 allowed open file handles per process. Is this a default value, or is my MacBook stupid? 256 open files per process seems extremely low...


Alright, calling

sudo launchctl limit maxfiles 1000000 1000000

fixes the problem on my computer. Now back to the design level:

For every sample I load, I create a MemoryMappedAudioFormatReader as a member variable. It is then used to load the sample in a background thread.

There is virtually no way to stay within the 256-open-files-per-process limit. There may even be cases where >256 voices are playing simultaneously, and creating a new reader for every read operation (it is buffered with 8192 samples) is out of the question because of the unnecessary performance penalty.


Also, if I am running as a plugin (and as a standalone launched from within Xcode), the max file handle limit is bypassed. Is there an OS call that defines the max file access for a process, which I could e.g. call from my constructor?

#include <sys/resource.h> // for setrlimit / RLIMIT_NOFILE

struct rlimit fileLimit;
fileLimit.rlim_cur = 200000; // soft limit (enforced)
fileLimit.rlim_max = 200000; // hard limit (raising it may need privileges)

setrlimit (RLIMIT_NOFILE, &fileLimit);

After a bit of googling I found the necessary system call, but my "stupid hack" meter is off the charts. I am wondering why nobody else has ever run into this problem - I surely can't be the first idiot trying to load more than 256 files at once...
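A slightly less hacky version of that call raises only the soft limit, clamped to the hard limit, which an unprivileged process is always allowed to do; a sketch, with `desired` as an illustrative parameter:

```cpp
#include <algorithm>
#include <sys/resource.h>

// Raise RLIMIT_NOFILE's soft limit towards `desired`, never above the hard
// limit (raising the hard limit itself needs root). Never lowers the limit.
// Returns the soft limit actually in effect afterwards.
unsigned long long raiseOpenFileLimit (unsigned long long desired)
{
    struct rlimit lim {};
    if (getrlimit (RLIMIT_NOFILE, &lim) != 0)
        return 0;

    rlim_t target = (rlim_t) desired;
    if (lim.rlim_max != RLIM_INFINITY)
        target = std::min (target, lim.rlim_max);  // clamp to hard ceiling

    if (target > lim.rlim_cur)
    {
        lim.rlim_cur = target;
        setrlimit (RLIMIT_NOFILE, &lim);           // may still fail; re-query
        getrlimit (RLIMIT_NOFILE, &lim);
    }
    return (unsigned long long) lim.rlim_cur;
}
```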


I think you're trying to solve the problem with the wrong measures. Just having a file handle open doesn't mean that you can start reading from that location in no time. You still depend on the speed of your disk and risk blocking your audio thread while waiting for a file to read.

You will need to design something that has all resources in memory before you hand them to the audio thread.

And yes, I think a lot of development hours or even years have gone into this problem, but I doubt anyone uses this approach in production code...


Of course I am not reading from the disk in the audio thread. There is a thread pool that prebuffers the files, which is a must for large sample libraries (they can't all be loaded into memory). However, this worker thread reads chunks of 8192 samples (before they get accessed by the audio thread), and creating a new AudioFormatReader for every read operation comes with unnecessary overhead.

Re production code: I checked some other audio software, and Ableton Live seems to keep file handles open for its Sampler. KONTAKT, on the other hand, only opens a file handle while it actually streams the sample, so there is no single definitive way to do this.
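The prebuffering scheme described above (a worker filling fixed-size chunks before the audio thread touches them) can be sketched with a simple producer/consumer queue; all names here are illustrative, not the actual voice engine, and a real audio thread would fail over to silence rather than block:

```cpp
#include <algorithm>
#include <condition_variable>
#include <cstddef>
#include <deque>
#include <mutex>
#include <thread>
#include <vector>

constexpr std::size_t kChunkSize = 8192;   // samples per prebuffered chunk

// Worker-side prebuffering: read fixed-size chunks from a source ahead of
// time, so the consumer only ever takes ready-made buffers.
class Prebufferer
{
public:
    explicit Prebufferer (std::vector<float> source) : source_ (std::move (source))
    {
        worker_ = std::thread ([this] { fill(); });
    }

    ~Prebufferer() { worker_.join(); }

    // Consumer side: blocks until a chunk is ready; returns false when the
    // source is exhausted.
    bool pop (std::vector<float>& out)
    {
        std::unique_lock<std::mutex> lock (m_);
        cv_.wait (lock, [this] { return ! chunks_.empty() || done_; });
        if (chunks_.empty())
            return false;
        out = std::move (chunks_.front());
        chunks_.pop_front();
        return true;
    }

private:
    void fill()   // runs on the worker thread
    {
        for (std::size_t pos = 0; pos < source_.size(); pos += kChunkSize)
        {
            const std::size_t n = std::min (kChunkSize, source_.size() - pos);
            std::vector<float> chunk (source_.begin() + pos, source_.begin() + pos + n);
            {
                std::lock_guard<std::mutex> lock (m_);
                chunks_.push_back (std::move (chunk));
            }
            cv_.notify_one();
        }
        { std::lock_guard<std::mutex> lock (m_); done_ = true; }
        cv_.notify_one();
    }

    std::vector<float> source_;
    std::deque<std::vector<float>> chunks_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
    std::thread worker_;
};
```

Whether the underlying reader (and its file descriptor) stays alive between chunk reads is exactly the design question being debated here.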


I really would like to gather more information and best practices regarding file handles (because of my OS X noobness).

Jules, you mentioned somewhere that you are using the MemoryMappedAudioFormatReader in Tracktion - would you recommend keeping the file handles open (which I assume is the same thing as creating a reader object with the same lifetime as the actual sample object)?

If yes, did you use that weird system call to change the maximum allowed file number? Having more than 256 audio files in a DAW project is definitely no edge case.


Do not open the file each time you read; instead, have a notion of a session per file, in order to open and close it accordingly.

For example, keep the file open while it is being read by the audio thread.

You can use some kind of lazy allocation of this reader, with acquire/release semantics on the file.

Still, you will probably need to increase the limit on OS X if you plan to stream more than 256 files at the same time.
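The acquire/release idea can be sketched as a small session wrapper that opens the descriptor lazily on the first acquire and closes it when the last user releases it; all names are illustrative:

```cpp
#include <cstdio>
#include <string>

// Lazily opened, reference-counted file session: the descriptor exists only
// while at least one reader holds an acquire() on it.
class FileSession
{
public:
    explicit FileSession (std::string path) : path_ (std::move (path)) {}

    std::FILE* acquire()
    {
        if (users_++ == 0)                        // first user opens the file
            handle_ = std::fopen (path_.c_str(), "rb");
        return handle_;
    }

    void release()
    {
        if (--users_ == 0 && handle_ != nullptr)  // last user closes it
        {
            std::fclose (handle_);
            handle_ = nullptr;
        }
    }

private:
    std::string path_;
    std::FILE* handle_ = nullptr;
    int users_ = 0;   // not thread-safe; a real engine would use atomics
};
```

This keeps the descriptor count bounded by the number of files actually streaming, rather than the number of files loaded.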


We use some quite complex buffering in Tracktion... We've never hit a problem with this, but JUCE should already increase your rlimit in any OS X app - have a look at line 96.


Hmm, weird - I had to call setrlimit manually to "fix" the bug. What are possible reasons for it not being called?


No idea - maybe put a breakpoint in there and see if it gets hit?


The thing is that if I debug it, it runs within Xcode and inherits its process max-file settings.

But when I add these fabulous lines into RLimitInitialiser():

float* x = nullptr;
x[2] = 3.0f;

it crashes - poor man's debugging at its best :)

The problem is the RLIM_INFINITY value. If I use another large number, setrlimit has its desired effect.

RLIM_INFINITY is defined as

(((__uint64_t) 1 << 63) - 1)

__uint64_t is a typedef for unsigned long long on my system. So far nothing is wrong with that, but I'm starting to get the feeling something is extremely awkward here.
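Whether the kernel accepts RLIM_INFINITY as a soft RLIMIT_NOFILE is platform-specific (on OS X it is commonly rejected, which matches the behaviour described above), and a small probe makes that visible; a sketch, with the function name being illustrative:

```cpp
#include <sys/resource.h>

// Returns true if the kernel accepts RLIM_INFINITY as the soft RLIMIT_NOFILE.
// The original limits are restored before returning, so this is a pure probe.
bool acceptsInfiniteFileLimit()
{
    struct rlimit old {};
    if (getrlimit (RLIMIT_NOFILE, &old) != 0)
        return false;

    struct rlimit lim = old;
    lim.rlim_cur = RLIM_INFINITY;                 // leave rlim_max untouched
    const bool ok = (setrlimit (RLIMIT_NOFILE, &lim) == 0);

    setrlimit (RLIMIT_NOFILE, &old);              // restore the original limit
    return ok;
}
```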




Don't really know, but it could indeed be that inheriting Xcode's settings is what's messing it up..


But the problem occurs outside of Xcode, so it can't be the culprit...


You need to check the return value of setrlimit.

The current JUCE code does not always work.

I had to do this in my code:


#include <sys/resource.h>

class FileLimitInitializer
{
public:
    FileLimitInitializer()
    {
        struct rlimit oldlim;
        oldlim.rlim_cur = oldlim.rlim_max = 0;
        int err = getrlimit (RLIMIT_NOFILE, &oldlim);

        // If the old limit reads as "infinite", treat it as unknown (0)
        if (err == 0 && oldlim.rlim_cur == RLIM_INFINITY && oldlim.rlim_max == RLIM_INFINITY)
            oldlim.rlim_cur = oldlim.rlim_max = 0;

        // First try the unlimited case...
        struct rlimit lim;
        lim.rlim_cur = lim.rlim_max = RLIM_INFINITY;
        err = setrlimit (RLIMIT_NOFILE, &lim);

        // ...and if that fails, step upwards through finite values,
        // keeping the largest soft limit the kernel accepts
        if (err != 0)
        {
            size_t i = 1;
            do
            {
                lim.rlim_cur = 800 * i;
                err = 0;
                if (oldlim.rlim_cur < lim.rlim_cur)
                    err = setrlimit (RLIMIT_NOFILE, &lim);
            }
            while (err == 0 && ++i <= 20);
        }
    }
};