[quote="fabian, post:9, topic:17280, full:true"]
Yeah, you can’t set kAudioUnitProperty_MaximumFramesPerSlice - I tried that as well. I think the current JUCE code is correct.[/quote]
I’d say it’s not correct:
The value reported for kAudioUnitProperty_MaximumFramesPerSlice changes right after calling AudioUnitInitialize. The current code is asking for a value that hasn't been initialized yet!
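Something along these lines is what I mean - create the unit, initialize it, and only then ask for the property. This is just a rough sketch, not the actual JUCE code, and the function name is made up:

#include <AudioToolbox/AudioToolbox.h>

// Sketch: only query kAudioUnitProperty_MaximumFramesPerSlice once the unit
// has been initialized - before that the reported value may not be the one
// that is actually used.
static UInt32 queryMaxFramesPerSlice (AudioUnit audioUnit)
{
    if (AudioUnitInitialize (audioUnit) != noErr)
        return 0;

    UInt32 maxFrames = 0;
    UInt32 size = sizeof (maxFrames);

    if (AudioUnitGetProperty (audioUnit, kAudioUnitProperty_MaximumFramesPerSlice,
                              kAudioUnitScope_Global, 0, &maxFrames, &size) != noErr)
        return 0;

    return maxFrames;
}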
As I mentioned, the audioDeviceIOCallback documentation already says that you should “make sure your code can cope with reasonable changes in the buffer size from one callback to the next”. The value returned by currentBufferSizeSamples should be the common case - in this case whatever AVAudioSession says - NOT the worst case.
When talking about total latency, the values reported by the AVAudioSession latency getters need to be taken into account in addition to the latency caused by buffering. The buffering always adds latency on top of that - the bigger the buffer size, the longer the latency.
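Roughly like this - a sketch in Objective-C++ (e.g. in a .mm file), assuming the AVAudioSession getters report only the hardware/driver part and not our own buffer:

#import <AVFoundation/AVFoundation.h>

// Sketch: total latency = our own buffering + the hardware/driver latency
// that AVAudioSession reports.
static double totalLatencySeconds (int bufferSizeSamples, double sampleRate)
{
    AVAudioSession* session = [AVAudioSession sharedInstance];

    const double bufferLatency = bufferSizeSamples / sampleRate;

    return bufferLatency + session.inputLatency + session.outputLatency;
}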
For me it seems to change only if I’ve tried setting it before. But I agree it would be safer to query it after the initialize.
Yes, I’ve been meaning to bug Jules about this documentation. It’s the same for the prepareToPlay method, which we have recently updated to be more specific, i.e. that it’s almost always the maximum buffer size - not the common case.
I assumed the latency returned by AVAudioSession would take at least the HAL buffer into account. Need to recheck this… however, there is also [AVAudioSession sharedInstance].IOBufferDuration. I always assumed that this would return the “common” buffer size, as it is always smaller than kAudioUnitProperty_MaximumFramesPerSlice. I guess we need a way to report the IOBufferDuration value to the user.
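Something like this is the comparison I have in mind - an Objective-C++ sketch with made-up names, not the actual JUCE code:

#import <AVFoundation/AVFoundation.h>
#include <AudioToolbox/AudioToolbox.h>
#include <cmath>

// Sketch: the session's "common" buffer size per I/O cycle vs. the worst-case
// slice the audio unit may ask for. Assumes an already-created audioUnit.
static void logBufferSizes (AudioUnit audioUnit, double sampleRate)
{
    AVAudioSession* session = [AVAudioSession sharedInstance];

    const int commonFrames = (int) std::lround (session.IOBufferDuration * sampleRate);

    UInt32 maxFrames = 0;
    UInt32 size = sizeof (maxFrames);
    AudioUnitGetProperty (audioUnit, kAudioUnitProperty_MaximumFramesPerSlice,
                          kAudioUnitScope_Global, 0, &maxFrames, &size);

    NSLog (@"common = %d frames, maximum = %u frames", commonFrames, (unsigned) maxFrames);
}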
Any news on this? I still strongly believe that the block size should be the default case, and the maximum size of intermittent larger chunks may be reported by some other function - but it doesn’t need to be, as per the documentation of audioDeviceIOCallback.
I’ve looked into it and come to the conclusion that this is a Juce bug. If you do this on iOS:
AudioDeviceManager::AudioDeviceSetup desired;
audioDeviceManager.getAudioDeviceSetup (desired); // start from the current setup
desired.bufferSize = 256;                         // Or any number you want
audioDeviceManager.setAudioDeviceSetup (desired, true);

AudioDeviceManager::AudioDeviceSetup actual;
audioDeviceManager.getAudioDeviceSetup (actual);
Then actual.bufferSize is not what you’ve just set. It always ends up as whatever the value of kAudioUnitProperty_MaximumFramesPerSlice is (1156 for me). This is not an iOS limitation: setPreferredIOBufferDuration accepts 256. This is what I’ve observed:
setAudioDeviceSetup calls iOSAudioIODevice::open, which calls updateCurrentBufferSize, which sets the buffer size on the AVAudioSession successfully. So far so good.
iOSAudioIODevice::open calls handleRouteChange, which calls createAudioUnit.
It creates a new audio unit, but doesn’t set the kAudioUnitProperty_MaximumFramesPerSlice property (see the sketch at the end of this post). The Apple documentation tells you that you should configure this property:
Your application should always configure and honor this property.
actualBufferSize is set to the value of the kAudioUnitProperty_MaximumFramesPerSlice property. Here, whatever the user has set before gets overwritten.
setAudioDeviceSetup sets currentSetup.bufferSize to the value that has just been overwritten.
The part that sets the preferred buffer duration using setPreferredIOBufferDuration is correct. It works in Juce and in other iOS apps. However, storing the MaximumFramesPerSlice in actualBufferSize is wrong if you haven’t set the property to actualBufferSize before. So I think this is a Juce bug.
So the value isn’t 1156 because setPreferredIOBufferDuration decided to use something other than 256 - it’s that value because it gets assigned by Juce.
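To illustrate, here's a minimal sketch of the step I think is missing in createAudioUnit - set the property before AudioUnitInitialize, then read it back afterwards. The names are mine, not the actual JUCE members, and since fabian reported above that setting the property didn't work for him, take this purely as an illustration:

#include <AudioToolbox/AudioToolbox.h>

// Sketch: tell the unit the largest slice we are prepared to render per
// callback, before the unit is initialized (as Apple's documentation suggests).
static UInt32 configureMaxFramesPerSlice (AudioUnit audioUnit, UInt32 desiredBufferSize)
{
    AudioUnitSetProperty (audioUnit, kAudioUnitProperty_MaximumFramesPerSlice,
                          kAudioUnitScope_Global, 0,
                          &desiredBufferSize, sizeof (desiredBufferSize));

    AudioUnitInitialize (audioUnit);

    // Read the value back after initialization - this, not the
    // pre-initialization default, is what should end up in actualBufferSize.
    UInt32 actualFrames = 0;
    UInt32 size = sizeof (actualFrames);
    AudioUnitGetProperty (audioUnit, kAudioUnitProperty_MaximumFramesPerSlice,
                          kAudioUnitScope_Global, 0, &actualFrames, &size);
    return actualFrames;
}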
Just a heads-up, I’ve changed the iOS block size code once again. I now probe the block size after AudioUnitInitialize as suggested above. This seems to work much better.
I’ve encountered the same issue with the latency but the fix didn’t work for me completely.
I’m using the 4.2.1 version of Juce. I initialize the audio device using an XML backup of the audio device state. When I store the audio device state, the buffer size is set to 1156 instead of the buffer size I chose (consistent with the issue in this thread).
I first switched to the develop branch to get your fix. Once I had built and started my app, I had no sound at all. I had to change the buffer size manually in my software; afterwards the app worked fine. When I stopped and started the app again, the buffer size displayed was the one stored in the XML file (and also the one I chose), but again there was no sound. I have to force another buffer size.
I switched back to the standard 4.2.1 version and inserted the modification to the createAudioUnit method of juce_ios_Audio.cpp. I got the same behavior.
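For reference, this is roughly how the save/restore is done - a sketch using the standard AudioDeviceManager calls; the channel counts and error handling are simplified and this is not my actual code:

#include "JuceHeader.h" // the Projucer-generated header

void saveAndRestoreDeviceState (AudioDeviceManager& manager)
{
    // Store the current settings (device, sample rate, buffer size, ...) so
    // they can be written out as part of a settings file.
    ScopedPointer<XmlElement> state (manager.createStateXml());

    // Later, at startup, restore them from the saved XML:
    const String error = manager.initialise (2, 2,   // required input/output channels
                                             state,  // saved state (may be nullptr)
                                             true);  // fall back to defaults on failure

    if (error.isNotEmpty())
        DBG ("AudioDeviceManager::initialise failed: " << error);
}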
From 4.2.0 up to the current development branch I get no audio output on iOS with my old code (written for 4.1.0). It only works if another app is using the audio device too. What’s wrong?
Edit: but setting the block size and sample rate is working now.
The missing audio output could be a performance problem, not sure for now. I only have old test devices here and my synth eats a lot of CPU.
But there is something else wrong.
With a sample rate of 44 kHz and a block size of 1024 samples I get a callback for 1024 samples after about 21 ms. If I change the sample rate to 22 kHz, I still get the callback for 1024 samples after the same time - but it should be about 42 ms. So the sample rate doesn’t really seem to change - the block size seems OK.
Please take a look at this - I’m “here” for testing if you need.
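This is the kind of check that shows it - a sketch, not my actual code; MyCallback and the logging are just illustrative:

#include "JuceHeader.h" // the Projucer-generated header

// Sketch: measure the real time between audio callbacks and compare it with
// the period expected from the reported block size and sample rate.
struct MyCallback  : public AudioIODeviceCallback
{
    void audioDeviceAboutToStart (AudioIODevice* device) override
    {
        expectedMs = 1000.0 * device->getCurrentBufferSizeSamples()
                            / device->getCurrentSampleRate();
        lastTimeMs = 0.0;
    }

    void audioDeviceIOCallback (const float** /*input*/, int /*numInputs*/,
                                float** output, int numOutputs, int numSamples) override
    {
        const double now = Time::getMillisecondCounterHiRes();

        if (lastTimeMs > 0.0)
            DBG ("callback after " << (now - lastTimeMs) << " ms, expected ~" << expectedMs << " ms");

        lastTimeMs = now;

        // Output silence for this test.
        for (int ch = 0; ch < numOutputs; ++ch)
            if (output[ch] != nullptr)
                FloatVectorOperations::clear (output[ch], numSamples);
    }

    void audioDeviceStopped() override {}

    double expectedMs = 0.0, lastTimeMs = 0.0;
};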
Same sample rate problems with the JUCE demo app on iOS (iOS 9) - reducing the sample rate just pitches the test sound in the audio device settings - but does not change the sample rate.
Hi Fabian, maybe this will help - maybe not. If I use Hanley’s modified iOS audio files for AB then setting the sample rate and block size works fine.
PS: to unmangle the code in that post, just replace the HTML entities:
“&nbsp;” with “ ”
“&lt;” with “<”
“&gt;” with “>”
“&amp;” with “&”
Somebody should fix the forum’s code block merge bugs - it’s a pain to work around, it costs a lot of time, and it is not fair to the people who spent a lot of time writing and formatting those posts.
Additionally, a lot of links in the forum are broken - also not good and should be fixed.
In that particular post it’s not a display issue with the forum - that’s the text that he actually pasted into his post! I guess his browser/text editor/whatever mangled the &nbsp; stuff, but it’s not something that the forum is doing wrong, and we could only fix it by manually unmangling his text.
In that thread all the code blocks are broken - and I mean they looked OK in the old forum. So I was thinking that this was a merge error and maybe fixable.