This is a nitpick, and I'm not even using these routines, but while analyzing some JUCE audio code I noticed that the application of gain ramps (important for preventing discontinuities in the output waveform when parameters change over time) is sensitive to the buffer size.
Specifically, the duration of the ramp depends on AudioDevice::getCurrentBufferSizeSamples(), because the fade is always performed over numSamples in getNextAudioBlock(). If the buffer is large, the ramp takes longer, and any user interface control tied to the gain will have perceptible lag. For example, a 2560-sample DirectSound buffer at 44,100 Hz gives a fade of about 58 ms, while a 64-sample ASIO buffer at 44,100 Hz gives about 1.5 ms. Huge difference!
Just my opinion, but any object that exposes a gain feature should also have a gainRampMilliseconds parameter or constant, and the number of samples over which the new gain is faded should be calculated from the sample rate and that value. Depending on the buffer size and the duration of the ramp, this might require code that carries the ramp state across multiple calls to getNextAudioBlock().
Like I said, just a nitpick, and certainly not something that I am asking to be changed, but it was worth pointing out.