Best practice for timing (Audio thread)

Hello there,

As I am learning JUCE and having a lot of fun (and while every answered question brings up ten new ones), I was wondering what the best practice for timing would be.

Timing in this case means for example:

  • step sequencer / arp or anything else generating midi output
  • ADSRs and similar
  • basically everything that would go into the audio thread and not the ui

I am not quite getting the grasp of what to do when and where. How do I pre-render and make sure it's played at the right time? What about real-time things like arps? Do I get the timing from a MIDI clock, or rather from the sample rate and just count?

Also, it would be interesting to know what to do if the UI has to reflect the timing (like in a step sequencer).

I checked some of the tutorial projects and the documentation; there is a lot there to reverse engineer, but there is no real best practice.

So many questions…


In the audio thread you should just count in samples yourself. (This could become problematic if it is a MIDI-only plugin and the length of the incoming audio buffer is zero. I am not sure whether that is a common situation with DAW applications, or what should be done in that case. Edit: this was recently discussed here on the forum. It looks like MIDI-only plugins are not currently supported at all for VST3 in JUCE.)
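To make the sample-counting idea concrete, here is a minimal sketch (plain C++, no JUCE dependencies, all names are my own) of a clock that counts elapsed samples across processBlock calls and reports the offsets within the current block where a new step would fire:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical sketch: accumulates a sample counter across blocks and
// reports the sample offsets (within the current block) at which a new
// sequencer step begins.
class StepClock
{
public:
    void prepare (double sampleRate, double stepsPerSecond)
    {
        samplesPerStep = static_cast<int64_t> (sampleRate / stepsPerSecond);
        sampleCount = 0;
    }

    // Call once per audio block with the block's length in samples.
    std::vector<int> processBlock (int numSamples)
    {
        std::vector<int> stepOffsets;
        for (int i = 0; i < numSamples; ++i)
        {
            if (sampleCount % samplesPerStep == 0)
                stepOffsets.push_back (i);   // a step starts at this offset
            ++sampleCount;
        }
        return stepOffsets;
    }

private:
    int64_t samplesPerStep = 1;
    int64_t sampleCount = 0;
};
```

In a real plugin you would use these offsets as the sample positions when adding events to the outgoing MidiBuffer, so the timing stays sample-accurate regardless of the host's block size.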

There is no straightforward best solution for the GUI updates. The usual way for custom components to do it is to have a timer in the GUI that polls the audio processor’s state and repaints as needed. This does not necessarily result in perfect sync between the audio and the GUI, but it’s the easiest way to implement it and works well enough in many cases.
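As an illustration of the polling approach, here is a small sketch (assumed names, plain C++ with std::atomic standing in for the processor's shared state; in JUCE the polling body would live in a juce::Timer's timerCallback() followed by repaint()):

```cpp
#include <atomic>

// The processor publishes its current step atomically from the audio
// thread; no locks, so the audio thread never blocks on the GUI.
struct ProcessorState
{
    std::atomic<int> currentStep { 0 };   // written in processBlock
};

// Simulates the GUI-side timer callback polling at, say, 30 Hz.
class StepDisplay
{
public:
    explicit StepDisplay (ProcessorState& s) : state (s) {}

    // Returns true when a repaint would be triggered.
    bool timerCallback()
    {
        const int step = state.currentStep.load (std::memory_order_relaxed);
        if (step != lastDrawnStep)
        {
            lastDrawnStep = step;   // remember what we last painted
            return true;            // in JUCE: call repaint() here
        }
        return false;               // nothing changed, skip the repaint
    }

private:
    ProcessorState& state;
    int lastDrawnStep = -1;
};
```

The repaint lags the audio by up to one timer interval, which matches the caveat above: not perfectly in sync, but simple and good enough for most step-sequencer displays.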


In this thread:

I try to do timing based on sample counting for a MIDI plugin. By default this has no audio channels, and so the buffer size seems to be 0.

This is a good example of why I am asking for best practice…

Like I mentioned above, if it's a VST3 MIDI-only plugin, it just isn't going to work properly in JUCE. If it isn't a VST3 MIDI-only plugin, did you actually check that the AudioBuffer given to the plugin has a length of 0? AudioBuffer allows for the special case where the channel count is 0 but there is still a length:

AudioBuffer<float> testbuf(0, 512);
jassert(testbuf.getNumSamples() == 512);

The assert isn’t hit during debugging.


The buffer length was actually 0 because there were no audio channels. Still, this was just an example of why I would love to get some best-practice advice.

If you really get buffers of length zero (not just buffers where the channel count is zero), I don't think there's much you can do properly. Maybe you could look at the buffer length given in prepareToPlay and make your processBlock work based on that, but that likely isn't going to work in all situations, and you may well get a length of zero there too. But you should test it, just in case…
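For what it's worth, the fallback idea could look like this (a plain-C++ sketch with made-up names, untested against real hosts; in JUCE the values would come from prepareToPlay's samplesPerBlockExpected and the buffer passed to processBlock):

```cpp
// Assumed workaround: remember the block size the host reported in
// prepareToPlay and fall back to it whenever processBlock hands us a
// zero-length buffer.
class FallbackBlockSize
{
public:
    void prepareToPlay (int samplesPerBlockExpected)
    {
        expectedBlockSize = samplesPerBlockExpected;
    }

    // Length to use for sample counting in the current block.
    int effectiveNumSamples (int bufferNumSamples) const
    {
        return bufferNumSamples > 0 ? bufferNumSamples
                                    : expectedBlockSize;   // fallback
    }

private:
    int expectedBlockSize = 0;
};
```

Note that prepareToPlay's value is only what the host *expects* to use; hosts are free to call processBlock with smaller or varying sizes, which is exactly why this fallback may drift and needs testing per host.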