I have two questions. First, what are the main threads of an audio plugin application? I assume the AudioProcessor runs on one thread and the AudioProcessorEditor on another? How are they handled internally?
Second, suppose I want to do heavy rendering and real-time image processing for visualization. How can I do that without hurting the responsiveness of the DAW's UI?

Currently I do the work inside the AudioProcessorEditor's paint method, using a Timer to repaint the UI (I have tried various refresh rates). However, I can't get a smooth result no matter what I do, and, more importantly, the DAW's own UI smoothness suffers as well. I have also tried moving the computation to the audio thread (the AudioProcessor) instead, but then I run into synchronization issues between the data being computed and the data being rendered (for instance, the left and right channels don't look the same even with a mono signal), and it eats audio CPU cycles.

Is there a general approach for generating heavy visualizations? Should I simply do it on another thread and implement my own synchronization mechanism? Thanks.
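For context, the kind of hand-rolled synchronization mechanism I have in mind is something like the sketch below: a lock-free single-producer/single-consumer FIFO where the audio thread pushes samples and the GUI/render thread drains them. The class and method names are my own invention for illustration (I believe juce::AbstractFifo exists for a similar purpose); this is plain standard C++, not a definitive implementation.

```cpp
#include <atomic>
#include <cstddef>
#include <vector>

// Single-producer / single-consumer ring buffer: the audio thread pushes,
// the GUI/render thread pops. No locks, so the audio callback never blocks.
class VisualizationFifo {
public:
    explicit VisualizationFifo(std::size_t capacity)
        : buffer(capacity), head(0), tail(0) {}

    // Called only from the audio thread. Drops the sample if the FIFO is
    // full: visualization can tolerate loss, the audio thread must never wait.
    bool push(float sample) {
        const std::size_t h = head.load(std::memory_order_relaxed);
        const std::size_t next = (h + 1) % buffer.size();
        if (next == tail.load(std::memory_order_acquire))
            return false; // full
        buffer[h] = sample;
        head.store(next, std::memory_order_release);
        return true;
    }

    // Called only from the GUI/render thread.
    bool pop(float& sample) {
        const std::size_t t = tail.load(std::memory_order_relaxed);
        if (t == head.load(std::memory_order_acquire))
            return false; // empty
        sample = buffer[t];
        tail.store((t + 1) % buffer.size(), std::memory_order_release);
        return true;
    }

private:
    std::vector<float> buffer;
    std::atomic<std::size_t> head, tail;
};
```

If the audio thread pushed left/right pairs through a single FIFO like this (interleaved, in one call site), I imagine the two channels could no longer drift apart the way they do in my current setup, since the GUI thread would always see them in the order they were produced.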