I am trying to understand how processBlock works in an audio plugin class. I think it is what I need, but I am not quite sure how everything fits together.
What I want to achieve is: starting from a WAV file, I want to process the audio data with my custom algorithm. The algorithm receives small pieces of the file defined by a window size (512, 1024 samples, etc.) and processes each window sequentially until the end of the file. What I have in mind is to fill an AudioSampleBuffer with the data from the whole file, pass small windows of samples from that buffer to the algorithm, and, once each window is finished, store the processed data in another buffer.
To check that things are working, I would like to take the output buffer filled with the processed data, write it to a WAV file, and listen to it, or play it back directly, maybe by wrapping it in an AudioFormatReader for example.
Ultimately, the real goal is that I have a few parameters set through the user interface via a slider or button, and I also need to be able to change those parameters in real time: if I move a slider, the parameter changes and the file keeps being processed, but with the updated value.
Could someone give me a bit of guidance on which path to take to achieve this? If I'm not mistaken, I would need to implement some of this in the processBlock of an AudioProcessor, or should it be done some other way?
Any help greatly appreciated.