Need some clarification on playing an audio file and using processBlock, AudioBlocks, AudioSourceChannelInfo, and getNextAudioBlock

Hi, I’m still in the learning phase of everything, so hopefully nothing I ask sounds too stupid. Right now in my plugin, I’m looking to just load a file and play it out through the process block. I’ve already watched the tutorials from the Audio Programmer and read the JUCE tutorials, however all of these use the getNextAudioBlock function. As I understand it, the processBlock and getNextAudioBlock functions are pretty directly related (and I think processBlock even calls getNextAudioBlock?), but I feel like I don’t have a good fundamental understanding of what these functions are actually doing, which is hindering me from setting things up properly to play the audio.

I’m currently able to play audio by just copying the values from the AudioBuffer containing my file into the AudioBuffer passed to processBlock, but I feel like there is probably a better way to be doing this. Right now, I don’t know when I should be using things like AudioSourceChannelInfo, AudioTransportSource, AudioFormatReaderSource, or possibly even AudioBlocks. Maybe just copying the AudioBuffer values is acceptable, but either way, hopefully someone can clarify what these classes are for and when I should be using them in a VST plugin. If my question isn’t clear or you have any other questions, just let me know. Thanks.
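For reference, what I’m doing is roughly along these lines (member names like fileBuffer and readPosition are just placeholders for this sketch; the file gets loaded into fileBuffer elsewhere beforehand):

```cpp
// Assumed members (names are placeholders for this sketch):
//   juce::AudioBuffer<float> fileBuffer; // filled once when the file is loaded
//   int readPosition = 0;                // current playback position in fileBuffer

void MyProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    buffer.clear();

    if (fileBuffer.getNumSamples() == 0)
        return; // nothing loaded yet

    const int numToCopy = juce::jmin (buffer.getNumSamples(),
                                      fileBuffer.getNumSamples() - readPosition);

    for (int ch = 0; ch < buffer.getNumChannels(); ++ch)
    {
        // reuse the file's channels if the plugin has more outputs than the file
        const int sourceChannel = ch % fileBuffer.getNumChannels();
        buffer.copyFrom (ch, 0, fileBuffer, sourceChannel, readPosition, numToCopy);
    }

    readPosition += numToCopy;

    if (readPosition >= fileBuffer.getNumSamples())
        readPosition = 0; // wrap around and keep looping
}
```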

In a VST plugin you’ll basically be working with an AudioProcessor, which relies on processBlock, so there’s no need to look at those other classes for now (the similarity between them is indeed a bit misleading for newcomers).

There are some good examples of playing audio in the JUCE tutorials (they use getNextAudioBlock, but it’s basically the same procedure). This one just loops through the audio file, but there’s also a tutorial solely for playing files.
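If you do later want to reuse the AudioTransportSource / AudioFormatReaderSource chain from the file-playing tutorial inside a plugin, the glue is roughly the sketch below (untested, member names are placeholders, and real code would need to handle swapping the file safely while the audio thread is running). The point is that an AudioSourceChannelInfo is just a lightweight view onto the host’s buffer, so processBlock can feed it straight into getNextAudioBlock:

```cpp
// Assumed members (placeholders for this sketch):
//   juce::AudioFormatManager formatManager; // registerBasicFormats() called once in the constructor
//   std::unique_ptr<juce::AudioFormatReaderSource> readerSource;
//   juce::AudioTransportSource transportSource;

void MyProcessor::prepareToPlay (double sampleRate, int samplesPerBlock)
{
    transportSource.prepareToPlay (samplesPerBlock, sampleRate);
}

void MyProcessor::releaseResources()
{
    transportSource.releaseResources();
}

void MyProcessor::loadFile (const juce::File& file) // hypothetical helper, e.g. called from the editor
{
    if (auto* reader = formatManager.createReaderFor (file))
    {
        readerSource.reset (new juce::AudioFormatReaderSource (reader, true));
        transportSource.setSource (readerSource.get(), 0, nullptr, reader->sampleRate);
        transportSource.start();
    }
}

void MyProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&)
{
    // AudioSourceChannelInfo just wraps the buffer the host handed us
    juce::AudioSourceChannelInfo info (buffer);
    transportSource.getNextAudioBlock (info);
}
```

That said, for simply playing back one preloaded file, the buffer-copying approach you already have is perfectly acceptable; the AudioSource classes mostly buy you conveniences like start/stop/position control and sample-rate correction.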