I'm a beginner with JUCE, currently trying to build a harmonizer using the SoundTouch library.
To do this I'm combining what I learned in the JUCE tutorials with the code of a working, open-source JUCE harmonizer project that uses SoundTouch.
I took the simple audio processing project from the JUCE tutorials as a base, and I'm trying to transform the output buffer with a pitch shift from the SoundTouch library instead of simply adding white noise.
The first problem is that my project uses a getNextAudioBlock function, while the other project uses a processBlock function. I struggle to understand exactly what differentiates the two, and whether I need the former, the latter, or both.
My hunch right now is that I need one or the other but not both, and that processBlock somehow calls getNextAudioBlock implicitly (apparently getNextAudioBlock is a callback that the system calls itself to get its batch of samples to feed to the audio hardware, so it seems it's always needed in an audio app). But I'd like it if someone could explain precisely the difference between the two.
The second problem is that the other project calls getSampleData on a buffer, and that function is now deprecated. If I understood correctly, the equivalent is to use getWritePointer on a buffer I want to write to, and/or getReadPointer on a buffer I want to read from. I'd like confirmation on that too.
Thanks for your help!