I’m new to Juce and working on porting an instrument from a higher-level language to C++ using Juce. I’m making a lot of progress on the UI side and in utilizing the Synthesiser class – I’m finding Juce to be a fantastic set of tools so far. However, I’m a bit lost now that I’m digging into the guts of passing audio around.
Let’s say I want to take the Juce demo’s simple synth as a starting point. I want to add a Juce IIR filter and a Juce Reverb in series after its output. What is the best way to go about doing this? I’m afraid I don’t have much intuition for how to work with audio buffers yet, so I’m hoping some of you can take mercy on a noob and give me a few pointers on how to break this simple problem down.
You could take a look at the AudioProcessorGraph class. That will let you create a signal graph of any number of AudioProcessors, which can then be routed any which way you like. It’s a very intuitive system and, like most things in this library, it works really well.
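To make that concrete, here is a rough sketch of wiring two processors in series with an AudioProcessorGraph. FilterProcessor and ReverbProcessor are hypothetical AudioProcessor subclasses, not real JUCE classes; the addNode/addConnection calls are the graph API as I understand it.

```cpp
// Hypothetical AudioProcessor subclasses wrapping an IIRFilter and a Reverb
AudioProcessorGraph graph;

// The graph takes ownership of processors added with addNode()
AudioProcessorGraph::Node* filterNode = graph.addNode (new FilterProcessor());
AudioProcessorGraph::Node* reverbNode = graph.addNode (new ReverbProcessor());

// Connect filter output -> reverb input, one connection per channel
graph.addConnection (filterNode->nodeId, 0, reverbNode->nodeId, 0);  // left
graph.addConnection (filterNode->nodeId, 1, reverbNode->nodeId, 1);  // right
```

Each connection is per-channel, which is what makes the arbitrary routing possible.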
Ah, ok. I guess I got the idea that AudioProcessor/AudioProcessorGraph were specific to hosting plug-ins, not more general purpose.
So if I wanted to, say, have a lowpass filter effect, would I do something like make a new class called LPFilter, have it inherit AudioProcessor and IIRFilter, and then give an LPFilter to my AudioProcessorGraph? Am I getting warm?
Depending on what you want to do, you can also go into the processBlock function of your synth and process the output buffer directly with the IIRFilter and Reverb classes from JUCE.
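A minimal sketch of that approach, assuming your plugin has Synthesiser, IIRFilter, and Reverb members (the member names here are my own, not from the demo):

```cpp
// Assumed members: Synthesiser synth; IIRFilter filterL, filterR; Reverb reverb;

void MyAudioProcessor::prepareToPlay (double sampleRate, int /*samplesPerBlock*/)
{
    synth.setCurrentPlaybackSampleRate (sampleRate);

    // One IIRFilter per channel -- each instance keeps its own state
    filterL.setCoefficients (IIRCoefficients::makeLowPass (sampleRate, 1000.0));
    filterR.setCoefficients (IIRCoefficients::makeLowPass (sampleRate, 1000.0));

    reverb.setSampleRate (sampleRate);
}

void MyAudioProcessor::processBlock (AudioSampleBuffer& buffer, MidiBuffer& midi)
{
    buffer.clear();
    synth.renderNextBlock (buffer, midi, 0, buffer.getNumSamples());

    // Filter each channel in place, then run the reverb over the stereo pair
    filterL.processSamples (buffer.getWritePointer (0), buffer.getNumSamples());
    filterR.processSamples (buffer.getWritePointer (1), buffer.getNumSamples());

    reverb.processStereo (buffer.getWritePointer (0),
                          buffer.getWritePointer (1),
                          buffer.getNumSamples());
}
```

The key point is that the synth renders into the buffer first, and then the filter and reverb just modify those same samples in series before the block is handed back to the host.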
Depending on what your goal is, both solutions provided are correct.
The graph approach is more complex (though not by much), but more powerful.
Yes, I think the graph approach is more what I’m looking for in the long run, so I’ll give it a shot. Thanks for the suggestions, everyone…
The great thing about the graph approach is that each node can very easily be a standalone plugin too. So if you create some really nice high-level effects, you can also share the lower-level processors as plugins. When a friend asks about the reverb you wrote for synth X, you can just give them the plugin containing that node to try for themselves. Nice.
p.s. It took me a while to realise the potential of Jules’ graph-based system. I’m still only learning to put the pieces together, but it seems really powerful. I’ve yet to try feeding the output of one node back into an earlier node to set up a feedback loop, but I’m guessing that’s possible.
Ok, after reading through the docs again, I’ve got a sense of how everything fits together, but I’m still in the weeds a bit.
So now I’ve created an AudioProcessorGraph, and I’m trying to start by adding two nodes to the graph: AudioGraphIOProcessors to take care of the input and output of the graph. I’m trying to add an input like this:
graphInput = new AudioProcessorGraph::AudioGraphIOProcessor(audioInputNode);
But that doesn’t do the trick, as audioInputNode is undefined. According to the docs I need to specify an IO device type. What am I missing?
As I mentioned, I’m still getting my head around C++ in general and how to create and manage objects, so be kind.
audioInputNode is an enum value declared inside the AudioGraphIOProcessor class, so you need to qualify it fully:
new AudioProcessorGraph::AudioGraphIOProcessor (AudioProcessorGraph::AudioGraphIOProcessor::audioInputNode);
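Putting it together, a sketch of creating both IO nodes and routing a synth node out to the hardware might look like this. The synthProcessor variable is a hypothetical AudioProcessor wrapping your synth; the typedef is just to keep the lines readable.

```cpp
typedef AudioProcessorGraph::AudioGraphIOProcessor IOProc;

AudioProcessorGraph graph;

// The IO device type is an enum nested in AudioGraphIOProcessor
AudioProcessorGraph::Node* inputNode  = graph.addNode (new IOProc (IOProc::audioInputNode));
AudioProcessorGraph::Node* outputNode = graph.addNode (new IOProc (IOProc::audioOutputNode));

// synthProcessor: a hypothetical AudioProcessor subclass containing the synth
AudioProcessorGraph::Node* synthNode = graph.addNode (synthProcessor);

// Route the synth straight to the graph's output, left and right
graph.addConnection (synthNode->nodeId, 0, outputNode->nodeId, 0);
graph.addConnection (synthNode->nodeId, 1, outputNode->nodeId, 1);
```

Once that compiles, inserting the filter and reverb is just a matter of adding their nodes and re-pointing the connections so the chain reads synth -> filter -> reverb -> output.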