AudioSource vs AudioProcessor

I am just wondering what the benefits/differences of an AudioSource vs an AudioProcessor are. I understand that an AudioProcessor is useful for plugins and for connecting to a GUI, but I'm not sure where an AudioSource might be useful.

I am developing a plugin that chains various AudioProcessors together. Each source AudioProcessor has a Synthesiser inside it that produces sound. The tutorial shows the sound coming from an AudioSource class, but from what I can tell, putting the AudioSource inside the AudioProcessor may be an unnecessary indirection for accomplishing the task.
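For what it's worth, a Synthesiser can be rendered straight from an AudioProcessor with no AudioSource in between. Here's a rough sketch of what I mean (class and member names are my own, and the remaining pure-virtual AudioProcessor overrides are omitted for brevity):

```cpp
#include <JuceHeader.h>

// Sketch: a Synthesiser driven directly from AudioProcessor::processBlock,
// no intermediate AudioSource. Assumes voices and sounds are added to the
// synth elsewhere (e.g. in the constructor).
class SynthProcessor : public juce::AudioProcessor
{
public:
    void prepareToPlay (double sampleRate, int /*samplesPerBlock*/) override
    {
        // The synth must know the sample rate before it can render.
        synth.setCurrentPlaybackSampleRate (sampleRate);
    }

    void processBlock (juce::AudioBuffer<float>& buffer,
                       juce::MidiBuffer& midiMessages) override
    {
        buffer.clear();
        // Render the synth's output directly into the processor's buffer.
        synth.renderNextBlock (buffer, midiMessages, 0, buffer.getNumSamples());
    }

    // ... the other AudioProcessor overrides go here ...

private:
    juce::Synthesiser synth;
};
```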

So I'm just wondering if there is some benefit I'm not seeing that the AudioSource would provide.

There's likely no reason to use AudioSource these days, unless you are actually working with the AudioSource-based classes like AudioTransportSource. I'm not completely sure what the purpose of the AudioSource in the JUCE synth tutorial is.

AudioSource-based classes are quite useful for chaining processing stages: for example, to play back a file you might use an AudioFormatReaderSource followed by a ResamplingAudioSource, and it's quite simple to create your own. Of course you can also do that with AudioProcessor, but AudioSource is more lightweight and only deals with audio.
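A sketch of that file-playback chain might look like this (assumes `someAudioFile` is a valid `juce::File` you supply; error handling and the surrounding playback plumbing are omitted):

```cpp
#include <JuceHeader.h>

// Sketch: chain AudioSource stages — read a file, then resample it.
void buildPlaybackChain (const juce::File& someAudioFile,
                         double deviceSampleRate, int blockSize)
{
    juce::AudioFormatManager formatManager;
    formatManager.registerBasicFormats(); // WAV, AIFF, etc.

    if (auto* reader = formatManager.createReaderFor (someAudioFile))
    {
        // The reader source takes ownership of the reader (second arg = true).
        auto readerSource = std::make_unique<juce::AudioFormatReaderSource> (reader, true);

        // Wrap the reader source in a resampler. Second arg = false because
        // the unique_ptr above already owns the wrapped source.
        juce::ResamplingAudioSource resampler (readerSource.get(), false,
                                               (int) reader->numChannels);
        resampler.setResamplingRatio (reader->sampleRate / deviceSampleRate);
        resampler.prepareToPlay (blockSize, deviceSampleRate);

        // Pulling one block from the last stage pulls through the whole chain.
        juce::AudioBuffer<float> buffer ((int) reader->numChannels, blockSize);
        juce::AudioSourceChannelInfo info (&buffer, 0, blockSize);
        resampler.getNextAudioBlock (info);

        resampler.releaseResources();
    }
}
```

In a real app you'd typically hand the last stage to an AudioTransportSource or AudioSourcePlayer rather than pulling blocks by hand, but the pattern is the same: each stage just wraps the previous one.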