I've copied and modified AudioProcessorPlayer (APP) in order to make a sequencer capable of routing realtime and sequenced MIDI events and audio through an AudioProcessorGraph. However, if I understand correctly, the graph can only receive the MIDI that its parent (in my case, the APP) hands it, so an AudioProcessorGraph::AudioGraphIOProcessor::midiInputNode only sees a single merged MIDI stream; I can't split incoming MIDI events by hardware source.
Am I correct that I will need to "roll my own" if I want to do this? Ideally, I'd like to be able to connect some VST to, say, "MPK Mini", and another VST to "Babyface MIDI In", and the beams wouldn't cross, so to speak.
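In case it clarifies the question, here's a rough, framework-agnostic sketch of the "roll my own" approach I'm imagining: tag each incoming event with its source device name and demultiplex into per-device sinks (each sink would feed its own graph node) before anything reaches the graph's shared MIDI input. Everything here is hypothetical naming, not actual JUCE API:

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>
#include <vector>

// Minimal stand-in for a MIDI message (status byte plus two data bytes).
struct MidiEvent { int status; int data1; int data2; };

// Hypothetical per-device router: each hardware source name maps to its own
// sink (e.g. a collector feeding one VST node), so events from "MPK Mini"
// never mix with events from "Babyface MIDI In".
class PerDeviceMidiRouter {
public:
    using Sink = std::function<void(const MidiEvent&)>;

    // Connect a named hardware source to a dedicated sink.
    void connect(const std::string& deviceName, Sink sink) {
        sinks[deviceName] = std::move(sink);
    }

    // Called from each device's input callback; the device name tags the
    // event, and only the matching sink ever sees it.
    void handleIncoming(const std::string& deviceName, const MidiEvent& e) {
        auto it = sinks.find(deviceName);
        if (it != sinks.end())
            it->second(e);
    }

private:
    std::map<std::string, Sink> sinks;
};
```

So instead of one global midiInputNode, each VST's node would pull from its own sink, and the beams stay uncrossed.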
I should end by saying -- as always -- JUCE rocks.