Midi Plugin using MidiOutput?

I am developing a chord progression plugin. By pushing a button, you can send midi to another (VST) instrument. So this plugin creates midi data and immediately sends it out (to the VST host).
Am I supposed to use the MidiOutput class for this plugin? If so, what would the output device be?

If it’s a MIDI effect (i.e. it interacts with the DAW like a “normal” plugin and sends MIDI out in the usual way) you should just add the MIDI events to the MidiBuffer that is passed to AudioProcessor::processBlock.

thanks reuk, so even when there is no midi input, processBlock is where I can add my own midi events?
Sounds good :+1:

Is a correct value of samplePos relevant in the statement below, i.e. when adding MIDI messages to the buffer (in processBlock)? I’ve seen a couple of examples where samplePos is not initialized or used in any way.

processedBuffer.addEvent(message, samplePos);

I am getting good midi timing behaviour when running my plugin as a standalone app, but have a big midi lag when running it as a plugin (cubase).

samplePos is relevant for positioning the MIDI message within the buffer. It’s true that leaving it at 0 will be OK in most cases: the message will be positioned at the buffer start, so at worst it will be bufferSize samples early. But IMO you should always use it when relevant (not sure it is in your case).

Regarding latency when used as a plugin: what buffer size are you using in Cubase? Anything above 512 will result in audible latency when playing/recording, audio or MIDI.

This is what processBlock looks like:

void MySynthAudioProcessor::processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midiMessages)
{
    juce::MidiBuffer::Iterator iterator (midiMessages);
    juce::MidiMessage message, currentMessage;
    int samplePos = 0;   // must be initialised: it is read below before the iterator ever sets it

    processedBuffer.clear();

    // flush midi notes received from editor:
    while (mEditorNote.DataWaiting())
    {
        mMidiNote val = mEditorNote.Read();
        if (val.on) message = juce::MidiMessage::noteOn (val.channel, val.note, (juce::uint8) val.velocity);
        else        message = juce::MidiMessage::noteOff (val.channel, val.note, (juce::uint8) 0);
        processedBuffer.addEvent (message, samplePos);   // editor events land at the start of the block
    }

    while (iterator.getNextEvent (currentMessage, samplePos))
    {
        if (mEditorActive)
        {
            // hold and send to editor:
            if (currentMessage.isNoteOnOrOff())
            {
                mProcessorNote.Write (currentMessage.getChannel(), currentMessage.getNoteNumber(),
                                      currentMessage.getVelocity(), currentMessage.isNoteOn());
            }
            else
            {
                // pass-thru:
                processedBuffer.addEvent (currentMessage, samplePos);
            }
        }
        else
        {
            // pass-thru:
            processedBuffer.addEvent (currentMessage, samplePos);
        }
    }

    midiMessages.swapWith (processedBuffer);
}

I am “intercepting” MIDI note events and sending them to the editor (read: midi keyboard) for visualization.
At the same time, MIDI notes received from the editor (incl. midi keyboard pressed, chord buttons pressed, etc.) are added to the processedBuffer. I have two circular buffers for sending/receiving note data.
Works OK as standalone app, but not as plugin.

You did not answer my question: what buffer size are you using in your DAW (Cubase) when running your plugin?
If I understand your design correctly, you will always have latency anyway, since you are triggering events on the UI thread, which is by nature asynchronous and low priority.
So if for any reason the UI thread does not have a lot of processing time (because the project is CPU intensive), your UI will become less responsive. Since you trigger MIDI events from the UI, your messages will arrive whenever they can, which can be after a long delay.
This is a bad design.
You could allow the user to map buttons to a hardware MIDI keyboard. That way, the MIDI messages from the keyboard would come on their own thread, and you’ll be able to catch them in your audio callback without much latency.

44100 Hz / 256 samples. The lag is way more than that: it’s several seconds.
Thanks for explaining.
Looks like a VST plugin can’t have the editor do part of the sound generation calculations.
I will have to rethink my plugin design.

There are quite a few VSTs on the market (e.g. Sylenth) that can trigger notes from the editor without noticeable latency. I can accept some latency for this plugin.
Still think my main problem is in the (mis-)use of the samplePos variable, so any guidance in that direction would be appreciated!

If latency is acceptable for you then it’s ok.
samplePos is the offset of the message within the current buffer. So it should satisfy 0 <= samplePos < bufferSize for the message to be played in the current buffer.

With CCs the position might be acceptable when set to zero, but with note events you definitely need correct virtual timing.
For this you need to do the complete creation of the MIDI events in the audio callback, not in the GUI part of the plugin.
The “editor” and the “processor” of a plugin are called individually by the DAW, with completely unrelated timing. In fact the processor uses “virtual” timing based on sample blocks, otherwise offline rendering would not be possible. With “live” playing, the virtual timing can be several blocks off from “real” time, depending on the audio device settings.
-Michael

Am aware of that. I added samplePos = 0 and all is fine.
Just wanted to be able to send midi from the GUI. Latency is not critical for this plugin.
Thanks