I am having a problem with adding MIDI events in one plug-in host, while the same code works fine in another host. So I am trying to write code that works in all hosts, but I am a bit lost as to how to solve this.
I am currently using the following code to add my own MIDI events to the event stream (stolen from MidiKeyboardState::processNextMidiBuffer):
void RibbonToNotesAudioProcessor::PlayNextMidiMessages (juce::MidiBuffer& midiMessages,
                                                        const int startSample,
                                                        const int numSamples)
{
    const int firstEventToAdd = notesToPlayBuffer.getFirstEventTime();
    const double scaleFactor = numSamples / (double) (notesToPlayBuffer.getLastEventTime() + 1 - firstEventToAdd);

    for (const auto metadata : notesToPlayBuffer)
    {
        const auto pos = juce::jlimit (0, numSamples - 1, juce::roundToInt ((metadata.samplePosition - firstEventToAdd) * scaleFactor));
        auto message = metadata.getMessage();
        midiMessages.addEvent (message, startSample + pos);
    }

    notesToPlayBuffer.clear();
}
This works properly in Logic Pro. But I am using Blue Cat’s Patchwork to debug my audio plugin (since Logic will not allow debugging). Blue Cat’s Patchwork seems to get confused by this code (it only handles some of the events, with all kinds of weird hanging notes as a consequence). Both hosts are loading the same AudioUnit.
Apparently there is a difference between the plug-in hosts. But how do I get this to work in both hosts?
I have made a workaround by adding only one event per process block (see code below). This works in Blue Cat’s Patchwork, but in Logic you now get very noticeable latency and strange behaviour if the sample buffer is > 512. So I added a second check that makes it work in Logic again when the sample buffer is > 512. If I now set the sample buffer above 512 in Blue Cat’s Patchwork, the problem still exists, so it is not a real solution, just a workaround. If anyone has a proper solution for this problem, please let me know.
//WORKAROUND
void RibbonToNotesAudioProcessor::PlayNextMidiMessages (juce::MidiBuffer& midiMessages,
                                                        const int startSample,
                                                        const int numSamples)
{
    juce::MidiBuffer tmpBuffer;
    int i = 0;
    const int firstEventToAdd = notesToPlayBuffer.getFirstEventTime();
    const double scaleFactor = numSamples / (double) (notesToPlayBuffer.getLastEventTime() + 1 - firstEventToAdd);

    for (const auto metadata : notesToPlayBuffer)
    {
        const auto pos = juce::jlimit (0, numSamples - 1, juce::roundToInt ((metadata.samplePosition - firstEventToAdd) * scaleFactor));
        auto message = metadata.getMessage();

        if (i < 1 || numSamples > 512)
        {
            // first message of the buffer is executed
            midiMessages.addEvent (message, startSample + pos);
        }
        else
        {
            // other messages are copied to a temporary buffer
            tmpBuffer.addEvent (message, juce::Time::getMillisecondCounterHiRes() * 0.001 - startTime);
        }

        i++;
    }

    // swap the buffers, so the executed message is removed from the buffer
    notesToPlayBuffer.swapWith (tmpBuffer);
}
DAWs usually expect your plugin to post only MIDI events for the current buffer, not events that are ‘for the future’.
So if you have events that are meant for the future, you need to maintain some state for them, and then at the start of each block figure out which of the planned events are due for playback in the current block.
Usually that’s done using something like the playhead/PPQ position coming from the host to sync with the host timeline - not something like getMillisecondCounterHiRes(), which would not work correctly in most host processing situations, because blocks aren’t processed at the same speed at which the user hears them.
In an offline render, for example, the host might immediately call your plugin for all the buffers of the full song, so the timer readings would only be milliseconds apart between blocks even when the events you’re posting are meant to represent minutes or hours of music.
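To illustrate the principle: the idea is to queue events with an absolute timeline position (in samples here; a real plugin might use the playhead position), and at the start of each block move the events that fall inside that block into the output at a block-relative offset. This is only a minimal sketch; `PendingEvent` and `flushDueEvents` are hypothetical names, not part of JUCE:

```cpp
#include <algorithm>
#include <cstdint>
#include <utility>
#include <vector>

// Hypothetical event type standing in for a timestamped juce::MidiMessage.
struct PendingEvent
{
    int64_t absoluteSample;          // position on the host timeline, in samples
    uint8_t status, data1, data2;    // raw MIDI bytes
};

// Called at the start of each block: events that fall inside this block are
// returned together with their block-relative sample offset; events further
// in the future stay queued for a later block.
std::vector<std::pair<int, PendingEvent>> flushDueEvents (std::vector<PendingEvent>& pending,
                                                          int64_t blockStartSample,
                                                          int numSamples)
{
    std::vector<std::pair<int, PendingEvent>> due;
    std::vector<PendingEvent> stillPending;

    for (const auto& e : pending)
    {
        if (e.absoluteSample < blockStartSample + numSamples)
        {
            // Events that were already overdue are clamped to offset 0.
            auto offset = (int) std::max<int64_t> (0, e.absoluteSample - blockStartSample);
            due.push_back ({ offset, e });
        }
        else
        {
            stillPending.push_back (e);  // not due yet: keep for a later block
        }
    }

    pending = std::move (stillPending);
    return due;
}
```

In a real processor the offsets returned here would be passed straight to `midiMessages.addEvent()`, so every event lands inside the block the host is currently asking for.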
Thank you for your reply, but I do not understand your answer. I am posting events that are to be played now, not at some point in the future.
To be clear, I am not using the millisecond counter for the output (it is only used in the workaround to store messages in a temporary buffer). The events that I add to midiMessages are added using startSample + pos, not the millisecond counter. Furthermore, the events I add are supposed to be played immediately, not at some point in the future. So I want them to be processed within milliseconds (not minutes or hours). That is why I use the same code as the MidiKeyboardState: I want my MIDI effect to behave like a keyboard that immediately plays what the user is playing, with some added notes.
So again, I am not sure what you mean.
If you’re posting events from a MidiKeyboardState or something similar in the GUI, you would normally post all the queued events once at the start of the buffer at sample pos #0.
If you’re trying to capture the timing of those events (say if you want the user to play rhythmically on screen, and then post those events into the buffer), then you would need to create some model that posts events into the live/DAW buffer based on the timeline inside each buffer - but in that case you would need to debug your logic and check that you are actually posting the events at the correct positions.
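The first option (everything at sample position 0) can be sketched without any timing model at all. This is a simplified stand-in for the GUI-to-audio hand-off, not actual JUCE code - `SimpleMidiQueue` and its members are assumed names, and a real plugin would prefer a lock-free FIFO over a mutex:

```cpp
#include <cstdint>
#include <mutex>
#include <vector>

// Minimal sketch of the "post everything at sample 0" approach.
// The GUI thread pushes messages; the audio thread drains them at the
// start of each block.
struct SimpleMidiQueue
{
    struct Msg { uint8_t status, data1, data2; };

    // Called from the GUI / message thread.
    void postFromGui (Msg m)
    {
        std::lock_guard<std::mutex> lock (mutex);
        queued.push_back (m);
    }

    // Called from processBlock(): every queued event lands at offset 0,
    // i.e. the very start of the current buffer.
    template <typename AddEventFn>
    void drainToBlock (AddEventFn&& addEvent)
    {
        std::lock_guard<std::mutex> lock (mutex);
        for (const auto& m : queued)
            addEvent (m, 0);   // sample position 0 within this block
        queued.clear();
    }

    std::mutex mutex;
    std::vector<Msg> queued;
};
```

In a JUCE processor, `addEvent` here would simply forward to `midiMessages.addEvent()`.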
If you’re posting events from a MidiKeyboardState or something similar in the GUI, you would normally post all the queued events once at the start of the buffer at sample pos #0.
Yes, that is what I am trying to do. Is the code (see my first post) correct for doing this?:
void RibbonToNotesAudioProcessor::PlayNextMidiMessages (juce::MidiBuffer& midiMessages,
                                                        const int startSample,
                                                        const int numSamples)
{
    const int firstEventToAdd = notesToPlayBuffer.getFirstEventTime();
    const double scaleFactor = numSamples / (double) (notesToPlayBuffer.getLastEventTime() + 1 - firstEventToAdd);

    for (const auto metadata : notesToPlayBuffer)
    {
        const auto pos = juce::jlimit (0, numSamples - 1, juce::roundToInt ((metadata.samplePosition - firstEventToAdd) * scaleFactor));
        auto message = metadata.getMessage();
        midiMessages.addEvent (message, startSample + pos);
    }

    notesToPlayBuffer.clear();
}
Because as explained: it works properly in Logic Pro, but not in Blue Cat’s Patchwork.
By keeping it to one MIDI message per process block, it seems to work properly in Blue Cat’s Patchwork (see the workaround). But obviously this is less than ideal, since it introduces unnecessary latency. With a small buffer size this is doable, but with a larger buffer size it is not.
Anyway, this might be a problem specific to Blue Cat’s Patchwork as a host. Maybe it is not a problem that I can solve in the code itself. On the other hand, Blue Cat’s Patchwork has no problem with the direct input of a keyboard playing many MIDI notes at once, so it still feels as if some solution should be possible.
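For what it’s worth, JUCE ships a helper for exactly this GUI-to-audio hand-off: juce::MidiMessageCollector. Whether it fixes the Patchwork behaviour would need testing, but it at least produces per-block timestamps the way hosts expect. A wiring sketch (fragment only, assuming the processor owns the collector as a member):

```cpp
// In the processor class:
juce::MidiMessageCollector midiCollector;

// In prepareToPlay(), so the collector knows how to convert time to samples:
midiCollector.reset (sampleRate);

// From the GUI / message thread, timestamped with the wall-clock time
// the collector expects:
auto msg = juce::MidiMessage::noteOn (1, 60, (juce::uint8) 100);
msg.setTimeStamp (juce::Time::getMillisecondCounterHiRes() * 0.001);
midiCollector.addMessageToQueue (msg);

// In processBlock(), merge everything collected since the last block
// into this block at host-friendly sample positions:
midiCollector.removeNextBlockOfMessages (midiMessages, buffer.getNumSamples());
```

This sidesteps the manual rescaling in PlayNextMidiMessages entirely, since the collector distributes the queued events across the current block itself.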