MIDI + offline rendering


#1

Howdy how!

I’ve managed to fork the plugin host to work as an offline rendering tool, and so far that feature has worked smoothly. I wanted to be able to render tracks out with basic DAW-like features, meaning I have a system for parameter automation and MIDI input for VSTi plugins. Both work fine - as long as I render in realtime - but for some reason the MIDI data I send to the graph doesn’t seem to do anything with any of the VSTi synths I’ve tried so far when I use the offline rendering side.
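For context, the offline side just drives the graph’s processBlock in a plain loop. Roughly like this (a simplified sketch; the function name and output handling are illustrative, not my exact code):

void renderOffline (CustomProcessorGraph& graph, double sampleRate,
                    int blockSize, juce::int64 totalSamples)
{
    graph.setNonRealtime (true);                 // flag offline mode for the whole graph
    graph.prepareToPlay (sampleRate, blockSize);

    juce::AudioBuffer<float> block (2, blockSize);
    juce::MidiBuffer midi;

    for (juce::int64 pos = 0; pos < totalSamples; pos += blockSize)
    {
        block.clear();
        midi.clear();
        graph.processBlock (block, midi);        // the override injects stored MIDI + automation
        // ... append `block` to an AudioFormatWriter / output file ...
    }

    graph.releaseResources();
    graph.setNonRealtime (false);
}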

What I know for certain:

  • isNonRealtime gets flagged properly for all the VST/VSTi in the graph for the offline render (see the sketch after this list)
  • The MIDI data I push to the graph also flows through the graph properly in offline render mode
  • When I render realtime, everything just works. No problems there.
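For reference, the flagging itself is just a matter of calling setNonRealtime before the render starts. A minimal sketch (AudioProcessorGraph propagates the flag to its node processors, but iterating them explicitly works too):

graph.setNonRealtime (true);

for (int i = 0; i < graph.getNumNodes(); ++i)
    if (auto* proc = graph.getNode (i)->getProcessor())
        proc->setNonRealtime (true);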

My solution is as follows:
I have replaced the filtergraph’s AudioProcessorGraph member with my own CustomProcessorGraph class, derived from AudioProcessorGraph, so that I can hook into the methods needed for implementing parameter automation and MIDI handling (prepareToPlay, releaseResources, processBlock, acceptsMidi, producesMidi). In the processBlock method I implement my parameter automation and MIDI handling, and then call the original processBlock from AudioProcessorGraph to make things actually happen.

At first I sent messages through a MidiMessageCollector, but then I read in the docs that it "Collects incoming realtime MIDI messages and turns them into blocks suitable for processing by a block-based audio callback", so for offline rendering I replaced that with pushing the MIDI messages directly into the midiMessages buffer that processBlock receives as its second parameter. Both of these approaches work when I run realtime, but neither produces any output - other than silence, that is - when running offline.
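In outline, the subclass looks something like this (a minimal sketch; everything except the overridden methods is illustrative):

class CustomProcessorGraph : public juce::AudioProcessorGraph
{
public:
    void prepareToPlay (double newSampleRate, int blockSize) override
    {
        sampleRate = newSampleRate; // remember the rate for sample/second conversions
        juce::AudioProcessorGraph::prepareToPlay (newSampleRate, blockSize);
    }

    void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midi) override
    {
        // ... inject recorded MIDI events and apply parameter automation here ...
        juce::AudioProcessorGraph::processBlock (buffer, midi); // let the graph do the real work
    }

    bool acceptsMidi() const override  { return true; }
    bool producesMidi() const override { return true; }

private:
    juce::MidiMessageSequence midiMessages; // loaded/recorded MIDI data, millisecond timestamps
    double sampleRate = 44100.0, time = 0.0;
};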

I am quite sure that I’m doing something horribly wrong here, as I’m not at all confident that I’m using these properly, but I just can’t figure out any other way of implementing the MIDI data recording and playback functionality.

Here is the crucial part of my solution (processBlock):

void CustomProcessorGraph::processBlock (AudioBuffer<float>& buffer, MidiBuffer& midiMessages) {
    
    if(playing) {
        
        // length of this block in seconds
        double delta = buffer.getNumSamples() / this->sampleRate;

        // because of certain recording related reasons we need to use different methods for midi data handling in offline vs realtime mode
        if(isNonRealtime()) {
            int nextEventIndex = this->midiMessages.getNextIndexAtTime(time * 1000); // using millisecond resolution
            if(nextEventIndex != this->midiMessages.getNumEvents()) {
                double nextEventTime = this->midiMessages.getEventTime(nextEventIndex) / 1000; // using millisecond resolution
                const MidiMessage message = this->midiMessages.getEventPointer(nextEventIndex)->message;
                if(nextEventTime >= time && nextEventTime < time + delta) {
                    DBG((String)time + " : # " + (String)nextEventIndex + " @ " + (String)nextEventTime + " : " + message.getDescription() + " // " + (String)delta);
                    // sample offset of the event within this block
                    int eventDelta = (int) ((nextEventTime - time) * sampleRate);
                    midiMessages.addEvent(message, eventDelta);
                }
            }
            
        } else {
            // if recording, add messages from incoming midiMessages stream to our local MidiMessageSequence that is used to handle the saving and loading of midi data
            if(recording && !midiMessages.isEmpty()) {
                juce::MidiBuffer::Iterator iterator (midiMessages);
                juce::MidiMessage msg;
                int sampleNum = 0;
                while (iterator.getNextEvent (msg, sampleNum))
                {
                    addMessageToList(msg);
                }
            } else {
                // else insert loaded midi event messages into our player's midiMessages stream
                int nextEventIndex = this->midiMessages.getNextIndexAtTime(time * 1000); // using millisecond resolution
                if(nextEventIndex != this->midiMessages.getNumEvents()) {
                    double nextEventTime = this->midiMessages.getEventTime(nextEventIndex) / 1000; // using millisecond resolution
                    const MidiMessage message = this->midiMessages.getEventPointer(nextEventIndex)->message;
                    if(nextEventTime >= time && nextEventTime < time + delta) {
                        DBG((String)time + " : # " + (String)nextEventIndex + " @ " + (String)nextEventTime + " : " + message.getDescription() + " // " + (String)delta);
                        sendMidiMessage(message);
                    }
                }
            }
        }
        
        // search for automation changes and calculate new change vectors when necessary
        for(int i = 0; i < automations.size(); i++) {
            for(int j = 0; j < automations[i]->values.size(); j++) {
                AutomationPoint* ap = automations[i]->values[j];
                if(ap->t >= time && ap->t < time + delta) {
                    if(j == 0) ap->p->setParameter(ap->i, ap->v); // if encountering the very first data point, set it as the begin value
                    if(j + 1 < automations[i]->values.size()) { // check whether a following point exists
                        // if it does, calculate the per-block change vector: slope (dv/dt) towards the next point, times the block length
                        AutomationPoint* next = automations[i]->values[j + 1];
                        float v = (next->v - ap->v) / (next->t - ap->t) * delta;
                        bool found = false;
                        for(int p = 0; p < adjustments.size(); p++) {
                            if(adjustments[p]->p == ap->p && adjustments[p]->i == ap->i) {
                                adjustments[p]->v = v;
                                found = true;
                                break;
                            }
                        }
                        // this is the first one for this processor and parameter, so create a new automation vector
                        if(!found) adjustments.push_back(new AutomationVector(ap->p, ap->i, v));
                    } else {
                        // was the last, so make sure to erase the adjustment as it is not needed any more, and set the last value to the given last data point
                        for(int p = 0; p < adjustments.size(); p++) {
                            if(adjustments[p]->p == ap->p && adjustments[p]->i == ap->i) {
                                adjustments.erase(adjustments.begin() + p);
                                break;
                            }
                        }
                        ap->p->setParameter(ap->i, ap->v);
                    }
                }
            }
        }
        
        // adjust parameters according to the calculated change vectors
        for(int i = 0; i < adjustments.size(); i++) {
            adjustments[i]->p->setParameter(adjustments[i]->i, adjustments[i]->p->getParameter(adjustments[i]->i) + adjustments[i]->v);
            // DBG((String)i + ": " + (String)adjustments[i]->v);
        }
        
        // update time + GUItime
        time += delta;
        GUIUpdateAccumulator += delta;
        
        // update GUI when enough time has accumulated
        if(GUIUpdateAccumulator >= GUIUpdateStep) {
            mainControls->updateTime();
            GUIUpdateAccumulator -= GUIUpdateStep;
        }
    }
    
    AudioProcessorGraph::processBlock(buffer, midiMessages);
}
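For reference, the automation structures the code refers to look roughly like this (reconstructed from their usage above; field order and exact types are illustrative):

// one automation breakpoint: processor, parameter index, value, time in seconds
struct AutomationPoint  { juce::AudioProcessor* p; int i; float v; double t; };

// an active per-parameter ramp; v is the value step applied per block
struct AutomationVector
{
    AutomationVector (juce::AudioProcessor* proc, int idx, float val) : p (proc), i (idx), v (val) {}
    juce::AudioProcessor* p; int i; float v;
};

// one parameter lane: an ordered list of breakpoints
struct Automation { std::vector<AutomationPoint*> values; };

std::vector<Automation*> automations;        // all lanes
std::vector<AutomationVector*> adjustments;  // ramps currently being applied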

And this is the way I hook up the system in the graph:

The midiMessages stream that I push gets passed down to every synth that the Midi In node is connected to, and the wiring is implemented just as it is in an unmodified plugin host code base. It works in realtime, as stated, and I can see the messages getting passed along in offline mode too when I run a debug build.
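In graph code that wiring amounts to something like this (a sketch using the newer JUCE graph API; createSynthInstance() is a hypothetical factory returning a std::unique_ptr<juce::AudioProcessor> for the VSTi, and older JUCE versions use addConnection (sourceNodeId, channel, destNodeId, channel) instead):

using IOProcessor = juce::AudioProcessorGraph::AudioGraphIOProcessor;

// the graph's MIDI input node, as created by the stock plugin host
auto midiIn = graph.addNode (std::make_unique<IOProcessor> (IOProcessor::midiInputNode));
auto synth  = graph.addNode (createSynthInstance());

// route MIDI from the input node to the synth
graph.addConnection ({ { midiIn->nodeID, juce::AudioProcessorGraph::midiChannelIndex },
                       { synth->nodeID,  juce::AudioProcessorGraph::midiChannelIndex } });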

I have inlined everything into this one method for this example so that it contains everything MIDI-related I’ve implemented so far. And yes, I know there are a lot of suboptimal solutions here that could be improved, but please note that I have confirmed everything above to work just fine when running realtime, so the implementation itself is logically correct, works, and produces proper results. I’m going to improve the mechanisms as soon as I get things working right in both rendering modes, realtime and offline.


#2

Where do the MIDI messages disappear? Do you feed them directly into CustomProcessorGraph::processBlock?


#3

Don’t know where they disappeared to; I just managed to get it to work today by retracing my steps and reimplementing basically the same thing I had done already. But now it crashes on Windows :) Oh well…

But yes, I just mangle the midiMessages object before passing it on to the underlying AudioProcessorGraph on the last line of CustomProcessorGraph::processBlock. The midiMessages get populated by the MIDI keyboard component that sits at the bottom of the plugin host window. It passes them to the CustomProcessorGraph, and inside the graph they get passed on to any AudioProcessor that is connected to the Midi In node.

Anyway, now it seems to work, even though I don’t have the slightest clue why it didn’t work before. All I have to do now is figure out why these changes cause a crash on Windows but not on OS X…