MIDI renderToFile outputting empty .wav files

My goal: build a JUCE/Tracktion Engine program that loads a virtual instrument plugin (into an Edit?), reads a MIDI file into the Edit, and renders .wav audio in non-realtime.

I consulted the answer to this post and added a new method with that code to the MidiRecordingDemo to test it out on recorded MIDI clips. I’m supplying clips to that method with the following code:

void renderAllMidiClips() {
    auto tracks = tracktion_engine::getAllTracks (*edit);
    int trackIdx = 0;
    for (auto track : tracks) {
        if (auto audioTrack = dynamic_cast<tracktion_engine::AudioTrack*> (track)) {
            for (auto clip : audioTrack->getClips()) {
                if (auto c = dynamic_cast<tracktion_engine::MidiClip*> (clip)) {
                    renderMidiClip (c, trackIdx);
                }
            }
        }
        ++trackIdx;
    }
}
With some console output I’ve verified that I am indeed only running the render function for the MIDI clips I’ve created, and the .wav files are being generated with what looks like the right length. However, the files contain no sound. I’ve also noticed that after a render, playing back the MIDI or playing my MIDI controller no longer produces any sound. Am I doing anything wrong? Any ideas for a fix?

What’s the definition of renderMidiClip?

You’ll probably need to call TransportControl::restartAllTransports to reconnect any Edits to the DeviceManager after a render.


void renderMidiClip (tracktion_engine::Clip* c, int trackIdx) {
    auto& edit = c->edit;
    auto clipPos = c->getPosition();

    BigInteger tracksToDo;
    tracksToDo.setBit (trackIdx);

    Array<tracktion_engine::Clip*> clips;
    clips.add (c);

    const bool usePlugins = true;
    File dir = File::getSpecialLocation (File::userDesktopDirectory);
    auto f = dir.getNonexistentChildFile (File::createLegalFileName (c->getName()), ".wav");

    tracktion_engine::Renderer::renderToFile (TRANS("Render Clip"), f, edit, clipPos.time, tracksToDo, usePlugins, clips, false);

    edit.getTransport().restartAllTransports (engine, true);
}

As per your suggestion I added the transport line, but the behaviour is the same (regardless of whether clearDevices is true or false).

@dave96 it would be great if I could get a solution for this; hopefully I’m just forgetting something obvious.

Sorry, I’m struggling a bit to keep up with everything whilst writing this new tracktion_graph module. It’s a big project and taking most of my brain power.

Can I clarify some things:

  1. Does the render actually work as you expect? I.e. do you get the wav files with audible content?
  2. After the call to restartAllTransports, is createPlaybackAudioNode in tracktion_EditPlaybackContext.cpp actually called?
  3. If so, is EditPlaybackContext::fillNextAudioBlock subsequently called?

No worries 🙂

  1. No, the render does not work. While .wav files are generated, they contain only silence.

  2. By placing a breakpoint in createPlaybackAudioNode() I can see it is called twice when renderToFile() is called, but it’s not called at all when restartAllTransports() is called. So to answer your question: no. Looking at the call hierarchy, I don’t see any point at which restartAllTransports() would reach it.

An aside: not all the buttons are visible in the MidiRecordingDemo, so I’ve put them on two rows so they fit; I can file a pull request for that. Also, memory usage seems to grow while the application is left idle, so there might be a memory leak somewhere.

Would it be possible to upload the whole file as a PIP so I can see what’s happening when you press the render button?

createPlaybackAudioNode shouldn’t be called during the render process, only outside of it. For rendering, createRenderingAudioNode is used (which doesn’t output to hardware devices).

I imagine there’s some problem with including the MIDI clip contents in the render if only blank files are created.

Below is the stack trace I see for createPlaybackAudioNode() called from renderToFile:

I’ve attached my examples/MidiRecordingDemo.h file, since that’s the only file I changed. You can run it by replacing the same file in the MidiRecordingDemo example; just load a VST, record some random MIDI in the track, and click the “Render” button I’ve added.
MidiRecordingDemo.h (14.5 KB)

One other thing I’ve noticed: as I mentioned, after the render finishes, playing back the MIDI produces no sound. However, when I open the synth plugin I’m using and hit the virtual keyboard on its UI, I can make sounds that way. Furthermore, if I delete and then re-add the synth plugin, I can hear sound again on MIDI playback. So I think the issue must be somewhere between the MIDI clip playback and the plugin.

Hi, checking in to see if anyone has any insights.

So I just grabbed the latest tracktion_engine, replaced the examples/MidiRecordingDemo.h file with the one you provided above, generated the examples with the script in the repo, then built and ran it.

I added an instance of 4OSC to the track, used MidiKeys for MIDI input, recorded a few seconds of MIDI and then rendered it. The resulting audio file had the MIDI notes in it. So as far as I can tell, the rendering process is working fine.

The only problem I can see is that the Edit is then silent afterwards.
To fix this, change line 114 to edit.getTransport().ensureContextAllocated (false);
(from edit.getTransport().restartAllTransports (engine, true);).

That should force the Edit to get re-attached to the DeviceManager.
I thought restartAllTransports did that but I must have been mistaken.
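To put that change in context, the tail of the renderMidiClip() helper from earlier in the thread would become something like this (a sketch reusing the same variables as the snippet above; not compiled here):

```cpp
// Sketch: end of renderMidiClip() with the suggested fix applied.
// Renderer::renderToFile() detaches the Edit from the audio device
// while it renders offline; ensureContextAllocated() re-attaches it
// to the DeviceManager so playback works again afterwards.
tracktion_engine::Renderer::renderToFile (TRANS("Render Clip"), f, edit,
                                          clipPos.time, tracksToDo,
                                          usePlugins, clips, false);

// Previously: edit.getTransport().restartAllTransports (engine, true);
edit.getTransport().ensureContextAllocated (false);
```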

Does that solve all your problems?


Aha, the problem seems to be with the plugin! When I use 4OSC everything works. However, the plugin I aim to use (https://github.com/mtytel/helm) gives the same issue as before. Can rendering be done with any synth, or only specific ones? If so, why? I assumed all VSTs would conform to some standard of functionality.

Are you using the VST3 version of that plugin?
I think there’s a JUCE bug that removes MIDI input after releaseResources is called.
We added a workaround for this in tracktion_engine recently, though.