Handling MIDI output in a standalone app

Hi!

I’ve been building an app that converts audio to MIDI using an FFT. I use the performFrequencyOnlyForwardTransform() method and evaluate the frequency bins and amplitudes to generate MIDI messages (I have a method that determines the fundamental frequency from the FFT output). Until now I’ve just created note-ons and logged them to the screen for debugging, and both the FFT and the MIDI messages have been working fine. Both the FFT and the MIDI generation were contained in a timer callback running at 120 Hz.
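
For reference, the fundamental detection is roughly along these lines (a simplified sketch; fftData, fftSize and sampleRate stand in for the real members, and a proper detector would handle harmonics and noise better):

    // Naive fundamental estimate: pick the loudest bin of the
    // performFrequencyOnlyForwardTransform() output and convert it to a MIDI note.
    int estimateMidiNote (const float* fftData, int fftSize, double sampleRate)
    {
        int peakBin = 1;

        for (int bin = 2; bin < fftSize / 2; ++bin)
            if (fftData[bin] > fftData[peakBin])
                peakBin = bin;

        // Convert the bin index to a frequency in Hz, then to the nearest MIDI note.
        auto frequency = (double) peakBin * sampleRate / (double) fftSize;
        auto midiNote  = juce::roundToInt (69.0 + 12.0 * std::log2 (frequency / 440.0));

        return juce::jlimit (0, 127, midiNote);
    }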

Now I want to start actually outputting MIDI to my DAW. I have set up virtual “MIDI cabling” with LoopBe1 to route into my DAW. And yes, I know it would have been easier to make a plug-in, but I wanted to create a virtual MIDI port. MIDI messages are put into a buffer, and I have a system for creating note-on/note-off pairs (roughly as sketched below). I’ve used this thread as a guide and have put both the FFT analysis and the MIDI output in the audio thread (following that advice).
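
The note-on/note-off pairing works roughly like this (simplified sketch; the channel, the velocity handling and the member names are placeholders for what I actually use):

    struct NoteTracker
    {
        int currentNote = -1;            // -1 means no note is currently sounding
        juce::MidiBuffer midiBuffer;     // drained by the audio callback

        void pushNote (int newNote, float velocity, int samplePosition)
        {
            if (newNote == currentNote)
                return;

            // Close the previous note before starting the new one.
            if (currentNote >= 0)
                midiBuffer.addEvent (juce::MidiMessage::noteOff (1, currentNote), samplePosition);

            if (newNote >= 0)
                midiBuffer.addEvent (juce::MidiMessage::noteOn (1, newNote, velocity), samplePosition);

            currentNote = newNote;
        }
    };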

The MIDI part of my getNextAudioBlock():

    midiOut = deviceManager.getDefaultMidiOutput();

    if (midiOut != nullptr)
    {
        midiOut->startBackgroundThread();

        midiOut->sendBlockOfMessages (midiBuffer,
                                      juce::Time::getMillisecondCounter(),
                                      deviceManager.getAudioDeviceSetup().sampleRate);
        midiBuffer.clear();
    }

My problem is that I have severe performance issues, and I strongly suspect it has something to do with putting a lot of pressure on the audio thread. However, I don’t know how I should approach this. My MIDI does sometimes go through and produces the correct note in my DAW, but it is really slow. So I need some tips on how best to handle this kind of MIDI output. Do I set a “buffer size” and empty my buffer when it reaches a certain size? Do I run a separate thread for MIDI, and would that be implemented with a timer? How would that keep up with the real-time speed I need for a MIDI device?

Do you need to be outputting MIDI directly to a device?

If you need tight and correct timing, you should be using the MIDI buffer in the processBlock callback. That way you have control over the timing of MIDI events down to an individual audio sample.
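
Something along these lines, inside an AudioProcessor (detectedNote and sampleOffsetWithinBlock stand in for whatever your analysis produces):

    void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer& midiMessages) override
    {
        // Run the FFT / pitch detection on `buffer` here...

        // Then add the generated events at the sample offset within this block
        // where they should occur; the host delivers them with sample accuracy.
        midiMessages.addEvent (juce::MidiMessage::noteOn (1, detectedNote, (juce::uint8) 100),
                               sampleOffsetWithinBlock);
    }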

Well, I’m trying to make an audio input behave like a MIDI instrument, so I need real-time MIDI output that can be recorded by a DAW.

I don’t think I’m able to use the processBlock callback. Isn’t that part of a JUCE plug-in application, rather than a standalone audio application? From my understanding, the getNextAudioBlock callback runs on the audio thread, but I’m not sure.

processBlock is for AudioProcessors (either in a plug-in or in a standalone app that uses AudioProcessors internally). In your case (standalone), getNextAudioBlock does exactly the same thing, so all the audio processing happens there.
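
For illustration (not necessarily the full fix), one way that could look in an AudioAppComponent is to fetch the MidiOutput and start its background thread once in prepareToPlay, keeping only the actual send in the audio callback:

    void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override
    {
        // Do the device lookup and thread start once, not per audio block.
        midiOut = deviceManager.getDefaultMidiOutput();

        if (midiOut != nullptr)
            midiOut->startBackgroundThread();
    }

    void getNextAudioBlock (const juce::AudioSourceChannelInfo& bufferToFill) override
    {
        // ...FFT analysis and MIDI generation into midiBuffer...

        if (midiOut != nullptr)
        {
            midiOut->sendBlockOfMessages (midiBuffer,
                                          juce::Time::getMillisecondCounter(),
                                          deviceManager.getAudioDeviceSetup().sampleRate);
            midiBuffer.clear();
        }
    }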
