Some general how-to questions

If these have already been answered, perhaps someone could just send me links to the appropriate entries, but I’ve had very little luck finding the answers with search.

  1. I’m not sure I understand how I’m supposed to use things like numSamples, samplesPerSecondForBuffer, and so forth in the context of MIDI events that already have associated millisecondCounter timestamp values.

  2. Why does the MIDIOutput class have “getVolume” and “setVolume” methods with floating point stereo values? Do these somehow get translated into CC7 and CC10 (volume/pan respectively) events?

  3. Is there a class somewhere in the API that implements a priority heap for MIDI events such that I can insert future events and then always be able to get at the next event that needs to be sent out? I thought I had seen such a thing but I can’t find it now and I don’t feel like reimplementing it if I don’t have to. (Does anyone know whether the standard STL priority heap is sufficiently efficient for real-time use?)

  4. I have noticed that even though I am not doing any audio processing, the level meter on the channel strip where I have the plugin instantiated often indicates that some audio is coming through. Is there something I should be doing in my processBlock callback to make sure that zero audio comes through?

Thanks in advance

I’m a rather new user of JUCE myself, but I think I can answer your questions. In a plugin, the MidiBuffer object is used for MIDI input/output. You get a MidiBuffer as a parameter of your processBlock method, which contains all incoming MIDI events. All events that are in the MidiBuffer after your processBlock method returns are sent out. MidiBuffer uses sample offsets (from the beginning of the current block) instead of the timestamp values of the individual MidiMessages, which are ignored. So you will have to convert your “musical” timestamps to sample offsets. The getCurrentPosition() method will give you the information you need for this (sample rate, ppqPosition, bpm, etc.).

  2. To be honest, I don’t know what those methods do, but I’m pretty sure you don’t need a MIDIOutput object for your plugin. Just add all the MIDI events you want your plugin to output to the MidiBuffer. If you want to send those events out of a particular hardware output, do that routing inside your host.

  3. Maybe MidiMessageSequence will help you? The STL should be no problem either, but I’m no expert at C++.

  4. buffer.clear()

Have a look at the plugin example and, of course, the JUCE demo. They are a great help.

First, thank you for your responses; I appreciate them.

  1. That makes total sense — thanks for the explanation. Part of the confusion for me is that I am not actually doing ANY audio processing, so it isn’t clear to me exactly what sample offsets I’m actually getting.

  2. The problem here, as I understand it, is that Mac Audio Units don’t directly support MIDI output. So if I want an audio plugin to respond to processed MIDI, I have to use two channel strips: one containing my MIDI plugin, which sends my processed MIDI (e.g., I would play a single note and the MIDI processor might generate a chord from it) through the MidiOutput to a virtual MIDI port, and a second channel strip that receives MIDI from that virtual port.

  3. I was able to use the STL priority_queue, but I don’t think it’s a good permanent solution, at least with the default std::vector as the underlying container, because memory is constantly being allocated and freed. I have to look at how the memory allocation works, but I’m seeing huge spikes in CPU as well as dropouts, and that shouldn’t happen since I’m doing VERY basic MIDI processing.

  4. Yeah, I’m already doing that buffer.clear().

By the way, I derived my project from that demo audio plugin.

What’s wrong with a MidiBuffer for holding your Midi message list? That’s what it’s designed for, after all…

  1. The CurrentPositionInfo struct, which is filled in by the getPlayHead()->getCurrentPosition() method, has a member called ppqPosition. I find the name and the first sentence of its description in the documentation a little misleading, as PPQ (pulses per quarter) normally stands for the “resolution” of a sequencer. But in fact this number tells you how many quarter notes have passed in the song up to the beginning of the current block. All sample offsets relate to that point.

  2. Yes, MIDI plugins and Audio Units are a problem. I’m working on a little sequencer plugin myself at the moment. There must be a (maybe rather new) way to output MIDI events from an AU: Numerology does exactly that in its latest version, at least for some hosts (Logic, Live…), but I have no idea how it works or whether it is (or will be) possible with JUCE. The problem with using MidiOutput is that the host usually controls the inputs/outputs, so I’m not sure your plan will work. Is there a special reason you are doing an AU and not a VST plugin? Do you need a plugin that works with Logic? With VST, MIDI output is no problem.

When I looked at that class, it looked like the only thing I could really do was insert events and leave them to be played later. But I need to be able to schedule events that might actually go away before they get played, or that need further processing that won’t be known until it’s actually time for a particular event to be played.

The priority_queue isn’t actually sufficient either, but remember that I am just trying to get myself up to speed here and understand the basic architecture and what is possible.

Not Logic, actually, but MainStage, which also only supports AU.

I don’t think this is true… this is the playhead position given by the host. It has nothing to do with the current block.