How to time shift incoming midi notes?

if you’re working with live midi input, then respacing it into the past is impossible. The best you can do is:

  • make the plugin always report, for example, 20 ms of latency – meaning that every note you play comes out of the plugin 20 ms later, because 20 ms later is the plugin’s “now”
  • to respace “into the future”, delay a note by only 5 or 10 ms instead of the full 20.

Then you can say “wow, look, the note is in the future!” – in relation to the plugin’s sense of “now”.

But to the human performer, the note still sounds after you touched it

It is not physically possible for a plugin to make a note sound before the performer has played it.
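the bookkeeping above can be written out as a tiny worked example (the numbers are the ones from this thread; the function names are made up for illustration):

```cpp
#include <cassert>

// Where a note lands relative to the plugin's *delayed* "now".
// reportedLatencyMs: what the plugin tells the host (e.g. 20)
// appliedDelayMs:    how long the note is actually held back (e.g. 5)
int aheadOfPluginNowMs(int reportedLatencyMs, int appliedDelayMs) {
    return reportedLatencyMs - appliedDelayMs; // 20 - 5 = 15 ms "in the future"
}

// Relative to the performer, the note is always late by the applied
// delay; this number can never go negative, no matter what you report.
int afterPerformerMs(int appliedDelayMs) {
    return appliedDelayMs;
}
```

so with 20 ms of reported latency and a 5 ms hold, the note sits 15 ms ahead of the plugin's “now” – and still 5 ms behind the performer's finger.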


yes, just like i said: not physically possible when recording and monitoring live. but seriously, most plugins just have lookahead features, and some don’t even let the user turn them off anymore (like soothe2), because in almost every context where lookahead is possible it should also be turned on. it almost always makes things better.
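for context, lookahead is exactly the trick from the bullet points above: report latency, delay the audio path, and let the analysis see the samples that are still queued up. a minimal sketch (a crude gate, definitely not soothe2's algorithm – the class name and the all-or-nothing gain are inventions for illustration):

```cpp
#include <vector>
#include <cstddef>
#include <cmath>
#include <algorithm>

// The audio path is delayed by `lookahead` samples, while the gain
// decision also sees the *undelayed* input. So a gain reduction can land
// on (or even before) the transient that triggered it.
struct LookaheadGate {
    std::vector<float> delayLine;
    std::size_t writePos = 0;
    float threshold;

    LookaheadGate(std::size_t lookahead, float thresh)
        : delayLine(lookahead, 0.0f), threshold(thresh) {}

    float processSample(float in) {
        float delayed = delayLine[writePos]; // the sample from `lookahead` samples ago
        delayLine[writePos] = in;
        writePos = (writePos + 1) % delayLine.size();

        // The "lookahead": the gain decision sees the sample that is about
        // to be emitted *and* everything still queued up behind it.
        float peak = std::max(std::abs(in), std::abs(delayed));
        for (float v : delayLine)
            peak = std::max(peak, std::abs(v));
        float gain = peak > threshold ? 0.0f : 1.0f;

        return delayed * gain;
    }
};
```

the interesting effect: when a loud sample enters the delay line, the output ducks several samples *before* that loud sample actually comes out – which is the "acting in the future" feeling lookahead gives you, paid for with latency.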

but i still think that is not what sa1 should focus on now. i’d do this:

  1. implement an audio delay (feed forward)
  2. implement a midi delay (monophonic)
  3. implement a midi delay (polyphonic)
  4. do whatever you want with that

because that is the sequence of difficulty to get to the final project

I forgot to say: the VST is not meant for live input. Its goal is to quantize already recorded or programmed MIDI data.

will try this strategy tomorrow. not sure how to achieve this. mostly trial and error…

here, take this circular buffer tutorial. circular buffer = ring buffer. in this video josh explicitly implements it as a feed-forward audio delay. his version is quite optimized, but it shows one way it can be done with audio. and remember: audio is easier than midi. you could even ignore stereo inputs at first to make it easier – a good starting point.


Regarding ARA: Would this be the way? GitHub - Celemony/JUCE_ARA: The JUCE cross-platform C++ framework, augmented with support for the Celemony ARA API

This currently cracks my brain a little bit. I just realized that processBlock does not seem to be a linear event loop; instead it seems to be some kind of conditional callback.

I tried to implement a prototype of some kind of raw grid step detection inside it.

    bool updateCurrentTimeInfoFromHost(AudioPlayHead::CurrentPositionInfo& posInfo)
    {
        if (auto* ph = getPlayHead())
        {
            AudioPlayHead::CurrentPositionInfo newTime;

            if (ph->getCurrentPosition(newTime))
            {
                posInfo = newTime;  // Successfully got the current time from the host.
                return true;
            }
        }

        // If the host fails to provide the current time, we'll just reset our copy to a default.
        currentPositionInfo.resetToDefault();

        return false;
    }

    template <typename Element>
    void process (AudioBuffer<Element>& audio, MidiBuffer& midi)
    {
        audio.clear();
        myMidiBuffer.clear();
        saMidiBuffer.clear();
 

        if (this->currentPositionInfo.isPlaying) {
            if (this->currentPositionInfo.timeInSamples % 500 == 0) {
                this->logMidiEvent();
                midi.swapWith(myMidiBuffer);
            }
        }
        this->updateCurrentTimeInfoFromHost(currentPositionInfo);

This works for the first HALF :rage: of the loop as I expected, but ends with a big surprise :clown_face:

btw-ticks

If I change the loop inside my DAW, NOTHING :nerd_face: happens anymore. So I don’t know if I will ever manage this. Pretty upset right now.

From what your video shows I need to invest much more math here, but I wonder if I am missing a very simple linear event handler triggering on the DAW’s transport change/progress. It feels like processBlock is not the right place for this. From what I have learned so far, I cannot influence the midi event timing via a property. All that has worked so far is to create a midi event at the real time, add it to the midi buffer and do a swap. :thinking:
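one possible culprit (a guess, not something stated in this thread): `timeInSamples % 500 == 0` only fires when a block happens to *start* exactly on a multiple of 500; once the loop boundaries shift, the condition may never be true again. a more robust sketch checks whether any grid point falls anywhere inside the block's sample range (`gridHitsInBlock` is a made-up helper name, plain C++ rather than JUCE, and it assumes a non-negative block start):

```cpp
#include <vector>
#include <cstdint>

// Instead of testing `timeInSamples % step == 0`, collect every grid
// point that falls anywhere inside [blockStart, blockStart + numSamples).
// Each returned value is the hit's offset into the block, which is also
// the sample position you would use when adding an event to the MidiBuffer.
std::vector<int> gridHitsInBlock(std::int64_t blockStart, int numSamples, int step) {
    std::vector<int> offsets;
    // first multiple of `step` at or after blockStart (assumes blockStart >= 0)
    std::int64_t first = ((blockStart + step - 1) / step) * step;
    for (std::int64_t p = first; p < blockStart + numSamples; p += step)
        offsets.push_back(static_cast<int>(p - blockStart));
    return offsets;
}
```

called once per processBlock with the host's `timeInSamples` and the block size, this fires on every grid step regardless of where the block boundaries land.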

forget the idea of “changing a property”. every time you want to work with the audio or midi data the DAW gives to the plugin, you do that in processBlock. and every time you want to make a delay, you need to implement another buffer whose purpose is to remember past data.

// simplified code for an audio delay:

// this could be std::array, std::vector, juce::Array.. doesn't matter
// make sure that the size of this thing is >= the max length of the delay in samples
std::vector<float> ringBuffer;

// used to write to and read from the ringbuffer. they must always
// stay in bounds, with their distance equal to the delay
int writeHead = 0, readHead = 0;

// a method for setting a delay in samples
void setDelay(int d) {
    readHead = writeHead - d;
    while(readHead < 0)
        readHead += ringBuffer.size();
}

// in processBlock (considering mono input)
auto samples = buffer.getWritePointer(0);
for(int s = 0; s < buffer.getNumSamples(); ++s) {
    // the wrap-around (modulo) is what makes this a "ring" buffer
    writeHead = (writeHead + 1) % ringBuffer.size();
    readHead  = (readHead + 1) % ringBuffer.size();

    ringBuffer[writeHead] = samples[s];
    samples[s] = ringBuffer[readHead];
}

// if you want to hear how the delay sounds mixed with the dry signal
// just put a += instead of = in the last line of the sample loop

this code is the least optimized way to write a ringbuffer i can think of, but it should show clearly how it works. and really: try this with audio before going on with your midi project. midi is a bit more complicated because of its polyphony and stuff, so it would be smart to start simple.
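to make the midi step from the list further up concrete, here is one possible shape for a fixed midi delay (a sketch, not the video's code – `MidiDelay` and the byte vector standing in for `juce::MidiMessage` are inventions for illustration): stamp each incoming event with arrival time + delay, queue it, and release it in the block whose sample range contains its due time.

```cpp
#include <cstdint>
#include <deque>
#include <utility>
#include <vector>
#include <algorithm>

// Every incoming MIDI event is stamped with a "due" time = arrival + delay,
// pushed into a FIFO, and released once the playhead reaches that due time.
// Works for mono and poly alike as long as the delay stays fixed.
struct MidiDelay {
    std::int64_t delaySamples;
    std::deque<std::pair<std::int64_t, std::vector<std::uint8_t>>> pending;

    void push(std::int64_t arrivalSample, std::vector<std::uint8_t> msg) {
        pending.emplace_back(arrivalSample + delaySamples, std::move(msg));
    }

    // Pop every event due inside [blockStart, blockStart + numSamples).
    // The int in each result is the offset inside the block, i.e. the
    // samplePosition you would pass to MidiBuffer::addEvent.
    std::vector<std::pair<std::vector<std::uint8_t>, int>>
    pull(std::int64_t blockStart, int numSamples) {
        std::vector<std::pair<std::vector<std::uint8_t>, int>> out;
        while (!pending.empty()
               && pending.front().first < blockStart + numSamples) {
            auto& [due, msg] = pending.front();
            out.emplace_back(std::move(msg),
                static_cast<int>(std::max<std::int64_t>(due - blockStart, 0)));
            pending.pop_front();
        }
        return out;
    }
};
```

in processBlock you would `push` everything from the incoming MidiBuffer, clear it, then `pull` for the current block range and add the released events back – which is exactly the "create the event, add it to the midi buffer and swap" pattern mentioned above, just shifted in time.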