How to create global pointer to AudioBuffer

My knowledge of pointers is lacking. I am in the middle of making a synth plugin and need to get the sample data from the AudioBuffer into a custom visual scope.

In PluginProcessor.cpp there is:

void MySynthAudioProcessor::processBlock (AudioBuffer<float>& buffer, MidiBuffer& midiMessages)

and inside there I know I can loop through the samples using:

for (int sample = 0; sample < buffer.getNumSamples(); sample++)
{
    outputWaveLeft[outputPlot]  = buffer.getSample (0, sample);
    outputWaveRight[outputPlot] = buffer.getSample (1, sample);

    outputPlot++;
    if (outputPlot >= scopeSize) outputPlot = 0;
}

The scope function is in a JUCE component, and is called from PluginEditor via a Timer 30 times a second.

So my question is: how do I access the AudioBuffer data in another JUCE component? I think I have to make a global pointer, or at least one that is accessible both in my Scope component and in PluginProcessor, but I don’t know how.

Sounds tempting, but don’t ever do that!
a) the referenced buffer is owned by the processor, which may move, dispose of, or recreate it in the background without your knowledge
b) the buffer would then be accessed by two threads, which is undefined behaviour unless you add a lock, and locking is not OK on the audio thread.

Usually you have an extra buffer, allocated in prepareToPlay, that acts as a FIFO: the audio thread can always push its samples into it, and the display thread fetches them for displaying.
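That idea can be sketched as a single-producer/single-consumer ring buffer built on atomic indices. The names below (`ScopeFifo` etc.) are illustrative, not JUCE API; in an actual plugin, `juce::AbstractFifo` does the same index bookkeeping for you:

```cpp
#include <array>
#include <atomic>
#include <cstddef>

// Illustrative SPSC FIFO: the audio thread pushes, the GUI thread pops.
// One slot is kept empty to distinguish "full" from "empty".
template <size_t Capacity>
class ScopeFifo
{
public:
    bool push (float sample)                     // audio thread only
    {
        const auto w = writeIndex.load (std::memory_order_relaxed);
        const auto next = (w + 1) % Capacity;
        if (next == readIndex.load (std::memory_order_acquire))
            return false;                        // full: drop the sample, never block
        buffer[w] = sample;
        writeIndex.store (next, std::memory_order_release);
        return true;
    }

    bool pop (float& sample)                     // GUI thread only
    {
        const auto r = readIndex.load (std::memory_order_relaxed);
        if (r == writeIndex.load (std::memory_order_acquire))
            return false;                        // empty
        sample = buffer[r];
        readIndex.store ((r + 1) % Capacity, std::memory_order_release);
        return true;
    }

private:
    std::array<float, Capacity> buffer {};
    std::atomic<size_t> writeIndex { 0 }, readIndex { 0 };
};
```

The key property is that neither side ever waits: the audio thread drops samples when the FIFO is full, and the GUI thread simply paints whatever has arrived by the time its timer fires.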


All right, thanks. What I have right now, the above code (which is actually inside PluginProcessor’s processBlock), works fine.

The only problem is that the sample data is usually copied to my plot array too slowly or too often, due to the differing sizes of the audio buffer and how often my Timer is called. I guess I can just use a test flag in the above code, to check whether it is time to copy samples when paint() is about to do its work in my component.

The problem arises if you are writing into outputWaveBuffer while the component is reading from it. It will probably just be a little visual glitch, but it is undefined behaviour and can bite you in the backside…

Watch this talk: Fabian and Dave: Realtime 101 (ADC 2019)

Which is why with a flag, I will copy it first, and then read it to display.

So this is how I am getting the buffer data to my scope component.

In PluginProcessor’s processBlock I have:

mutineer.renderNextBlock (buffer, midiMessages, 0, buffer.getNumSamples());

if (readSampleData & 1)
{
    for (int sample = 0; sample < buffer.getNumSamples(); sample++)
    {
        outputWaveLeft[outputPlot]  = buffer.getSample (0, sample);
        outputWaveRight[outputPlot] = buffer.getSample (1, sample);

        outputPlot++;
        if (outputPlot >= wtSize) outputPlot = 0;
    }
    readSampleData = 2;
}

And in the scope component, called by a timer:

readSampleData = 1;
while (readSampleData == 1); // Wait for data to become available
readSampleData = 0;
drawOutput (g, windowWidth, windowHeight, plot, plotWidth,
            plotHeight, plotY, plotStep);

Surely it is not the most elegant way to do it, but hey, it works; just check the below video of the synth I am working on.


That’s not the greatest idea, you should not really wait for anything in the GUI thread. You may get away with it if the wait times are “short” but can you guarantee that is always going to be the case? Remember also that your plugin is sharing the GUI thread with the host application and all other plugins running in it. If your plugin spins around waiting, you are stealing CPU time from the host and the other plugins.


Thank you very much for pointing that out. I must have had a brain fart, as I can’t remember ever having done this before :slight_smile: I have now reworked my code as follows, with much decreased CPU usage; and now, if no sound is played, CPU usage is virtually nil.

In the PluginProcessor’s processBlock I have this (note the change: I now check the buffer’s magnitude):

mutineer.renderNextBlock (buffer, midiMessages, 0, buffer.getNumSamples ());

	if ((readSampleData & 1) && buffer.getMagnitude (0, buffer.getNumSamples()) > 0.0f)
	{
		for (int sample = 0; sample < buffer.getNumSamples (); sample++)
		{
			outputWaveLeft[outputPlot] = buffer.getSample (0, sample);
			outputWaveRight[outputPlot] = buffer.getSample (1, sample);

			outputPlot++;
			if (outputPlot == wtSize)
			{
				outputPlot = 0;
				readSampleData = 2;
				break;
			}
		}
	}

and in my plot component I changed the HighResolutionTimer to a regular Timer and did this:

void WaveformOutput::timerCallback ()
{
	if (readSampleData == 0)
		readSampleData = 1;
	else if (readSampleData == 2)
		repaint ();
}

void WaveformOutput::paint (Graphics& g)
{
	readSampleData = 0;
	drawOutput (g, windowWidth, windowHeight, plot, plotWidth, plotHeight, plotY, plotStep);
}

Of course, with that solution I had to double the timer speed to keep paint() updating at the same rate as before.
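One minimal hardening of that handshake, assuming `readSampleData` is currently a plain int: making it a `std::atomic<int>` at least removes the data race on the flag itself (the sample arrays can still tear while being copied, as the talk explains). A sketch of the same 0/1/2 protocol, not the thread's actual code:

```cpp
#include <atomic>

// Hypothetical shared flag mirroring the protocol above:
// 0 = idle, 1 = capture requested, 2 = capture complete.
std::atomic<int> readSampleData { 0 };

// Audio-thread side of the handshake (called from processBlock).
void audioThreadCapture()
{
    if (readSampleData.load (std::memory_order_acquire) == 1)
    {
        // ... copy buffer samples into the plot arrays here ...
        readSampleData.store (2, std::memory_order_release);
    }
}

// GUI-thread side (called from timerCallback). Returns true when it is
// time to repaint(); paint() then resets the flag back to 0.
bool guiThreadTick()
{
    const int state = readSampleData.load (std::memory_order_acquire);
    if (state == 0)
        readSampleData.store (1, std::memory_order_release);
    return state == 2;
}
```

The acquire/release pairing is what guarantees the sample copies performed before the `store (2)` are visible to the GUI thread once it observes the 2; a plain int gives no such guarantee.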

I know I’ve been promoting this a lot recently, but you should check out Fabian’s and my talk from ADC, “Real-time 101”. It goes through all these sorts of problems and gives appropriate solutions:
Part 1: https://youtu.be/Q0vrQFyAdWI
Part 2: https://youtu.be/PoZAo2Vikbo


Problem is, there are only 24 hours in a day! Now if only there were 48 :slight_smile:

Well, I can pretty much guarantee that watching these 90 minutes will be one of the most effective uses of those 24 hours, and could save a lot of time in the future :wink:


Got it on my list :slight_smile:

OK, I watched both videos, very interesting stuff. But as you can probably imagine, I have absolutely no experience with this stuff (atomics, mutexes, waits, etc.), so if you or anyone else could offer a quick example to get me started on how I could better do the above, getting plot data from the audio buffer, I would sure appreciate it. Thanks!

The talk specifically mentions this use case. You probably want to use farbot’s RealtimeMutatable, or perhaps a farbot FIFO if you want to do the processing on the message thread.
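The core trick behind such a class is double buffering: the writer fills the copy that is not currently published, then publishes it with a single atomic swap. A minimal sketch of that idea, with hypothetical names, not farbot's actual API:

```cpp
#include <array>
#include <atomic>

// Illustrative frame of scope data (not farbot's API).
struct ScopeFrame { std::array<float, 512> samples {}; };

// Double buffer: the audio thread writes into whichever copy is not
// published, then flips the published index atomically.
class DoubleBuffer
{
public:
    ScopeFrame& writable()  // audio thread: the non-published copy
    {
        return frames[1 - published.load (std::memory_order_relaxed)];
    }

    void publish()          // audio thread: make the fresh frame visible
    {
        published.store (1 - published.load (std::memory_order_relaxed),
                         std::memory_order_release);
    }

    const ScopeFrame& read() const  // GUI thread: always a complete frame
    {
        return frames[published.load (std::memory_order_acquire)];
    }

private:
    ScopeFrame frames[2];
    std::atomic<int> published { 0 };
};
```

Note this naive version still races if the reader holds its reference across a publish; handling that (and the re-publish while reading case) is exactly the part a library like farbot takes care of for you.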

Thanks for that; however, I fail to understand why my code above is not good enough, as I only collect the data after renderNextBlock is done, or rather in between calls. That data should then be totally independent, as it is used in my other component via the Timer and paint(), and paint() is only called once the data has been collected.

I tried to use RealtimeMutatable the other day and ran into some issues, which I logged on the farbot repo. Is the library still in active development? In its current state, the RealtimeMutatable class doesn’t seem to be usable.