Trying to use SpectrogramComponent from the examples for a plugin

In the provided SimpleFFTExample there is just a main window that creates a SpectrogramComponent object in its constructor, so there is no need for something like spectrogramComponent->start().
First I thought: alright, if this is how components work, so that their paint(), getNextAudioBlock() etc. get called automatically after the component's construction, then let's just try it. But there seems to be more to it, and it might have something to do with deriving from AudioAppComponent. I want to code a plugin, not a standalone app.

In a different topic I found something Jules said about it:

Not 100% sure what you’re asking… But if you’re asking how to take an AudioAppComponent and make it run a plugin, you probably want to use the AudioProcessorPlayer class, which is designed for that kind of purpose.

But I think I have to make the SpectrogramComponent derive from the AudioProcessor class instead of the AudioProcessorPlayer. But that's not all, since I need to override paint() as well, which is not part of the AudioProcessor class.
I'm not really sure how to approach this, whether that class is the right one, and which one I need for the paint() override… can anyone help me out?

post your code

The main difference here is that AudioAppComponent calls getNextAudioBlock() for DSP work, while AudioProcessor calls processBlock().
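For reference, the two callbacks look like this (signatures as in the current Juce API, bodies just placeholders):

// Standalone app: AudioAppComponent (via AudioSource) delivers audio here.
void getNextAudioBlock (const AudioSourceChannelInfo& bufferToFill) override
{
    // bufferToFill wraps an AudioBuffer plus a start sample and sample count
}

// Plugin: AudioProcessor delivers audio here instead.
void processBlock (AudioBuffer<float>& buffer, MidiBuffer& midiMessages) override
{
    // buffer holds the channel data directly; note there is no startSample offset
}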

The other thing to note is that in AudioAppComponent the lifetime of the GUI and the DSP is coupled. In a plugin, the processor must be able to fully function without the GUI being present: the host must be able to construct and destroy the GUI (the AudioProcessorEditor) at any time, without any effect on the processor.

HTH

@matkatmusic:

class SpectrogramComponent : public AudioProcessor,
                             public Timer
{

public:
    SpectrogramComponent() : forwardFFT (fftOrder, false),
    spectrogramImage (Image::RGB, 512, 512, true),
    fifoIndex (0),
    nextFFTBlockReady (false)
    {
        setOpaque (true);
        setAudioChannels (2, 0); 
        startTimerHz (60);
        setSize (700, 500);
    }
    ~SpectrogramComponent();
    
    void processBlock (const AudioBuffer<float> &bufferToFill, MidiBuffer &midiMessages) override
    {
        if (bufferToFill.buffer->getNumChannels() > 0)
        {
            const float* channelData = bufferToFill.buffer->getWritePointer (0, bufferToFill.startSample);

            for (int i = 0; i < bufferToFill.numSamples; ++i)
                pushNextSampleIntoFifo (channelData[i]);
        }
    }
    
    void pushNextSampleIntoFifo (float sample) noexcept
    {
        if (fifoIndex == fftSize)
        {
            if (! nextFFTBlockReady)
            {
                zeromem (fftData, sizeof (fftData));
                memcpy (fftData, fifo, sizeof (fifo));
                nextFFTBlockReady = true;
            }
            
            fifoIndex = 0;
        }
        
        fifo[fifoIndex++] = sample;
    }
    
    
    void drawNextLineOfSpectrogram()
    {
        const int rightHandEdge = spectrogramImage.getWidth() - 1;
        const int imageHeight = spectrogramImage.getHeight();
        spectrogramImage.moveImageSection (0, 0, 1, 0, rightHandEdge, imageHeight);
        forwardFFT.performFrequencyOnlyForwardTransform (fftData);
        Range<float> maxLevel = FloatVectorOperations::findMinAndMax (fftData, fftSize / 2);
        
        for (int y = 1; y < imageHeight; ++y)
        {
            const float skewedProportionY = 1.0f - std::exp (std::log (y / (float) imageHeight) * 0.2f);
            const int fftDataIndex = jlimit (0, fftSize / 2, (int) (skewedProportionY * fftSize / 2));
            const float level = jmap (fftData[fftDataIndex], 0.0f, maxLevel.getEnd(), 0.0f, 1.0f);
            
            spectrogramImage.setPixelAt (rightHandEdge, y, Colour::fromHSV (level, 1.0f, level, 1.0f));
        }
    }
    
    void paint (Graphics& g) override
    {
        g.fillAll (Colours::black);
        g.drawImage (spectrogramImage, getLocalBounds().toFloat());
    }
    
    void timerCallback() override
    {
        if (nextFFTBlockReady)
        {
            drawNextLineOfSpectrogram();
            nextFFTBlockReady = false;
            repaint();
        }
    }
    
    enum
    {
        fftOrder = 10,
        fftSize  = 1 << fftOrder
    };
    
private:
    FFT forwardFFT;
    Image spectrogramImage;
    
    float fifo [fftSize];
    float fftData [2 * fftSize];
    int fifoIndex;
    bool nextFFTBlockReady;
    
    JUCE_DECLARE_NON_COPYABLE_WITH_LEAK_DETECTOR (SpectrogramComponent)
    
};

Of course, without deriving from the proper classes, there is no paint() to override and no getLocalBounds() to call.

That code doesn't really make much sense. You shouldn't be doing GUI stuff in your AudioProcessor class. (AudioProcessorEditor is the class that should be used for implementing the GUI in plugins.) Overriding the paint method with the override keyword shouldn't even work; does that code really compile?

No, it doesn't. I just edited my post a few seconds after posting it:

Of course, without deriving from the proper classes, there is no paint() to override and no getLocalBounds() to call.

I just don't know how to change the class to do the same things in a plugin.

You appear to be a bit confused about how Juce-based plugins should be structured. You need to implement two classes:

- An AudioProcessor subclass, which implements your audio processing and should preferably have almost nothing to do with the GUI (because plugins have to be able to run without a GUI).
- An AudioProcessorEditor subclass, which implements the GUI for your plugin. AudioProcessorEditor itself inherits from Juce's Component class, so the paint, bounds etc. related methods are available in it.

Because the AudioAppComponent is for a standalone application and not for a plugin, it implements both the audio and GUI parts in the same class, but you must not use that design directly for a plugin.
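To make that structure concrete, here is a minimal sketch (the class names are made up for illustration, most of the AudioProcessor pure virtuals are omitted, and JuceHeader.h is assumed to be included):

// DSP side: must be fully functional with no GUI present.
class MyProcessor : public AudioProcessor
{
public:
    void processBlock (AudioBuffer<float>& buffer, MidiBuffer& midi) override
    {
        // audio processing only, no GUI calls in here
    }

    bool hasEditor() const override { return true; }
    AudioProcessorEditor* createEditor() override; // the host calls this on demand

    // ... the remaining AudioProcessor overrides ...
};

// GUI side: the host may construct and destroy this at any time.
class MyEditor : public AudioProcessorEditor
{
public:
    MyEditor (MyProcessor& p) : AudioProcessorEditor (&p)
    {
        setSize (700, 500); // sizing, painting and bounds all live here
    }

    void paint (Graphics& g) override
    {
        g.fillAll (Colours::black);
    }
};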

OK, I now looked at the SimpleFFTExample code and it appears the SpectrogramComponent isn’t really reusable directly for a plugin. It would need to be slightly rewritten to work in a plugin context.

I wrote a very rough plugin implementation with the changed SpectroGramComponent:

https://pastebin.com/fupT9hyp

Code for PluginProcessor.h is not included in the pastebin because no changes were required to the Projucer-generated header file in this revision.

Hey Xenakios,

thank you very much for that code! :slight_smile:

You know, I was actually looking for documentation that explains the structural differences between standalone Juce apps and plugins. Your code and daniel's explanation are making things clearer to me.

Things that surprised me were:

addAndMakeVisible(&m_SpectroGramComp)

So Juce will call SpectrogramComponent::paint() only when I tell it to add the component and make it visible. Okay, easy.
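If I understand the pastebin correctly, that call sits in the editor roughly like this (constructor signature guessed from the class names, the exact code is in the pastebin):

SpectrogramPluginAudioProcessorEditor::SpectrogramPluginAudioProcessorEditor (SpectrogramPluginAudioProcessor& p)
    : AudioProcessorEditor (&p)
{
    // Registering the child is what makes Juce start calling its paint().
    addAndMakeVisible (&m_SpectroGramComp);
    setSize (700, 500);
}

void SpectrogramPluginAudioProcessorEditor::resized()
{
    // The child also needs bounds, or it will never show anything.
    m_SpectroGramComp.setBounds (getLocalBounds());
}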

and

auto myeditor = dynamic_cast<SpectrogramPluginAudioProcessorEditor*>(getActiveEditor());
if (myeditor != nullptr)
{
	myeditor->m_SpectroGramComp.processAudioBlock(buffer);
}

With getActiveEditor() it looks like we get the pointer to that editor object out of thin air.

So this code tells me that every time the user requests the UI, Juce calls createEditor() and constructs an AudioProcessorEditor, which in this case constructs a SpectrogramComponent. The function getActiveEditor() passes me a pointer to the editor, from which I access the component's processing function as long as the editor is alive.
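If I read it right, the createEditor() override in the processor is roughly this (body guessed, the exact code is in the pastebin):

AudioProcessorEditor* SpectrogramPluginAudioProcessor::createEditor()
{
    // The host calls this whenever the user opens the plugin UI; the base
    // class remembers the returned editor and hands it out via getActiveEditor().
    return new SpectrogramPluginAudioProcessorEditor (*this);
}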

Why not
SpectrogramPluginAudioProcessorEditor* myEditor = getActiveEditor();

OR
Why not make the Processor class have a member that holds a SpectrogramPluginAudioProcessorEditor pointer?

Because pointers to types in an inheritance tree don't work that way: getActiveEditor() returns a pointer to the base class AudioProcessorEditor, so the result needs to be cast to the derived editor type.
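A toy example of why the cast is needed (the commented-out line is what the compiler rejects):

AudioProcessorEditor* base = getActiveEditor();        // static type is the base class
// SpectrogramPluginAudioProcessorEditor* e = base;    // error: no implicit downcast
auto* e = dynamic_cast<SpectrogramPluginAudioProcessorEditor*> (base);

if (e != nullptr)  // dynamic_cast yields nullptr if no editor (or a different editor type) is open
    e->m_SpectroGramComp.processAudioBlock (buffer);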

The AudioProcessor base class does have a pointer to the AudioProcessorEditor, but because it is private, the getActiveEditor() method is used to get access to it. You preferably shouldn't keep your own pointer to the editor as a class member, because the AudioProcessor base class can better track the editor object's lifetime and will update the private pointer accordingly. Keep in mind that the plugin's audio processing side must not assume there is any kind of GUI at all, even if your plugin is something like an analyzer/visualizer.