OpenGL - blur a component

Well - here’s my shonky OpenGL to get that far in case it’s useful. I have only about 50% of a clue what I’m doing at the moment, and no idea why my image appears where it does at the size it does :slight_smile:

#pragma once

#include "../JuceLibraryCode/JuceHeader.h"

//==============================================================================
/*
    This component lives inside our window, and this is where you should put all
    your controls and content.
*/
class MainComponent : public Component, private Timer, private OpenGLRenderer
{
public:
    MainComponent()
    {
        setSize (600, 400);
        context.setRenderer(this);
        context.setComponentPaintingEnabled(false);
        context.setContinuousRepainting(true);
        context.attachTo(*getTopLevelComponent());
        startTimer(100);
    }

    void paint (Graphics& g) override
    {
        g.fillAll(Colours::blue);
    }

    void timerCallback() override
    {
        repaint();
    }

    void resized() override
    {}

    void newOpenGLContextCreated () override
    {
        shader = new OpenGLShaderProgram(context);
        shader->addFragmentShader(shaderWotBlurs);
        shader->link();
        createTexture();
    }

    void renderOpenGL () override
    {
        OpenGLHelpers::clear (Colours::black);
        glDisable(GL_LIGHTING);
        glColor3f(1, 1, 1);
        glEnable(GL_TEXTURE_2D);
        texture.bind();
        shader->use();
        // Draw a textured quad
        glBegin(GL_QUADS);

        auto w = getWidth();
        auto h = getHeight();

        glTexCoord2f(0, 0); glVertex3f(0, 0, 0);
        glTexCoord2f(0, h); glVertex3f(0, h, 0);
        glTexCoord2f(w, h); glVertex3f(w, h, 0);
        glTexCoord2f(w, 0); glVertex3f(w, 0, 0);
        glEnd();
    }

    void openGLContextClosing () override
    {}

    class CompToRenderWithOpenGL : public Component
    {
    public:
        void paint (Graphics& g) override
        {
            g.fillAll(Colours::red);
            g.setColour(Colours::white);
            g.drawText("Hello world", 0, 0, 100, 20, Justification::centred, false);
        }
    };

    void createTexture()
    {
        CompToRenderWithOpenGL comp;
        Image im{ Image::ARGB, getWidth(), getHeight(), true };
        comp.setSize(getWidth(), getHeight());

        {
            Graphics gi{ im };
            MessageManagerLock lock;
            comp.paintEntireComponent(gi, true);
        }

        texture.loadImage(im);
    }

private:
    OpenGLTexture texture;
    ScopedPointer<OpenGLShaderProgram> shader;
    OpenGLContext context;

    String shaderWotBlurs =
        "uniform sampler2D sampler0;"
        "uniform vec2 tc_offset[9];"
        "void main()"
        "{"
        "     vec4 sample[9];"
        "      for (int i = 0; i < 9; ++i)"
        "           sample[i] = texture2D(sampler0, gl_TexCoord[0].st + tc_offset[i]);"
        ""
        "   gl_FragColor = (sample[0] + (2.0 * sample[1]) + sample[2] +"
        "        (2.0 * sample[3]) + sample[4] + 2.0 * sample[5] +"
        "        sample[6] + 2.0 * sample[7] + sample[8]) / 13.0;"
        "}";

    JUCE_DECLARE_NON_COPYABLE_WITH_LEAK_DETECTOR (MainComponent)
};

No idea what’s going on with the co-ordinate space, and do I need some kind of vertex shader here…?

When you’re working with vertex positions, the visible coordinate space (normalized device coordinates) runs from -1 to 1, and texture coordinates (the values you pass to glTexCoord2f) run from 0 to 1. Here’s a more detailed explanation:

https://learnopengl.com/Getting-started/Coordinate-Systems

If I’m understanding correctly, not including a vertex shader should fall back to a default, so it behaves more like the fixed pipeline. That’s discussed here under “What to do if part of a shader pair is not present?”. I usually include a vertex shader in my own code since I have vertex attributes, but I suppose it should work without one in your case for now.
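For completeness, a minimal pass-through vertex shader in the same old-style GLSL as your fragment shader would look something like this (just a sketch, untested against your code; it forwards the fixed-function transform and the texture coordinate that your fragment shader reads from gl_TexCoord[0]):

String passThroughVertexShader =
    "void main()"
    "{"
    "    gl_TexCoord[0] = gl_MultiTexCoord0;"  // pass the texture coordinate through
    "    gl_Position = ftransform();"          // apply the fixed-function modelview/projection transform
    "}";

You’d add it with shader->addVertexShader (passThroughVertexShader) before the link() call.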


(screenshot)

ok… :slight_smile:

(screenshot)

kind of progress :slight_smile:

Texture coordinates are normalized values. Maybe that’s the problem here.
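i.e. for a full-window quad, something like this (my sketch; texture coordinates go from 0 to 1 and vertex positions from -1 to 1 in normalized device coordinates, and the vertical flip may need adjusting depending on how the image ends up in the texture):

glTexCoord2f (0.0f, 0.0f); glVertex3f (-1.0f, -1.0f, 0.0f);
glTexCoord2f (0.0f, 1.0f); glVertex3f (-1.0f,  1.0f, 0.0f);
glTexCoord2f (1.0f, 1.0f); glVertex3f ( 1.0f,  1.0f, 0.0f);
glTexCoord2f (1.0f, 0.0f); glVertex3f ( 1.0f, -1.0f, 0.0f);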

(screenshot)

yep :slight_smile: now it just looks like my texture is a different size to the window or something…

I don’t mean to derail your thread here, but thought this could possibly help ya:

For months I tried to avoid learning OpenGL in more depth, and then for a particular project I just couldn’t ignore it anymore. This YouTube playlist saved my butt. Super well done: https://www.youtube.com/watch?v=W3gAzLwfIP0&list=PLlrATfBNZ98foTJPJ_Ev03o2oq3-GGOS2 I just followed each video, working on an OpenGL JUCE app and implementing the material as I went. It was well worth the time investment, and actually quite a refreshing new concept to dive into :slight_smile:


Can’t you see I’m trying to do this without actually understanding what I’m doing :smile:


I totally understand :laughing: me a year ago --> OpenGL + GraphicsContext::drawImage Performance


thanks all - up and running. 50 fps at 2% CPU load :slight_smile:


and now with multiple layers, blurred and non-blurred:

/**
 * Base class for an object (probably a component) that'll be painting itself in
 * multiple layers.  @see ComponentBlur for a consumer of this class.
 */
class MultiLayerPainter
{
public:
    virtual ~MultiLayerPainter() = default;

    virtual void paintLayer (int layer, Graphics& g) = 0;
};
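
To give an idea of how it gets used: a hypothetical component implementing it might look like this (the layer numbering here is only an illustration, not necessarily the exact convention my ComponentBlur class uses):

class BlurredPanel : public Component,
                     public MultiLayerPainter
{
public:
    void paintLayer (int layer, Graphics& g) override
    {
        if (layer == 0)
        {
            g.fillAll (Colours::darkblue);   // background layer, to be blurred
        }
        else
        {
            g.setColour (Colours::white);    // sharp layer, drawn on top unblurred
            g.drawText ("Hello world", getLocalBounds(), Justification::centred, false);
        }
    }
};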

Nice!
Curious here: how did you do it?

Paint the component to an OpenGLTexture:

  • Paint to an Image
  • Load the image as a texture

Then:

  • Draw a square made of 2 triangles using OpenGL 3.3’s vertex array / vertex buffer objects
  • Render it with the texture to a framebuffer (OpenGLFrameBuffer)

Then repeat a two-stage vertical and horizontal blur over and over (see the sketch below)

Then render an OpenGLTexture unblurred as a layer on top.

Oh - and enable the magic macro that allows non-power-of-two textures in JUCE
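
Roughly, each blur pass looks something like this (a simplified sketch rather than my exact code; drawFullScreenQuad() stands in for the VAO/VBO quad drawing, and “pixelOffset” is an assumed uniform name in the blur shader):

void drawFullScreenQuad(); // hypothetical helper: issues the 2-triangle quad via the VAO/VBO

void blurPass (OpenGLFrameBuffer& source, OpenGLFrameBuffer& target,
               OpenGLShaderProgram& blurShader, float offsetX, float offsetY)
{
    target.makeCurrentRenderingTarget();                      // draw into the target framebuffer
    glBindTexture (GL_TEXTURE_2D, source.getTextureID());     // read from the source framebuffer's texture
    blurShader.use();
    blurShader.setUniform ("pixelOffset", offsetX, offsetY);  // step one texel horizontally or vertically
    drawFullScreenQuad();
    target.releaseAsRenderingTarget();
}

// Ping-pong between two framebuffers, one horizontal and one vertical pass per iteration:
// blurPass (fboA, fboB, blurShader, 1.0f / (float) width, 0.0f);
// blurPass (fboB, fboA, blurShader, 0.0f, 1.0f / (float) height);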

Ok, I see.
Thanks!

Was that the right level of detail? :slight_smile:

Yeah, I think so.
I will try to do something similar in the near future.
I’ll let you know how it goes :slight_smile:
Thanks for sharing!

The most important lesson was to pick one version (I went with 3.3) and avoid tutorials and help that relate to other versions…

For some reason they like to rip everything up and start again with every OpenGL version…


Hmm… I see.
I did some OpenGL back in the days when version 2.0 was new.
I suspect much has changed since then :slight_smile:

Did you share any source code? Would love to experiment a bit on this…