Pixel shader on 2D vector graphics

I'm trying to create a bloom effect on some 2D vector graphics. I've been trying to do this myself for a while, but I can't figure out how to get the drawn colors into a texture that a pixel shader can read. I'm using OpenGLGraphicsContextCustomShader, since its documentation says it's good for per-pixel effects:

"Used to create custom shaders for use with an openGL 2D rendering context.

Given a GL-based rendering context, you can write a fragment shader that applies some kind of per-pixel effect."

I could probably get what I want by using an OpenGLShaderProgram, but that doesn't seem like the 'correct' solution for this.

Has anyone had any experience writing a shader that runs on 2D graphics, like a blur, bloom, removing a color channel, or something like that? Or maybe I'm just missing some obvious way to pass a texture into the shader created by an OpenGLGraphicsContextCustomShader. I'd appreciate any help, thanks!


Hey there, glad to see someone is on the same topic at the same time. I have a little bit of experience with shaders, but in Cinder :( I'm stuck right now setting up the OpenGLGraphicsContextCustomShader.

Maybe if you could share your code, I could bring in my (little) experience.

Thanks in advance

Well, I'd say you'd have to pass a uniform of type sampler2D to the shader program, but I haven't found a way to do this yet.
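For reference, in plain GLSL the fragment shader side of that would look something like the sketch below (hypothetical uniform names; whether and how the JUCE wrapper lets the host code bind a texture unit for such a sampler is exactly the open question in this thread):

```glsl
// Hypothetical fragment shader: sample an input texture and zero the red channel.
uniform sampler2D u_inputTex;  // must be bound to a texture unit by the host code
uniform vec2 u_size;           // render target size in pixels

void main()
{
    vec2 uv = gl_FragCoord.xy / u_size;
    vec4 c = texture2D (u_inputTex, uv);
    gl_FragColor = vec4 (0.0, c.g, c.b, c.a);
}
```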

Is there actually a method for sending uniforms to OpenGLGraphicsContextCustomShader? If there is, I can't for the life of me figure it out. Has anyone done this? 

For anybody who is trying to get this working, here’s some starter code:

  1. create new “OpenGL Application” project
  2. in MainComponent.h, replace everything with the following:

/*
  ==============================================================================

    This file was auto-generated!

  ==============================================================================
*/

#pragma once

#include "../JuceLibraryCode/JuceHeader.h"

/*
    This component lives inside our window, and this is where you should put all
    your controls and content.
*/
class MainComponent   : public OpenGLAppComponent
{
public:
    MainComponent();
    ~MainComponent();

    void initialise() override;
    void shutdown() override;
    void render() override;

    void paint (Graphics& g) override;
    void resized() override;

private:
    // Your private member variables go here...
    std::unique_ptr<OpenGLGraphicsContextCustomShader> shader;

    JUCE_DECLARE_NON_COPYABLE_WITH_LEAK_DETECTOR (MainComponent)
};

  3. in MainComponent.cpp, replace everything with the following:

/*
  ==============================================================================

    This file was auto-generated!

  ==============================================================================
*/

#include "MainComponent.h"

const char* shaderCode =
    "uniform vec2 u_size;\n"
    "uniform vec2 u_mouseNorm;\n"
    "uniform float u_time_S;\n"
    "void main()\n"
    "{\n"
    "    vec2 posNorm = vec2(pixelPos.x / u_size.x, pixelPos.y / u_size.y);\n"
    "    gl_FragColor = vec4(u_mouseNorm.x + sin(u_time_S * 2.0) * 0.5, posNorm.y + u_mouseNorm.y, u_mouseNorm.x - posNorm.x, 1.0);\n"
    "}\n";

MainComponent::MainComponent()
{
    // Make sure you set the size of the component after
    // you add any child components.
    setSize (800, 600);
    shader.reset (new OpenGLGraphicsContextCustomShader (shaderCode));
}

MainComponent::~MainComponent()
{
    // This shuts down the GL system and stops the rendering calls.
    shutdownOpenGL();
}

void MainComponent::initialise()
{
    // Initialise GL objects for rendering here.
}

void MainComponent::shutdown()
{
    // Free any GL objects created for rendering here.
}

void MainComponent::render()
{
    // This clears the context with a black background.
    OpenGLHelpers::clear (Colours::black);

    if (shader == nullptr)
        return;

    auto desktopScale = openGLContext.getRenderingScale();

    std::unique_ptr<LowLevelGraphicsContext> glContext (createOpenGLGraphicsContext (openGLContext,
        roundToInt (desktopScale * getWidth()),
        roundToInt (desktopScale * getHeight())));

    auto mouseRel = getMouseXYRelative().toFloat();

    shader->onShaderActivated = [&glContext, &mouseRel] (OpenGLShaderProgram& p)
    {
        auto bounds = glContext->getClipBounds().toFloat();

        p.setUniform ("u_mouseNorm", mouseRel.x / bounds.getWidth(), mouseRel.y / bounds.getHeight());
        p.setUniform ("u_size", bounds.getWidth(), bounds.getHeight());
        p.setUniform ("u_time_S", static_cast<float> (Time::getMillisecondCounterHiRes() * 1.0e-3));
    };

    shader->fillRect (*glContext, glContext->getClipBounds());
}

void MainComponent::paint (Graphics& /*g*/)
{
    // You can add your component specific drawing code here!
    // This will draw over the top of the openGL background.
}

void MainComponent::resized()
{
    // This is called when the MainComponent is resized.
    // If you add any child components, this is where you should
    // update their positions.
}

This should create a colorful window that responds to mouse movement and elapsed time:

I would also highly recommend going through the tutorial on https://thebookofshaders.com/. It’s super well written and highly interactive. A wonderful blend of math/engineering/art :slight_smile:


Ok, I was running this fine on macOS Sierra, then went to test on my Windows 10 machine. Everything seems to work fine there too, but when a debugger is attached I keep hitting a jassert() on line 168 in modules\juce_opengl\opengl\juce_OpenGLShaderProgram.cpp when the shader gets compiled.

I’m testing on the develop branch tip. It looks like the GLint cast of the attributeID member is -1. This only happens for the screenBounds attribute initialization in this chunk of code (I’m assuming it's for the “bundled” vertex shader):

        ShaderBase (OpenGLContext& context, const char* fragmentShader, const char* vertexShader = nullptr)
            : ShaderProgramHolder (context, fragmentShader, vertexShader),
              positionAttribute (program, "position"),
              colourAttribute (program, "colour"),
              screenBounds (program, "screenBounds")

which is line 415 in modules\juce_opengl\opengl\juce_OpenGLGraphicsContext.cpp

Here is a screenshot of the exact break:

I tried removing the setUniform calls and reduced my fragment shader code to just print out a static color, but still it hits the jassert.

@fabian or anybody else if you have a moment could you possibly take a look at my code? I’m curious why the jassert() doesn’t fire on my macOS machine, perhaps I’m overlooking something here.

Many thanks :slight_smile:

just in case you hadn’t seen it:

Thanks for pinging me to that thread! I updated my code above to use the lambda for setting the uniforms.

However I’m still running into that assertion only on Windows. Again, running the .exe without the debugger works fine so far.

Has anyone gotten this working for effects like blur, bloom, etc. mentioned in the OP? I’m having a bit of trouble getting it to access what’s already on screen (via my paint() function) and applying a shader to that.

I do most of my graphics in OpenGL now.
You can paint to an OpenGLImage, then load a fragment shader and render the image onto a quad using that shader. Using OpenGL render callbacks has its own caveats though (like you can’t render OpenGL over any Component drawn with paint()).



I’m trying out something similar now, although my situation might be a bit different. Basically I’m trying to achieve Photoshop style blending modes like Vinnie’s layer effects (which was only for the software renderer).

I’m only at the “background layer” portion right now, so inside my GraphicsLayer class I create an OpenGLImageType to render into. Essentially it’s:

void myComponent::paint (Graphics& g)
{
    // creates an OpenGLImageType internally and keeps an Image reference to it
    GraphicsLayer circleLayer (g);

    // here we draw using the new 'layer' Graphics object
    Graphics layer (circleLayer.getLowLevelContext());
    layer.fillEllipse (50, 50, 100, 100);

    // Once the GraphicsLayer is out of scope it uses g.drawImage() to
    // put its contents back into the original Graphics context passed to
    // this function
}
But once I delete the context in the GraphicsLayer there’s a GL_INVALID_OPERATION deep inside the OpenGLContext code (specifically when JUCE’s shader queue tries to draw, juce_OpenGLContext.cpp line 1283). Have you encountered this issue when creating contexts to paint into from OpenGLImageType?

I thought maybe it was my internal OpenGLImageType being deleted while its data was still in use (the Image is a member of that GraphicsLayer, so the ref count goes to 0), but preserving the reference count didn’t change anything.

Not sure I can help. I create a single OpenGL context and use those OpenGLRenderer render callbacks so I don’t have any experience getting a context to use in paint().

Seems like it’s the result of:

So when the original context tries to flush its shader queue the VBO/EBO bindings are 0 due to the destruction of the “layer context” I had made

After writing in a quick fix to store the previously bound VBO/EBO in ShaderQuadQueue's constructor - and restore those during the destructor - things work out fine creating the layer contexts … now I just have to fix my own bugs :sweat_smile: