Gradient dithering?

hi,

I just wanted to know if anyone here has already tried using gradient dithering with JUCE to avoid banding?

It's especially visible with dark tones and subtle gradients, where the number of intermediary colours is limited (see attached image).


no one?

I could understand the use of dithering for an image-manipulation application (e.g. Photoshop or GIMP). But for user-interface elements, which is what juce::Graphics is designed for, what would be the point of dithering? If you can see the banding then your display has the wrong gamma.

If you really must have dithering for linear blends, I suppose you could implement it yourself by locking the underlying Image for write access and then applying your own routines to it.
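
Something along these lines would do it (just a rough sketch, assuming you render the blend into your own juce::Image and then draw that image as the background; the "routine" here is nothing cleverer than adding a little per-pixel noise before quantising each channel to 8 bits):

juce::Image renderDitheredGradient (int width, int height, juce::Colour c1, juce::Colour c2)
{
    juce::Image image (juce::Image::ARGB, width, height, true);
    juce::Image::BitmapData pixels (image, juce::Image::BitmapData::writeOnly);
    juce::Random random;

    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            const float proportion = (float) x / (float) width;
            const float noise = random.nextFloat() - 0.5f; // less than one 8-bit step, centred on zero

            auto blend = [&] (float a, float b)
            {
                const float value = (a + (b - a) * proportion) * 255.0f + noise;
                return (juce::uint8) juce::jlimit (0, 255, juce::roundToInt (value));
            };

            pixels.setPixelColour (x, y, juce::Colour (blend (c1.getFloatRed(),   c2.getFloatRed()),
                                                       blend (c1.getFloatGreen(), c2.getFloatGreen()),
                                                       blend (c1.getFloatBlue(),  c2.getFloatBlue()),
                                                       1.0f));
        }
    }

    return image;
}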

This has been on my list to fix for ages.  I've got this far (see the little shader below), which does a lovely smooth job; try it in the 'OpenGL 2D' JuceDemo code editor.


void main()
{
    vec4 colour1 = vec4 (0.2, 0.2, 0.2, 1.0);
    vec4 colour2 = vec4 (0.05, 0.05, 0.05, 1.0);
    float alpha = float (pixelPos.x) / 900.0;

    int indexX = int (floor (mod (float (pixelPos.x), 3.0)));
    int indexY = int (floor (mod (float (pixelPos.y), 3.0)));
    int index = indexX + indexY * 3;
    int a[9] = int[] (3, 7, 4, 6, 1, 9, 2, 8, 5); // see Ordered Dithering on Wikipedia and elsewhere.

    vec4 targetColour = mix (colour1, colour2, alpha);

    /* Do the dithering only when y > 200 so we can see a before and after. */
    if (pixelPos.y > 200.0)
    {
        targetColour += float (a[index]) / 600.0;
        targetColour *= 0.95; // compensate for the slight increase in brightness.
        targetColour.a = 1.0; // nail the alpha channel to 1.0
    }

    gl_FragColor = pixelAlpha * targetColour;
}

What was Vinnie getting at with 'your display has the wrong gamma'? I see banding on a range of displays when doing gradient fills over large areas using small transitions in colour.  Is this not typical?

There's probably a more efficient way of doing the OpenGL version (maybe using a texture map would avoid all those mods and type conversions).

Anyway, results count.  Here's the output from the shader.  You'll see the banding on the top half of the image and, with a bit of luck, the lovely banding-free version below.

Jules - it'd be really nice to have a dither option on gradient fills that works with all the contexts, not just OpenGL.  Is that likely to ever happen, or should I roll something myself?


I have to admit: It took me a while to see the banding in your screenshot.

But once I found it: well, the dithering might not be a "game changer" but it does indeed look nicer.

Once you've seen it there's no going back! ;-)

I have a case where I'm doing a gentle background gradient fill on a desktop app and it's really offensive though ... 


Cool stuff! Interesting.

I guess the overhead of dithering on the GPU is pretty reasonable, though it'd be quite heavy to do in the CPU renderer. It could perhaps be enabled only in cases where the gradient is very shallow. Not a high priority for us right now, but if you had some almost-ready-to-use code that took care of this we wouldn't be opposed to taking it on ;)

I'll have a peek ... the LowLevelGraphicsContext code needs a user guide though ... :) 

However I think it might be okay even in a software renderer.  

Really, all it's doing for each pixel in a given row (given a 4x4 matrix to make the mod easy, instead of the 3x3 one I used in my testing to reduce typing) is:

(In pseudo asm, where X is a register with the xposition, C a reg with the colour value)

  1. AND X, 0b11 // mod
  2. MOV R1, matrix[X] // load adjustment
  3. FADD C, R1 // sum ... needs to be done for red, green and blue unless it can be done in parallel.

Everything else I think can be precomputed per-row or per fill. 
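
In C++ rather than pseudo-asm, the inner loop would be roughly this (just a sketch, not wired into JUCE's renderer; the matrix is the usual 4x4 Bayer one, and the row of blended floats is assumed to have been precomputed):

// Sketch of the per-pixel work only: the blended colour ramp for the row is
// assumed to be precomputed as floats, and the matrix row is picked once per row.
static const int bayer4x4[16] = {  0,  8,  2, 10,
                                  12,  4, 14,  6,
                                   3, 11,  1,  9,
                                  15,  7, 13,  5 };

void ditherRow (const float* red, const float* green, const float* blue,
                juce::uint8* dest, int width, int y)
{
    const int* matrixRow = bayer4x4 + (y & 3) * 4;  // per-row lookup, done once

    for (int x = 0; x < width; ++x)
    {
        // AND with 3 instead of a real mod, then add an offset of less than one 8-bit step.
        const float offset = ((float) matrixRow[x & 3] + 0.5f) / 16.0f - 0.5f;

        dest[0] = (juce::uint8) juce::jlimit (0, 255, juce::roundToInt (red[x]   * 255.0f + offset));
        dest[1] = (juce::uint8) juce::jlimit (0, 255, juce::roundToInt (green[x] * 255.0f + offset));
        dest[2] = (juce::uint8) juce::jlimit (0, 255, juce::roundToInt (blue[x]  * 255.0f + offset));
        dest += 3;
    }
}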

Does that look like it'd hurt much?  I'll see if I can find my way to the right bit in your source ...

Oh I see what the problem is ... it'd be OK for the software renderer and the OpenGL context, but the others would be slow/awkward ...

Here's a directly usable piece of JUCE C++ code.  It might be more useful than my 'here's a shader' effort!

I'd be interested to know how to modify it so that it takes, say, a ColourGradient (or maybe something simpler) as a parameter.  It's been a while since I did any OpenGL and I've got a bit rusty :)

class OpenGLDitheredBackground
{
public:
    void apply (LowLevelGraphicsContext& context, const Rectangle<int>& area)
    {
        auto result = customShader.checkCompilation(context);
        jassert (result.wasOk());
        customShader.fillRect(context, area);
    }
    
private:
    const String code =
    "void main()"  "\n"
    "{" "\n"
    
    /**
     Set your colours here: 
     */
    "    vec4 colour1 = vec4 (0.2, 0.2, 0.2, 1.0); " "\n"
    "    vec4 colour2 = vec4 (0.05, 0.05, 0.05, 1.0);" "\n"
    
    /** 
     Modify this next line to have your gradient fill happen in a different
     direction. 
     */
    "    float alpha = float(pixelPos.y) / 200.0;" "\n"
    
    "    int indexX = int(floor(mod(float(pixelPos.x), 3.0))); " "\n"
    "    int indexY = int(floor(mod(float(pixelPos.y), 3.0))); " "\n"
    "    int index = indexX + indexY * 3; " "\n"
    "    int[9] a = int[](3,7,4,6,1,9,2,8,5); // see Ordered Dithering on Wikipedia and elsewhere." "\n"
    "    vec4 targetColour = mix(colour1, colour2, alpha); " "\n"
 "\n"
    "    targetColour += float(a[index]) / 600.0;" "\n"
    "    targetColour *= 0.95; // compensate for the slight increase in brightness. " "\n"
    "    targetColour.a = 1.0; // nail the alpha channel to 1.0" "\n"
    "    gl_FragColor = pixelAlpha * targetColour;" "\n"
 "\n"
    "}";

    OpenGLGraphicsContextCustomShader customShader {code};
};
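
To partly answer my own question about parameters: one option (just a sketch, assuming a simple two-stop gradient rather than a full ColourGradient) would be to declare "uniform vec4 colour1;" and "uniform vec4 colour2;" in the shader string instead of the hard-coded constants, and push the end colours in through the shader's onShaderActivated callback, e.g. by adding something like this to the class above:

    void applyWithColours (LowLevelGraphicsContext& context, const Rectangle<int>& area,
                           Colour c1, Colour c2)
    {
        // Assumes the shader string declares colour1/colour2 as uniforms
        // rather than the hard-coded vec4 constants shown above.
        customShader.onShaderActivated = [c1, c2] (OpenGLShaderProgram& program)
        {
            program.setUniform ("colour1", c1.getFloatRed(), c1.getFloatGreen(),
                                           c1.getFloatBlue(), c1.getFloatAlpha());
            program.setUniform ("colour2", c2.getFloatRed(), c2.getFloatGreen(),
                                           c2.getFloatBlue(), c2.getFloatAlpha());
        };

        customShader.fillRect (context, area);
    }

Usage from a component that's being rendered by an attached OpenGLContext would then look something like this (ditheredBackground being a hypothetical member of the component):

    void paint (Graphics& g) override
    {
        ditheredBackground.applyWithColours (g.getInternalContext(), getLocalBounds(),
                                             Colour (0xff333333), Colour (0xff0d0d0d));
    }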

 

Well, 255 steps between black and white is not that much after all, and most people can see the difference between two adjacent steps. As a test, try making a 10,000 px wide gradient from #000000 to #FFFFFF. On a high-quality monitor you'll be able to see the banding across the entire image.
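
If you want to reproduce that quickly in JUCE, a throwaway snippet like this will write the test image to your desktop (the file name is arbitrary):

juce::Image testImage (juce::Image::RGB, 10000, 200, true);

{
    juce::Graphics g (testImage);
    g.setGradientFill (juce::ColourGradient (juce::Colours::black, 0.0f, 0.0f,
                                             juce::Colours::white, 10000.0f, 0.0f, false));
    g.fillAll();
}

auto output = juce::File::getSpecialLocation (juce::File::userDesktopDirectory)
                  .getChildFile ("banding-test.png");
output.deleteFile();

juce::FileOutputStream stream (output);

if (stream.openedOk())
    juce::PNGImageFormat().writeImageToStream (testImage, stream);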

I have found that even a simple 2×2 regular dither is enough to avoid visible banding.
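
In case it's useful, that 2×2 case boils down to something as small as this when quantising a 0..1 channel value to 8 bits (standard Bayer threshold values, nothing JUCE-specific about them):

static juce::uint8 ditherTo8Bit (float value01, int x, int y)
{
    // 2x2 ordered-dither (Bayer) thresholds.
    static const float bayer2x2[2][2] = { { 0.0f, 2.0f },
                                          { 3.0f, 1.0f } };

    // Offset of less than one quantisation step, centred around zero.
    const float offset = (bayer2x2[y & 1][x & 1] + 0.5f) / 4.0f - 0.5f;

    return (juce::uint8) juce::jlimit (0, 255, juce::roundToInt (value01 * 255.0f + offset));
}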

If you want to know if 1 step is really visible, you can try to make out the slightly darker rectangle below:


I can in that example, but only after I cleaned my glasses ... :)  The banding on the gradients was more obvious to me, maybe because it's a repeated pattern.