Gradients. How?

EDIT: just to be clear, JUCE gradients produce colour banding. I am looking for a way around this.

Just looking for some confirmation here. If I want to use gradients in a GUI design, I have 3 options:

  1. Write a 1x1 px fillRect loop to manually apply dither, which is so costly that I’d have to render it to an image at resized() and draw from that in paint().

  2. Use OpenGL and shaders to implement dither there.

  3. Prerender PNG files from Figma/Photoshop and have a PNG-based GUI.

All options are a lot of work for such a basic thing. Can anyone here tell me I’m missing something simple to get a gradient going without horrific colour banding issues?

Use the ColourGradient class, then Graphics::setGradientFill().

Not an option: colour banding results because there is no internal dithering.

Probably easiest to create an ImageEffectFilter to do the dithering for you.
So draw the gradient to an Image, implement a simple dither image effect, apply it to the image, and draw that to the Graphics context.
https://docs.juce.com/master/classImageEffectFilter.html
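The per-pixel work inside such a filter could look something like this standalone sketch (plain C++ rather than JUCE types, and the function name is made up): nudge each 8-bit channel by at most ±1 LSB of uniform noise, which breaks up the flat runs of identical shades that read as bands.

```cpp
#include <algorithm>
#include <cstdint>
#include <random>

// Sketch of the per-pixel step a dithering image effect might apply:
// add at most +/- 1 LSB of uniform noise to an 8-bit channel value,
// clamping so the result stays in range.
inline std::uint8_t ditherChannel (std::uint8_t value, std::mt19937& rng)
{
    std::uniform_int_distribution<int> noise (-1, 1); // +/- 1 LSB
    const int v = static_cast<int> (value) + noise (rng);
    return static_cast<std::uint8_t> (std::clamp (v, 0, 255));
}
```

Inside an ImageEffectFilter subclass you would run this over the image's pixels (via BitmapData) before drawing it to the destination context.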

Here is a dither shader for reference:

But of course having some dither options for Gradient drawing in JUCE itself would be very convenient.

And of course the Image only has to be redrawn when resized, which is not a bad idea anyway when you are using gradients.

Would you mind expanding upon this a bit, or providing an example of the issue? I’m asking because I’m using gradients extremely extensively (including animating them) and I’m curious if I’m running into the same issue.

Is it mainly with subtler gradients over larger areas? Is it mitigated at all by adding more stops? Tangent, but I’ve had good results using the HSLUV color space to manually add more stops when the gradient is changing hues.

This is a pretty good example of with and without dither: Shader - Shadertoy BETA

But yes, it is worse when you have a smaller change in value over a larger number of pixels. Essentially, banding is caused by not having enough shades/colours to cover the distance, so rather than one shade per pixel, you might have 20 pixels all using the same shade. Dither gets around this by adding noise to disperse the error and trick the eye into seeing a smooth transition, as if there were more colours available to fill the distance.
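To put rough numbers on that claim (a back-of-envelope sketch, not JUCE code): with 8-bit channels, a gradient only has as many distinct shades as the difference between its endpoint values, so the band width is just pixels divided by shades.

```cpp
#include <cstdlib>

// Back-of-envelope band width: a gradient between two 8-bit channel
// values only has |v1 - v0| + 1 distinct shades, so each shade covers
// a run of roughly lengthPx / shades pixels.
inline int pixelsPerBand (int v0, int v1, int lengthPx)
{
    const int shades = std::abs (v1 - v0) + 1;
    return lengthPx / shades;
}

// e.g. a dark-to-mid ramp (channel values 100..149) across a 1000 px
// window gives 50 shades, i.e. bands about 20 px wide.
```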

1 Like

Sorry, should have asked if you had a JUCE example… I’m familiar with color banding and dithering (your description is great though), but haven’t seen issues in JUCE and can’t seem to easily reproduce with linear gradients. Is it more visible with repeating or animated gradients?

1 Like

Banding is especially noticeable if you have a gradient of different tints of one color. For testing I always use the color green: just create a fullscreen gradient that goes from dark to light green and you’ll see what we mean. You see banding because with 8 bits you only have 256 steps, and there are many more pixels than that. Dither makes the steps between the different shades much less noticeable. I know motion graphics designers always use dither when working with gradients.
For our VJ software (Resolume) we offer dithering for gradients; the LED screens or projectors used often have a hard time with gradients, so dithering helps a lot.

I can’t seem to reproduce any banding on JUCE 7 macOS (even with big fields of green). Perhaps the underlying CoreGraphics CGContextDrawLinearGradient implementation is doing some dithering? It looks noisy when I zoom in.

1 Like

I think on OSX it might even use a higher color depth. I was trying to provide an example and was not able to get banding to show on my MacBook Pro.
But you’ll see banding when drawing to an Image and then drawing that to the screen:

void MainComponent::paint (juce::Graphics& g)
{
    // Draw the gradient into an intermediate Image first, then blit it:
    // this is where the banding shows up.
    juce::Image image (juce::Image::ARGB, getWidth(), getHeight(), true);
    juce::Graphics tg (image);

    juce::Colour green (0, 255, 0);
    auto cg = juce::ColourGradient::horizontal (green.darker (1.0f), 0.0f,
                                                green.darker (20.0f), (float) getWidth());
    tg.setGradientFill (cg);
    tg.fillAll();

    g.drawImage (image, getLocalBounds().toFloat());
}

JUCE gradients produce colour banding

On some platforms. On macOS the underlying CoreGraphics engine applies dithering.

  1. Use OpenGL and shaders to implement dither there.

If you want, this is actually already available in SR’s JUCE branches, because we also wanted gradients to look good when using the OpenGL backend.

3 Likes

Banding is also caused by a display not properly displaying all colours. I have different displays, and the green gradient in the example above looks almost perfect on one, but bands HORRIBLY on the other (which is much newer, by the way). Both displays should of course be able to properly display 8 bits per channel, but one clearly fails at that. Also note that I’m on OSX and I see the banding regardless of the display’s colour profile; the CoreGraphics dithering is of no help here.

This is important to bear in mind, because if you have a good display you might not be seeing some very ugly banding that many of your users will!

I’ve found that simply applying some noise makes gradients look great. It’s slow, so I apply the noise to an image which gets redrawn only when resized() is called, but the result is a huge improvement.

3 Likes

@aamf - how might one apply noise in JUCE to a gradient? Or, if I understand you correctly, to the image drawn with a gradient?

I’m just applying it to the image drawn with a gradient. Here’s a very simple example which adds noise just to the brightness. It’s very slow but since I’m doing this just once it’s fine for me.

inline void addNoiseToImage (juce::Image& img, float amount)
{
    const int h = img.getHeight();
    const int w = img.getWidth();
    const float shift = amount * 0.5f;

    for (int y = 0; y < h; ++y)
    {
        for (int x = 0; x < w; ++x)
        {
            auto oldColour = img.getPixelAt (x, y);
            // keep the brightness in [0, 1] after adding the random offset
            auto newBrightness = juce::jlimit (0.0f, 1.0f,
                oldColour.getBrightness() + juce::Random::getSystemRandom().nextFloat() * amount - shift);
            img.setPixelAt (x, y, oldColour.withBrightness (newBrightness));
        }
    }
}

A noise amount of about 0.02f works fine with the example above.

2 Likes

You can speed up addNoiseToImage in a lot of ways, but the easiest is to use the BitmapData class directly instead of Image::getPixelAt/setPixelAt, which construct a temporary BitmapData for you on every call.

inline void addNoiseToImage2 (juce::Image& img, float amount)
{
    const juce::Image::BitmapData bmd (img, juce::Image::BitmapData::readWrite);
    const int h = img.getHeight();
    const int w = img.getWidth();
    const float shift = amount * 0.5f;
    
    for (int y = 0; y < h ; ++y)
    {
        for (int x = 0; x < w ; ++x)
        {
            auto oldColour = bmd.getPixelColour(x, y);
            auto newBrightness = oldColour.getBrightness() + juce::Random::getSystemRandom().nextFloat() * amount - shift;
            auto newColour = oldColour.withBrightness (newBrightness);
            bmd.setPixelColour(x, y, newColour);
        }
    }
}

3 Likes

That might look great, but isn’t the whole point of dithering to apply noise before the quantisation to 8 bit? IMHO the dithering has to happen at the colour-mixing stage of the gradient draw routine.
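A minimal standalone sketch of that ordering (plain C++, not JUCE’s actual gradient code; the function name is made up): compute each pixel’s value in floating point, add sub-LSB noise, and only then quantise to 8 bits, so the rounding error is randomised instead of producing flat bands.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <random>
#include <vector>

// Dither applied before quantisation: noise perturbs the full-precision
// value, then the result is rounded to 8 bits, so neighbouring pixels in
// a flat band land on different quantised values.
inline std::vector<std::uint8_t> renderDitheredRamp (float start, float end, int lengthPx)
{
    std::mt19937 rng (0);
    std::uniform_real_distribution<float> noise (-0.5f, 0.5f); // half an LSB either way

    std::vector<std::uint8_t> out (static_cast<std::size_t> (lengthPx));

    for (int x = 0; x < lengthPx; ++x)
    {
        const float t = lengthPx > 1 ? static_cast<float> (x) / static_cast<float> (lengthPx - 1) : 0.0f;
        const float value = start + t * (end - start); // full-precision gradient, 0..255 scale
        const float dithered = value + noise (rng);    // perturb *before* quantising
        out[static_cast<std::size_t> (x)] =
            static_cast<std::uint8_t> (std::clamp (std::lround (dithered), 0L, 255L));
    }
    return out;
}
```

Every output pixel stays within one level of the ideal gradient, but the band edges are scattered rather than aligned.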

1 Like

Correct me if I’m wrong, but applying noise to an image after the gradient is drawn is different from applying dither before quantisation. This is a Photoshop technique I’ve used in the past to create smoother gradients in printed materials: you apply a very small amount of Gaussian noise to the already-drawn gradient, and it achieves a similar band-smoothing effect to dither.

Yes, that’s right, but we also have to deal with displays which cannot display those 8 bits properly, so even if dithering is being done correctly (as on OSX, presumably), gradients can look crappy.

I think this is a bit beside the point. If the screen is bad then nothing can be done about that, nor should it be. If you are making a GUI for a specific display, such as on an embedded system, then it’s fair enough to leave that specific implementation up to the developer. Otherwise, we can look at this in isolation, at the pixel values on the code side, whether or not the display device can show them.

However, JUCE is a cross-platform framework that offers GUI features. For gradients not to have a consistent means of dithering in 2022 is overlooking a very basic design technique. Then you add the glow/drop-shadow issues and the half-put-together, tacked-on OpenGL implementation, where nothing in a paint routine actually generates geometry to be rendered by the GPU, and JUCE starts to look very weak as a GUI framework.

There’s an old post from Jules on here where he basically frowns upon PNG-based GUIs and says vector-based, code-rendered GUIs are the superior thing. Yet here we are, reverting to PNGs for a gradient that doesn’t band.

Please, please, please start taking the GUI performance stuff seriously. Or provide a way for us to easily offload the window contents to another framework (Flutter or something). I can’t be the only one who has to constantly dumb down and scrap pretty basic elements to suit JUCE.

Sorry to go on a rant, just getting frustrated that this stuff is still happening after so many years of complaints.

3 Likes