[macOS] Colo(u)rSpace and OpenGL

Rendering with JUCE on macOS will produce different colours on OpenGL vs other renderers.


Why is this a mac issue?

Well… macOS does handle colour-space conversions, but it does so pretty much everywhere except OpenGL.

Sound Radix made the JUCE OpenGL renderer ‘mess’ with the juce::Colour just before uploading it to OpenGL to overcome this difference.

You can try it out here:

Pros:

  • CoreGraphics and OpenGL should have equivalent colours.
    (minor differences could be due to rounding errors, anti-aliasing, premultiplication, or the text renderer).

Cons:

  • Uploading a juce::Image takes a little longer, as the conversion is done before uploading the texture.
    But your graphics designer won’t complain about you not using their colour palette! :slight_smile:

Cool that you found a solution and managed to get something to work! But I’m struggling to understand something; OpenGL provides sRGB buffer and texture controls, so why not use that? (eg: https://community.khronos.org/t/setting-up-srgb/75216 )


If there’s a flag you can just set somewhere and it works, that of course would be a better solution! Searching GLFW’s sources, it looks like the flag is only used with EGL, which IIUC is for Linux? So no luck finding anything there to enable sRGB on macOS…

I’ll add to @yairadix’s feedback.

a. Our goal is to have the exact same visual colour: that is, the result of the transformation from sRGB to the display’s colour space, which is also affected by the ColorSync profile. So what you see (on all platforms) is sRGB after additional framebuffer processing. On macOS, OpenGL is the only exception to that. (iOS does things differently.)

b. OpenGL never supported the concept of colour space. RGB isn’t defined with a specific colour space in OpenGL (nor in JUCE…), and the sRGB extensions aren’t reliable (nor do they include macOS’s additional colour transformations).

To be clear - I’m aware, though readers might not be!

No, that’s wrong. Colour space can be framebuffer and texture dependent, precisely as the Khronos post I shared above covers in a short and sweet way.

More simply: when you create textures, you can change the colour space by specifying GL_SRGB8 or GL_SRGB8_ALPHA8.

When you create a framebuffer, you specify GL_FRAMEBUFFER_SRGB.
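Put together, the texture and framebuffer sides of that could look like the following sketch. This is not JUCE’s actual code, it assumes a live GL context, and `width`, `height`, and `pixels` (8-bit RGBA sRGB data) are placeholders:

```cpp
// Upload the image as an sRGB texture, so sampling in a shader
// returns linearised values:
GLuint tex = 0;
glGenTextures (1, &tex);
glBindTexture (GL_TEXTURE_2D, tex);
glTexImage2D (GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8,   // internal format: sRGB
              width, height, 0,
              GL_RGBA, GL_UNSIGNED_BYTE, pixels);  // pixels hold sRGB bytes

// ...and ask GL to re-encode linear shader output back to sRGB
// when writing to an sRGB-capable framebuffer:
glEnable (GL_FRAMEBUFFER_SRGB);
```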

If neither of these options is available due to driver weirdness or whatever, IMO the conversion of your textures should happen in a shader, because that would be much more performant. It’s much, much better to leverage the GPU while in GPU-land instead of trying to pacify it by pixel-mashing on the CPU beforehand. Doing this kind of work on the CPU is really missing the point of using a graphics pipeline.
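Whichever side does the work, the maths being applied is the standard sRGB transfer function (IEC 61966-2-1). A minimal single-channel C++ sketch, for reference:

```cpp
#include <cmath>

// Encode a linear [0, 1] channel value as sRGB. This is the curve that
// GL_FRAMEBUFFER_SRGB applies in hardware on write.
float linearToSRGB (float c)
{
    return c <= 0.0031308f ? 12.92f * c
                           : 1.055f * std::pow (c, 1.0f / 2.4f) - 0.055f;
}

// Decode an sRGB [0, 1] channel value back to linear. This is what
// sampling a GL_SRGB8 / GL_SRGB8_ALPHA8 texture applies on read.
float srgbToLinear (float c)
{
    return c <= 0.04045f ? c / 12.92f
                         : std::pow ((c + 0.055f) / 1.055f, 2.4f);
}
```

The two functions are inverses, so a texture-read decode followed by a framebuffer-write encode round-trips the original bytes.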

If it helps any, and if you have to resort to the latter, someone wrote up a quick ShaderToy example: Shader - Shadertoy BETA


I had to circle back to JUCE’s approach to OpenGL rendering and remembered something important: if you’re using Graphics to do all of the drawing work, then all of your graphics (Paths, Images, text, etc…) get pixel-mashed into an EdgeTable and so on, eventually getting splatted onto a single OpenGL frame buffer.

This frame buffer is where the linear-to-sRGB conversion should happen. If this can possibly be done via the flags I mentioned - great. If not, then I really suggest loading a small linear-to-sRGB pixel shader and applying it to the whole frame buffer that JUCE is controlling under the hood.
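For the curious, such a full-screen pass might look something like the following legacy-GLSL sketch (the uniform/varying names are illustrative, not anything JUCE actually uses):

```glsl
// Fragment shader for a full-screen pass: reads the linear frame-buffer
// texture and writes sRGB-encoded output.
uniform sampler2D frameBufferTexture;
varying vec2 texCoord;

vec3 linearToSRGB (vec3 c)
{
    vec3 lo = c * 12.92;
    vec3 hi = 1.055 * pow (c, vec3 (1.0 / 2.4)) - 0.055;

    // Select the linear segment where c <= 0.0031308, per channel.
    return mix (hi, lo, vec3 (lessThanEqual (c, vec3 (0.0031308))));
}

void main()
{
    vec4 px = texture2D (frameBufferTexture, texCoord);
    gl_FragColor = vec4 (linearToSRGB (px.rgb), px.a);
}
```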

If you’re brave enough to use your own renderer based on what JUCE provides, I think the same method applies.

(Note that this post assumes you’re rendering everything entirely in OpenGL.)

That’s correct. We evaluated uploading the transformation LUT and applying the conversion in shaders, but eventually went with a simpler approach that is performant enough in our tests, with less code. If anyone has issues to report, optimisations, or PRs, feel free to share them. :slight_smile:

If it helps any, and if you have to resort to the latter, someone wrote up a quick ShaderToy example: Shader - Shadertoy BETA

In our first iteration we tried this approach. It resulted in wrong colours when a designer inspected screenshots and when comparing with Apple’s Digital Color Meter.

Again, on Windows, Linux and iOS we get the same colours when switching between renderers, so this is a macOS-only issue.

I’d expect that to result in “chunkier” gradients

You shouldn’t have any issues with banding as long as the whole pipeline is configured with sRGB from the textures up to the framebuffer. Worst case is that you apply a minor dither.
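In the worst case mentioned above, a dither amounts to adding a little noise before quantising back to 8 bits. A minimal sketch with triangular-PDF noise (the function name is mine, not from any library):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <random>

// Quantise a linear [0, 1] value to 8 bits with triangular-PDF dither,
// which decorrelates the quantisation error and hides banding.
uint8_t ditherTo8Bit (float v, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u (-0.5f, 0.5f);
    float noise  = u (rng) + u (rng);        // triangular PDF in [-1, 1)
    float scaled = v * 255.0f + noise;
    return (uint8_t) std::clamp ((int) std::lround (scaled), 0, 255);
}
```

Per pixel the result only ever moves by at most one code value, but averaged over an area it preserves the intermediate levels a plain round would collapse.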


Hi, can you point to where in the JUCE code this is done? I’d like to colour-correct the gamma of the image when using OpenGL.

It is in several places and commits.
You can find it via blame, as we had many fixes and it’s been running stable on our end for a while now.
That might be a nice starting point.

do you think this could be used with an image loaded with stb_image?

I’m now using SDL2 with OpenGL on Mac.

I’m not familiar with stb_image.
I searched the web and found a header.
Seems like it only decodes images into a pixel buffer.

The moment you’ve got plain colour pixels, you can easily wrap them in a juce::Image and use the JUCE renderer (which also has an OpenGL backend).

Our colour-space fixes are done only in the OpenGL renderer, but you can easily use some of the wrapped macOS colour-conversion utils to pass adjusted colours (or images, if needed).

Though our goal was to get the same colours we get with CoreGraphics while benefiting from the JUCE OpenGL renderer.

But do you think this could be extracted into a small function? I can’t include the whole JUCE renderer.

We basically wrapped Apple’s CoreGraphics conversions, which was quickest (another approach is to create a transfer map, which was beyond our needs).

I’m trying to extract this, but I’m struggling because it depends on too many JUCE classes, and I also don’t know Objective-C.

IIUC this is now supposed to be solved by @reuk in main JUCE!

If it works then we’ll remove our own changes implementing this in SR’s branch
