SoftwareRenderer not being pixel perfect on different platforms

I’m not sure whether this issue is because the PNG is saved slightly differently on different platforms or whether it’s a difference in the JUCE renderer, but a rendering test of the JUCE software renderer fails: different platforms produce different results:

[Image: test_draw_rounded_rect_reference, generated on macOS]

[Image: test_draw_rounded_rect_generated, generated on Windows]

[Image: test_draw_rounded_rect_diff, the diff (mean absolute error)]

As you can see, there are differences in the top-left corner (outer rounded rect). Is this behaviour normal?

I pasted the two images into Photoshop and set the top layer to “difference”; the result looks completely black, even when boosting the levels massively…

Using your eye is not a good way to determine pixel perfection. The diff in my image shows the mean absolute error with a zero threshold.

That said, I’m not sure whether this is a JUCE error or a quantization error in the PNG on different platforms.

I didn’t just trust my eyes, I boosted the levels. To be fair, Photoshop apparently rounded somewhere; red is indeed off by one. In your diff image it looked like a lot more.

Can you clarify how you arrive at the different images? Are you loading PNGs and displaying those, or are you stroking the lines? Is the error present when displaying the images in a JUCE application, or only when you compare PNGs saved using the JUCE API? There are many unknowns at play here.

My image is the absolute difference of the two images, computed on floating-point channels and divided by the mean absolute error, roughly (before clamping and rescaling back to uint8):

diff_image = numpy.abs(image1 - image2) / mae

Both images are generated using the JUCE API and saved to PNG with juce::PNGImageFormat, then loaded back using imageio and compared with numpy; if the mean absolute error is greater than 0, I produce the diff image and fail the rendering test.
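
Roughly, the whole check looks something like this. This is only a sketch, assuming imageio and numpy; the function and file names here are placeholders, not my actual test code:

# Sketch of the comparison described above, assuming imageio and numpy;
# the function name and diff output path are placeholders.
import imageio
import numpy

def compare_renders(reference_path, generated_path, diff_path):
    ref = imageio.imread(reference_path).astype(numpy.float64)
    gen = imageio.imread(generated_path).astype(numpy.float64)

    mae = numpy.abs(ref - gen).mean()   # mean absolute error over all channels
    if mae == 0.0:
        return True                     # pixel perfect, nothing to report

    # Scale the per-pixel difference by the MAE so tiny deviations become
    # visible, then clamp and convert back to uint8 before saving the diff.
    diff = numpy.abs(ref - gen) / mae
    imageio.imwrite(diff_path, numpy.clip(diff, 0, 255).astype(numpy.uint8))
    return False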

Is a difference of 1/255 (≈0.39%) in two of the three color channels, for four pixels out of 160,000 (0.0025%), on two different platforms, using different compiler versions, really worth investigating?

Maybe not, but JUCE used to be pixel-perfect on all platforms. These kinds of slight differences might be hiding bigger issues, and having a non-exact software renderer across platforms will make it difficult to write extensive, working rendering tests that ensure no graphics breakage is introduced.

We have been using rendering tests for years with our in-house renderer, and ensuring pixel perfection on all platforms, and even across graphics stacks (software, Metal, OpenGL ES, OpenGL), made sure we didn’t break something or introduce glitches at every release. Over the years, it helped us make huge restructurings and refactors in the graphics stack (as we gradually introduced our low-level rendering abstraction) while ensuring smooth sailing.

I would instead add a certain tolerance to your tests, so deviations of, e.g., 1% are allowed. No user will ever notice or complain about such minuscule differences.
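
Something along these lines would do it; just a rough numpy sketch rather than JUCE code, with the 1% figure above as a stand-in threshold:

# Hypothetical tolerance-based check, assuming two uint8 numpy arrays of the
# same shape; a 1% tolerance is roughly 2.5 levels out of 255.
import numpy

def images_match(reference, generated, tolerance=0.01):
    ref = reference.astype(numpy.float64) / 255.0
    gen = generated.astype(numpy.float64) / 255.0
    # Pass as long as no channel of any pixel deviates by more than the tolerance.
    return numpy.max(numpy.abs(ref - gen)) <= tolerance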

You say JUCE used to be pixel-perfect, but now, suddenly, it isn’t. The differences might be due to compiler and compiler-setting differences, as I’m unaware of any code changes to path rendering in the last few years.

Apple Clang is on version 15.x, but for Windows, the current version is 17.x. It’s not unreasonable to assume that some compiler-related optimizations work slightly differently now.
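
As a toy illustration (plain Python, nothing to do with the actual renderer code), a coverage value that comes out one tiny step lower after a differently optimised computation is enough to flip the quantised 8-bit channel:

# Toy example: two nearly identical coverage values straddle a rounding
# boundary, so the 8-bit result differs by one.
coverage_a = 0.5                    # result of one evaluation order
coverage_b = 0.4999999999999999     # same maths, reordered/fused differently
print(int(coverage_a * 255 + 0.5))  # 128
print(int(coverage_b * 255 + 0.5))  # 127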

In my experience, JUCE has never been pixel-perfect, unless you use the same compiler and renderer on all platforms. Font outline generation/rendering handled by the platforms produces slightly different results. Path rendering handled by CoreGraphics on macOS and iOS also potentially renders slightly differently from the software renderer, the OpenGL renderer, or the upcoming Direct2D renderer. As long as the differences are small enough, I don’t care. Granted, this obviously makes running automated render tests across platforms difficult, but maybe that’s not really necessary? Just do a test per platform against a reference image created on the same platform.

In my case I’m forcing the Graphics class to take a LowLevelGraphicsSoftwareRenderer, just to be sure I’m testing the same renderer on all platforms.

Font rendering is a different story; not all backends use FreeType in JUCE.

I will probably try to go down the route of having reference images per platform instead.
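
Something like this hypothetical helper could pick the reference per platform; the directory layout and naming are made up for illustration:

# Hypothetical helper for per-platform reference images; the directory layout
# and naming scheme are invented for this sketch.
import platform
from pathlib import Path

def reference_path(test_name, root="render_tests/references"):
    # e.g. render_tests/references/Darwin/test_draw_rounded_rect.png
    return Path(root) / platform.system() / (test_name + ".png")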

I think you’re spending more time testing JUCE code than your own product code

Oh, I didn’t realize we were running a charity for JUCE development. Thanks for the reminder.

Can your rigorously tested products be seen/demoed somewhere?
The link in your profile does not work.
Just curious

You’ve probably already played it once in your lifetime.

Is fast math enabled?

Interesting idea, but no, unless it’s set by juce::juce_recommended_config_flags.

IMO this is the right thing to do - I wouldn’t want someone in my codebase to implement a workaround for a bug they haven’t bothered to report, only for us to have to pointlessly maintain that workaround forever after. I can think of a few cases where “workarounds” have become so intertwined with the rest of the codebase that even when the things they were working around have been fixed, we can’t remove the workaround for fear of breaking things.

I also think this is the right thing. In the age of 4K Retina displays, the need to be pixel-perfect, especially for things like anti-aliasing, will only hold back improvements. The OpenGL renderer, for example, could be so much more performant if it didn’t try to be pixel-perfect.

I can think of a few cases where “workarounds” have become so intertwined with the rest of the codebase that even when the things they were working around have been fixed, we can’t remove the workaround for fear of breaking things.

I agree, and unless the OP is planning to fork JUCE to implement a fix themselves, I doubt that there’s much they can do. Reporting it as a bug isn’t likely to elicit much more of a response than it’s already gotten, since the JUCE team is already bogged down with the upcoming JUCE 8 release.

If GUI features and absolute 100% pixel-perfect drawing on every platform were my #1 requirement for a product, I would probably investigate GUI frameworks other than JUCE 🤷‍♂️

I don’t play games, sorry

Leaving aside the discussion of whether the test is worthwhile to pursue, here’s an idea of what might have led to the artifacts:

The placement of the displayed image is in integer pixels. Since those are logical pixels and the scale factor is a floating-point value, I would be surprised not to see such differences.
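
As a toy sketch of that idea (plain Python, not JUCE code): an integer logical position multiplied by a non-integer scale factor lands between physical pixels, so the anti-aliased edge coverage hinges on exactly how that floating-point product comes out:

# Toy sketch: an integer logical coordinate times a fractional scale factor
# gives a fractional physical coordinate, and the fractional part decides how
# an anti-aliased edge is shaded.
scale = 1.25        # hypothetical display scale factor
logical_x = 13      # integer logical pixel position
physical_x = logical_x * scale
print(physical_x, physical_x % 1.0)   # 16.25 0.25 -> edge covers a quarter pixel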
