Uninitialized memory in OpenGLTexture::create()

We had some problems in the past with juce::Images showing artifacts at their borders when rendered with the OpenGL backend, but not with the software rasterizer.

I just looked into this, and it turns out to be an issue with how juce::OpenGLTextures are created for those images when they get rendered.

If the width and/or height of the image is not a power of two, the texture is created with power-of-two dimensions and the image is copied onto it as a smaller “sub-image”. However, the texture is not cleared beforehand, so the texels outside the sub-image contain uninitialized memory, which shows up as rendering artifacts depending on scaling and texel mapping.
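For context, this is roughly what happens at the moment (a paraphrased sketch, not the verbatim JUCE source; the nextPowerOfTwo rounding is my reading of the behaviour, not a quote of the actual code):

// Paraphrased sketch of the current code path, e.g. for a 100x100 image:
const int width  = nextPowerOfTwo (w);    // 100 -> 128
const int height = nextPowerOfTwo (h);    // 100 -> 128

// The padded texture is allocated with a null data pointer, so the
// texels outside the w x h region keep whatever the driver left there...
glTexImage2D (GL_TEXTURE_2D, 0, internalformat,
              width, height, 0, type, GL_UNSIGNED_BYTE, nullptr);

// ...and only the w x h sub-image is filled with the actual pixels:
glTexSubImage2D (GL_TEXTURE_2D, 0, 0, topLeft ? (height - h) : 0, w, h,
                 type, GL_UNSIGNED_BYTE, pixels);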

I fixed the issue in our local fork by changing parts of juce::OpenGLTexture::create to look like this:

if (width != w || height != h)
{
    // The texture is larger than the image, so zero-fill the whole texture
    // first instead of leaving the power-of-two padding uninitialized.
    const int pixelSize = internalformat == GL_ALPHA ? 1 : 4;
    HeapBlock<GLubyte> clearData ((size_t) width * height * pixelSize, true);

    glTexImage2D (GL_TEXTURE_2D, 0, internalformat,
                  width, height, 0, type, GL_UNSIGNED_BYTE, clearData.get());

    // Then copy the actual image into its corner of the texture, as before.
    glTexSubImage2D (GL_TEXTURE_2D, 0, 0, topLeft ? (height - h) : 0, w, h,
                     type, GL_UNSIGNED_BYTE, pixels);
}
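
For anyone who wants to reproduce this: loading any image with non-power-of-two dimensions hits that code path (a minimal sketch, assuming an OpenGL context is active on the current thread):

juce::Image image (juce::Image::ARGB, 100, 100, true);   // 100 is not a power of two
juce::Graphics g (image);
g.fillAll (juce::Colours::red);

juce::OpenGLTexture texture;
texture.loadImage (image);   // internally creates a 128x128 texture

// Depending on scaling and filtering, the uninitialized texels around the
// 100x100 sub-image can then bleed into the rendered output.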

I think this or something similar should be added to the JUCE/develop branch.


So, similar to this?

I’m not sure. The issue we are seeing is only related to drawing juce::Images whose dimensions are not powers of two. The issue you mentioned reads a bit differently to me.