GLSL seems broken on Ubuntu 18.04

It seems this change broke GLSL (via juce::OpenGLShaderProgram) on Ubuntu 18.04. AFAICS that OS only supports GLSL version 1.30; since the above change, the version is always specified as 1.50, whereas before it was left unspecified whenever JUCE_OPENGL3 was undefined.
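
For reference, here’s a minimal sketch of the behaviour change as I understand it (not the actual JUCE source):

String getGLSLVersionStringBefore()   // pre-change behaviour (sketch)
{
   #if JUCE_OPENGL3
    return "#version 150";
   #else
    return {};  // no directive, so the driver's default GLSL version applies
   #endif
}

String getGLSLVersionStringAfter()    // post-change behaviour (sketch)
{
    return "#version 150";  // always emitted, which a GLSL 1.30-only driver rejects
}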

I guess it makes sense to make this an overridable setting? (And maybe also lower the default; Ubuntu 18.04 is still in common use.)

p.s. the comment above the linked diff also seems out of date.

Thanks for reporting this issue. I installed a copy of Ubuntu 18.04 in a VM and wasn’t able to repro the issue (the OpenGLDemo reports a GLSL version of 3.30), so I guess this depends on the hardware/driver being used.

Would you be able to test out this patch on your machine and check if it resolves the problem? Thanks!

opengl.patch (4.8 KB)

After applying the patch, there’s now a more subtle error. OpenGLShaderProgram::getLanguageVersion() returns 1.3, which is less than 1.5, so OpenGLHelpers::getGLSLVersionString() returns an empty string and no #version prefix gets prepended to the shader code, leaving the implicit version at 1.1. At the same time, since 1.3 is more than 1.2, the string conversions in the OpenGLHelpers::translateXXXShaderToV3() functions still take place. The result is GLSL code that’s incompatible with version 1.1 (see the sketch below), and error messages such as

error: 'in' qualifier in declaration of 'position' only valid for function parameters in GLSL 1.10
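
To illustrate, here’s a minimal sketch of the two inconsistent thresholds described above (simplified, not the actual JUCE source; the version is modelled as a plain double):

#include <string>

std::string getGLSLVersionString (double languageVersion)
{
    // The directive is only emitted for GLSL >= 1.5...
    if (languageVersion >= 1.5)
        return "#version 150";

    return {};  // ...so 1.3 gets no directive and the implicit version is 1.1
}

bool shouldTranslateToV3 (double languageVersion)
{
    // ...while the 'in'/'out' rewrite already kicks in above 1.2, leaving
    // 1.3 with V3-style code that a GLSL 1.10 compiler rejects.
    return languageVersion > 1.2;
}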

Thanks for letting me know. Were you previously enabling JUCE_OPENGL3=1 on this platform? If so, it looks like we would have run into a similar problem before this change too, unless I’m missing something.

I found a way to force my VM to use a GL 3.1 / GLSL 1.4 context, and the following patch seems to work there, as well as on my Mac using both a 2.1 and a 4.1 context. Could you try this one out on your machine please?

opengl.patch (7.2 KB)

I was not.

Works OK with this patch. Thanks!

Thanks for trying that out; the change is now on develop:

I was upgrading my OpenGL code and ran into this.

Is it correct to be hard-coding #version 150 for every GL version?
According to this page, each GL version has a quite specific GLSL version:
https://titanwolf.org/Network/Articles/Article?AID=2b6dd53f-046d-46b1-b325-ed823a60fce6

Here’s my edited getGLSLVersionString() in case it’s at all helpful. It fixes my issues on Mac (a 16-inch Intel MacBook Pro, FWIW):

String OpenGLHelpers::getGLSLVersionString()
{
    auto glVers = getOpenGLVersion();

   #if JUCE_OPENGL_ES
    // GLSL ES 1.00 predates the "es" profile token, so its directive is
    // just "#version 100"; the token is only valid from 3.00 onwards.
    if (glVers.major < 3)  return "#version 100";

    return "#version 300 es";
   #else
    if      (glVers == Version (2, 0))  return "#version 110";
    else if (glVers == Version (2, 1))  return "#version 120";
    else if (glVers == Version (3, 0))  return "#version 130";
    else if (glVers == Version (3, 1))  return "#version 140";
    else if (glVers == Version (3, 2))  return "#version 150";
    else if (glVers >= Version (3, 3))
    {
        // From 3.3 onwards the GLSL version tracks the GL version,
        // e.g. GL 4.1 -> "#version 410"
        return "#version " + String (glVers.major) + String (glVers.minor) + "0";
    }

    return "#version 110";
   #endif
}

I think the idea here is that users can write GLSL compatible with #version 110 by default, and then use the translateVertexShaderToV3 and translateFragmentShaderToV3 functions to translate the shaders where necessary to use the newer style. I’m not sure whether a more fine-grained approach to the version numbers is useful. AFAIK every OpenGL version after 3.2 should support GLSL #version 150, although it’s tricky to find any definitive documentation on this topic.
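
For example, a minimal sketch of that pattern (assuming a juce::OpenGLShaderProgram named program, along the lines of the OpenGLDemo):

// Shader source written against the baseline #version 110 style.
const char* vertexShader110 =
    "attribute vec4 position;\n"
    "void main() { gl_Position = position; }\n";

// translateVertexShaderToV3() rewrites the old-style qualifiers
// ('attribute'/'varying') into the newer 'in'/'out' style when the
// context's GLSL version calls for it, and leaves the source alone otherwise.
if (! program.addVertexShader (OpenGLHelpers::translateVertexShaderToV3 (vertexShader110)))
    DBG (program.getLastError());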

To give a more complete answer, it would be useful to know exactly what problems you encountered. For example, if you want to use features that are specific to GLSL 4.1, then using #version 150 definitely won’t work for you. At the same time, JUCE’s context creation won’t be guaranteed to do the correct thing either, because it can only explicitly request an OpenGL 3.2 context, which isn’t required to support GLSL 4.1.

Yes, that’s exactly the situation: we need 4.1 features, and we were generating a shader at runtime using the reported GLSL version string, so OpenGL threw compilation errors.

TBH I’ve inherited this code, but it looks like we’ve customised the JUCE GL classes to allow 4.1. We’ve added OpenGLContext::openGL4_1, and we’ve added support in the platform-native GL headers as well (e.g. in juce_OpenGL_osx.h, createAttribs() can now add NSOpenGLProfileVersion4_1Core).
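
For anyone curious, a sketch of that kind of customisation on the macOS side (not our actual diff), where the pixel-format attributes select the profile:

// Sketch only: an attribute list requesting a 4.1 core profile instead
// of 3.2 (needs <AppKit/AppKit.h> in an Objective-C++ translation unit).
NSOpenGLPixelFormatAttribute attribs[] =
{
    NSOpenGLPFAOpenGLProfile, (NSOpenGLPixelFormatAttribute) NSOpenGLProfileVersion4_1Core,
    NSOpenGLPFADoubleBuffer,
    NSOpenGLPFAColorSize,     32,
    NSOpenGLPFADepthSize,     24,
    0
};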

Is there a reason JUCE doesn’t support anything beyond 3.2?

The majority of the OpenGL support was added quite a while ago, so perhaps later standards hadn’t been widely adopted at that point. I think it would be safe to add newer standards now, so the main blocker is just the amount of time available to the team. I’ll make a note to revisit this, although this work will need to be appropriately prioritised against the other pending feature requests and bugfixes.

Well, here’s my vote for supporting recent OpenGL versions. JUCE has done a fair bit of work on OpenGL recently, so it would be a shame to have it hobbled in any way.

IIRC, the issues we had with OpenGL 3 were performance-related (on certain Macs); 4.1 ran the same shaders much more efficiently.

More recent GL versions can now be enabled on the develop branch:
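
For example, requesting one of the newer profiles now looks something like this (a sketch; see the develop headers for the exact enum names):

juce::OpenGLContext context;

// Request a 4.1 core profile before attaching the context to a component;
// behaviour when the platform can't provide one is platform-dependent.
context.setOpenGLVersionRequired (juce::OpenGLContext::openGL4_1);
context.attachTo (someComponent);  // someComponent: any visible juce::Component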