OpenGLComponent deleting context


I have OpenGLComponents rendering in a thread, and now when I resize them, I run into context problems. It looks like the ‘OpenGLComponentWatcher’ class is deleting the context on the main thread when it shouldn’t.

What is that class, and how do I keep it out of my business? It seems to do a similar job to the component itself, just in a separate object?



It’s just an object that gets notified when the component moves relative to its parent window, is hidden, or is moved to a different window, etc., so that it can make sure the heavyweight child window is in the right place. It should only delete the context if you hide and re-show the component, or something like that (?)


I’m using show/hide to start and stop the rendering thread, which creates and deletes the context as needed. I ran into this problem when resizing; I then added an extra hide/show pair in the resize handling, and that’s when I found the null context. Either my hiding is triggering it, or the resize is.

Anyway, if an OpenGLComponent is used from a thread, all context creation and deletion needs to happen on that thread, and this approach is inherently main-thread based. I guess it should call a function in the main class that can be overridden, like the other methods that need to be overridden when using threaded OpenGLComponents?
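As a sketch of that lifecycle in portable C++ (std::thread standing in for a juce::Thread, and FakeContext standing in for a real GL context; both names are invented for illustration): the thread is started on show, stopped on hide, and the context is created and destroyed only on the thread itself.

```cpp
#include <atomic>
#include <future>
#include <thread>

// Illustrative stand-in for an OpenGL context owned by the render thread.
struct FakeContext { bool alive = true; };

class RenderThread
{
public:
    ~RenderThread() { stop(); }

    // Would be called from visibilityChanged() when the component is shown.
    void start()
    {
        if (running.exchange (true))
            return;

        std::promise<void> ready;
        auto readyFuture = ready.get_future();

        worker = std::thread ([this, &ready]
        {
            context.store (new FakeContext());   // create the context on the thread
            ready.set_value();

            while (running.load())
                std::this_thread::yield();       // renderOpenGL() would go here

            delete context.exchange (nullptr);   // delete it on the same thread
        });

        readyFuture.wait();   // don't return until the thread owns a context
    }

    // Would be called from visibilityChanged() when the component is hidden.
    void stop()
    {
        running.store (false);
        if (worker.joinable())
            worker.join();
    }

    bool hasContext() const { return context.load() != nullptr; }

private:
    std::atomic<bool> running { false };
    std::thread worker;
    std::atomic<FakeContext*> context { nullptr };
};
```

A real JUCE version would hang this off visibilityChanged(), but the ownership rule is the same: only the render thread ever touches the context.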



Look at the recent changes to OpenGLComponent / OpenGLContext around the delete method.
I hit much the same issue, and Jules added a way to sort it out (basically, you need to delete/recreate the context yourself, and the additions make that possible).


Sorry, I don’t see where. I have my own context subclasses, and handle all my own creation and deletion. The problem is my OpenGLComponent subclasses: I’ve overridden everything marked for overriding, but the context is now being deleted ‘for me’. I don’t see how the context changes would help unless I write special code to detect whether the context is attached to an OpenGLComponent?

There’s this comment for deleteContext:

[quote] This must only be called on the message thread, or will deadlock.
On background threads, call getCurrentContext()->deleteContext(), but be careful not

Now I think the watcher is calling it.

I think the watcher should call an overridable method rather than calling deleteContext (which is not virtual) directly.

Or what am I missing? I do seem to have to call the position update so the windows still match up, i.e. the watcher does have a purpose.
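What that might look like in schematic form; ComponentBase, Watcher, and contextNeedsDeleting are made-up names for illustration, not JUCE’s actual API. The watcher invokes a virtual hook, so a threaded subclass can defer the actual deletion to its render thread:

```cpp
// Schematic component base: the watcher calls a virtual hook instead of
// deleting the context directly, so subclasses can intercept it.
class ComponentBase
{
public:
    virtual ~ComponentBase() = default;

    // Default behaviour: delete the context immediately (on the message thread).
    virtual void contextNeedsDeleting() { deleteContextNow(); }

    bool hasContext() const { return contextAlive; }

protected:
    void deleteContextNow() { contextAlive = false; }

private:
    bool contextAlive = true;
};

// A threaded subclass defers the deletion to its render thread instead.
class ThreadedComponent : public ComponentBase
{
public:
    // Don't delete here; just remember that a deletion was requested.
    void contextNeedsDeleting() override { deletionRequested = true; }

    // Called from the render thread's loop, where GL teardown is safe.
    void serviceRenderThread()
    {
        if (deletionRequested)
        {
            deleteContextNow();
            deletionRequested = false;
        }
    }

    bool deletionRequested = false;
};

// Schematic watcher: it notifies the component rather than acting itself.
struct Watcher
{
    void componentPeerChanged (ComponentBase& c) { c.contextNeedsDeleting(); }
};
```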



What I’ve done in my code is give the rendering thread three steps (initialisation, thread loop, destruction).
I do the initial context creation on the message thread (another Windows™ strangeness), but as soon as the thread initialises, it rebuilds the context for the thread.
In the last step, I call the OGL code to clean up the context.

Whenever I detect a change in size, I set a flag for the thread, which calls deleteContext and rebuilds a new context for the updated size.
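That handshake reduces to a small pattern, sketched here in plain C++ with a counter (contextGeneration, invented for the example) standing in for the real delete/recreate of a GL context:

```cpp
#include <atomic>

// Minimal model of the flag-based resize handshake: the message thread calls
// notifyResized(); the render thread calls pollResize() once per frame and
// rebuilds its context when it sees the flag.
class ResizeHandshake
{
public:
    void notifyResized() { resized.store (true); }   // message thread

    // Render thread: returns true when the context was rebuilt this frame.
    bool pollResize()
    {
        if (! resized.exchange (false))
            return false;

        ++contextGeneration;   // stands in for deleteContext() + createContext()
        return true;
    }

    int generation() const { return contextGeneration; }

private:
    std::atomic<bool> resized { false };
    int contextGeneration = 0;   // touched only by the render thread
};
```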


I’m doing the same, minus the Windows silliness. But now my context is disappearing (before the point in my thread where I want it to). The OpenGLComponentWatcher has code to delete the context when the main component moves or changes.

Things were working fine, but now when I change the size, or hide and then show the app, the context clean-up fails. This has changed in the last few months.

Can you please look at what you’ve overridden in your subclass? It seems like there’s some other default behaviour I need to prevent.


[code]void OGL::freeResource (bool inOGLThread)
{
    // OGL code here to delete textures etc...

    // Make sure the context is deleted from now on
    if (inOGLThread)
    {
        // Currently an OpenGL context should be deleted on the same thread
        // where it was created. But the Win32 implementation in Juce is
        // illogical: it (correctly) deletes the context, but it also tries
        // to delete the window that created the context, and that can't
        // happen on the presentation thread.
        // So I've split Juce's OpenGL context code so it's possible to delete
        // only the context; the message thread later deletes the remaining stuff.
        if (getCurrentContext())
            getCurrentContext()->deleteContext();
    }
    else
    {
        deleteContext();
    }
}[/code]

[code]// Another thing that I'm using in my thread's main loop
struct ScopeContext
{
    OpenGLComponent* component;

    ScopeContext (VideoComponent::VideoOutput& comp)
    {
        component = dynamic_cast<OpenGLComponent*> (comp.getInternalComponent());
        if (component && component->getCurrentContext())
            component->getCurrentContext()->setSwapInterval (1);
    }

    void reownLostContext()
    {
        if (component && ! component->isActiveContext())
            if (component->getCurrentContext())
                component->getCurrentContext()->setSwapInterval (1);
    }

    ~ScopeContext()
    {
        if (component)
            component->makeCurrentContextInactive();
    }
};[/code]
That won’t help you much since I don’t know exactly what your code does, but it may give you some ideas.


Very similar. So maybe I’m hitting it because of OS X, although I doubt it. You don’t seem to have the OpenGL clean-up code I have. I wonder whether, by the time your component is cleaning up, the context has actually already been deleted, and it’s just not a problem on Windows, and/or you’re not doing anything to reveal it.

In my case, I override visibilityChanged. On hide, I stop my thread. On show, I start the thread. The thread does:

cleanup OpenGL
delete context

The problem is that by the time I get to ‘cleanup OpenGL’, the context is gone (null and deleted). I suspect it is for you too; it’s just not causing you a problem. I was hoping to see something obvious in your code, like overriding some visibility- or size-related callback or method.

To be clear - I’m betting your context is being deleted every now and again without you realizing it too.



My case is like this:
ScopedContext (see post above)
freeResource (see post above)

As soon as the context is cleaned up (whatever the cause), the ScopedContext object detects this and makes the context active / recreates it.


Would you please check if you still have a context after your loop? You may find that (depending on timing, I suppose) your context is being deleted by the component watcher.

assert (context); or similar, just before/in your free resource?



I still have a context after the loop (I already have an assert there, and it works: it used to trigger on Windows, until I forced recreation of the context in the thread).
The loop doesn’t manage the component (the component still exists even when the rendering thread is stopped).
The watcher doesn’t delete the context, since the component’s peer hasn’t changed (at least in my codebase).

The message thread usually waits until the rendering thread is done, and then deletes the OGL component (but the context is gone already, freed in freeResource).


Ok, so coming back, there’s another change that breaks any chance of re-using an OpenGLComponent to render in a thread:

When I do makeCurrentContextActive, it runs createContext, which makes a WindowGLContext, which makes an NSViewComponentInternal, which makes a ComponentWatcher, which etc. etc

and then an assert fails on addComponentListener because it’s not on the MessageManager thread (right, I know).

So, Jules, is this sort of thing something you’d like to handle, or should we just say that OpenGLComponents are no longer threadable (i.e. not usable for anything other than a demo)?

Currently, I’d have to make (and keep updated) a completely parallel set of context handling. I did that on Linux because of X threads, and having three (four, five with Android?) sets of platform-specific code that’s 90% identical to code already in juce is pretty asinine.



If you can suggest anything I could do to help, I’d certainly like to do it - I don’t want to limit the usefulness of juce’s openGL support!


Great. So it seems that the newer ComponentWatcher approach clashes a bit with the original technique of ‘override these functions and you’re safe to render when you want’.

Since all the pieces exist in juce (pixel formats, OpenGLContexts, Threads), which approach sounds more workable:

An OpenGLComponent subclass that uses a thread.
The OpenGLComponent could have a switch to render in a thread (something like Arrays, where you have an option).

The first approach would actually need two subclasses (threaded and non-threaded), since there’s a lot of stuff that needs to not happen in the normal flow of things. The second approach would also be very nifty if the thread could be switched on and off; I would personally use that to drop to lower performance while the user is doing things, then ramp back up.

I suppose the other option is to continue with the ‘overridable’ system, but that didn’t work on Linux (with threads) anyway.

Caveats with either: other components won’t render on top properly (except on the desktop).



Alternatively, am I overreacting? It’s happened before.

If the OpenGLComponent took a MessageManagerLock when it creates and deletes a context (so even if it’s on a thread, the other stuff that has to happen is covered), it would cover most cases…
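A minimal model of that idea, with a plain std::mutex standing in for the MessageManagerLock (a real JUCE build would construct a MessageManagerLock object instead): the render thread takes the lock only around context creation and deletion, so message-thread work stays serialized with those events.

```cpp
#include <mutex>

// Stand-in for the message-manager lock; invented for this sketch.
static std::mutex messageManagerMutex;

struct GuardedContext
{
    bool alive = false;

    // Called from the render thread: hold the "message manager" lock while
    // creating the context, so any peer/window setup can't race with it.
    void create()
    {
        std::lock_guard<std::mutex> lock (messageManagerMutex);
        alive = true;    // real code: createContext() plus peer setup
    }

    // Called from the render thread: same lock around deletion.
    void destroy()
    {
        std::lock_guard<std::mutex> lock (messageManagerMutex);
        alive = false;   // real code: deleteContext() plus peer teardown
    }
};
```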

I suppose then, back to the original problem: the ComponentWatcher deleting the context. That won’t fly, since it can do the deletion on the main thread, and it leaves no option to clean up OpenGL objects nicely (it also breaks RAII, since the component, now with no context, tends to hold handles to OGL items).

A callback for when the context should be deleted? A threaded version would start shutting down the thread (the thread’s close would clean up and delete the context), while a standard implementation could clean up OpenGL textures etc.



I don’t get the issue you’re encountering. I do all the OGL rendering in a thread, and I’ve never hit any issue on either Linux or Windows.
The only thing I can think of is to create the first context on the message thread, since skipping that breaks on Windows.
Then I just make sure that when the component is deleted, the thread is stopped. It’s quite easy: put a thread.stopThread() in your destructor, then you can acquire the context one last time (in your destructor) and perform the OGL deletions (textures and so on) in that context, that is, on the message thread’s stack.
I wonder why you need to start/stop the thread on visibility changes, as the overhead of creating a thread is hundreds of times larger than simply waiting on an event in the main loop to avoid doing any work while not visible.
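The wait-on-an-event alternative can be sketched with a condition variable; WaitableFlag here is a hand-rolled stand-in for juce::WaitableEvent. The render loop parks in wait() while the component is hidden, so the thread never has to be destroyed and recreated:

```cpp
#include <condition_variable>
#include <mutex>

// Stand-in for juce::WaitableEvent: the render loop blocks in wait() while
// the component is hidden, instead of the thread being torn down.
class WaitableFlag
{
public:
    void signal()        // message thread: component became visible
    {
        std::lock_guard<std::mutex> lock (m);
        flagged = true;
        cv.notify_all();
    }

    void reset()         // message thread: component was hidden
    {
        std::lock_guard<std::mutex> lock (m);
        flagged = false;
    }

    void wait()          // render thread: park until visible again
    {
        std::unique_lock<std::mutex> lock (m);
        cv.wait (lock, [this] { return flagged; });
    }

    bool isSignalled()
    {
        std::lock_guard<std::mutex> lock (m);
        return flagged;
    }

private:
    std::mutex m;
    std::condition_variable cv;
    bool flagged = false;
};
```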


My initial thoughts (which may be naive, as I’ve never done any serious openGL programming myself…) would be to have an abstract class called “OpenGLRenderer”, which just has a pure virtual “render” method, so you’d implement one of those. Then you’d give it to the OpenGLComponent, with a switch to tell it to call it on a dedicated thread, or just during the normal paint callback. Does that sound sensible?


You would need pure virtual “initialise” and “clean up” methods too, since Microsoft added some very stupid limitations: context / texture / vertex array / PBO creation and deletion must happen on the same thread, and for some of them, on the message thread.


so something like this?

[code]class OpenGLRenderer
{
public:
    OpenGLRenderer() {}
    virtual ~OpenGLRenderer() {}

    virtual void initialiseOpenGL() = 0;
    virtual void shutdownOpenGL() = 0;
    virtual void renderOpenGL() = 0;
};[/code]
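A hypothetical implementer of that interface might look like this (the abstract class is repeated so the example is self-contained; the counters are just a stand-in recording that each phase ran):

```cpp
// The interface proposed above, repeated so this snippet compiles on its own.
class OpenGLRenderer
{
public:
    OpenGLRenderer() {}
    virtual ~OpenGLRenderer() {}

    virtual void initialiseOpenGL() = 0;
    virtual void shutdownOpenGL() = 0;
    virtual void renderOpenGL() = 0;
};

// Example implementation: the host (an OpenGLComponent or its render thread)
// would call initialiseOpenGL once, renderOpenGL per frame, shutdownOpenGL once.
class MyRenderer : public OpenGLRenderer
{
public:
    void initialiseOpenGL() override { ++initialised; }  // build textures, VBOs...
    void renderOpenGL() override     { ++frames; }       // draw one frame
    void shutdownOpenGL() override   { ++shutDown; }     // free GL objects

    int initialised = 0, frames = 0, shutDown = 0;
};
```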