OpenGLComponent::getCurrentContextComponent()

Jules, if you’re really going for it, could I please request the features that would make me not need to use anything else?

An abstract base class that’s an OpenGL context holder (ABC_GL)
ABC_GL would hold listeners to its sharers, and tell them when the context changes - especially when it moves screens
A switch-drawable function - to bind to FBOs, pbuffers

Concrete classes
Offscreen context - no drawable, or an FBO drawable, often used to render and share textures (maybe an array of FBOs? I have a class)
Fullscreen
Window
Component
Peer
All of these should be fully sub-classable - the current component isn't, really.
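
Something roughly like this is what I'm imagining - a quick sketch only, with made-up names, just to show the shape of it:

[code]#include <vector>

// Rough sketch only - none of these class names exist in juce, they're
// just here to show the shape of what I'm asking for.
class AbstractGLContext
{
public:
    virtual ~AbstractGLContext() {}

    // sharers register themselves here and get told whenever the
    // underlying context changes - especially when it moves screens
    class Listener
    {
    public:
        virtual ~Listener() {}
        virtual void sharedContextChanged (AbstractGLContext* context) = 0;
    };

    void addListener (Listener* listener)    { listeners.push_back (listener); }

    // make this context current (or not) on the calling thread
    virtual bool makeActive() = 0;
    virtual void makeInactive() = 0;

    // switch-drawable function: rebind to an FBO, pbuffer, window, etc,
    // without tearing the whole context down
    virtual bool switchDrawable (void* nativeDrawableHandle) = 0;

protected:
    std::vector<Listener*> listeners;
};[/code]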

That would be a darn good start. Maybe some texture helpers later down the road. A great helper would be a component-snapshot-to-texture function.

Thanks for listening!

Bruce

Ok… This is turning into a bigger redesign than I planned! I’ll play around with the classes and see what I can come up with…

Ok folks, I’ve just checked in my OpenGL changes. I’ve re-organised it around a virtual base class for contexts. Take a look and tell me what’s missing!

Thanks for that Jules, it looks much more flexible now!

Here are the missing things:

. Setting the pixel format without re-creating the window fails on the messy win32 implementation.
Here is what MSDN says:
“Setting the pixel format of a window more than once can lead to significant complications for the Window Manager and for multithread applications, so it is not allowed” (…)
Here is a not completely updated version of my test application if you want to try.

. When writing a multi-threaded app, we need to initialise the context in the OpenGL worker thread. OpenGLContext::createContextForWindow() does the initialisation, but we then need a way to use that context in an OpenGLComponent afterwards - it would be nice to have it as an optional constructor parameter, for example.
Or maybe I missed something?

. OpenGLPixelFormat::getAvailablePixelFormats() uses a temporary Component to find which pixel formats are available. However, this won’t work on multi-GPU systems, because a window positioned on a secondary display driven by another graphics card may not have the same extensions or pixel formats. The component to use should be a parameter to the function, or at least the position of that component.

. There is no way to get the current thread’s currently active context; it should be a simple addition (see the first post in this thread).

. Another simple and effective addition would be ‘swap interval’ control, aka ‘vertical synchronization’, which tells the graphics card to wait for the next screen refresh before drawing the next frame. It avoids tearing artifacts on the screen.

On win32:

[code]void setSwapInterval (int swapInt)
{
    StringArray wglExtensions;
    getWglExtensions (oc->dc, wglExtensions);

    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT = NULL;

    if (wglExtensions.contains ("WGL_EXT_swap_control")
         && WGL_EXT_FUNCTION_INIT (PFNWGLSWAPINTERVALEXTPROC, wglSwapIntervalEXT))
        wglSwapIntervalEXT (swapInt);
}

int getSwapInterval()
{
    StringArray wglExtensions;
    getWglExtensions (oc->dc, wglExtensions);

    PFNWGLGETSWAPINTERVALEXTPROC wglGetSwapIntervalEXT = NULL;

    if (wglExtensions.contains ("WGL_EXT_swap_control")
         && WGL_EXT_FUNCTION_INIT (PFNWGLGETSWAPINTERVALEXTPROC, wglGetSwapIntervalEXT))
        return wglGetSwapIntervalEXT();

    return 0;
}[/code]

On the Mac it should be:

[code]void setSwapInterval (int swapInt)
{
    GLint interval = swapInt;
    aglSetInteger (renderContext, AGL_SWAP_INTERVAL, &interval);
}

int getSwapInterval()
{
    GLint swapInt = 0;
    aglGetInteger (renderContext, AGL_SWAP_INTERVAL, &swapInt);
    return swapInt;
}[/code]

… Linux, anyone?

Thanks thomas - I’ll get onto this as soon as I can.

Ok, just taking a look at this…

[quote]When writing a multi-threaded app, we need to initialise the context in the OpenGL worker thread. OpenGLContext::createContextForWindow() does the initialisation, but we then need a way to use that context in an OpenGLComponent afterwards - it would be nice to have it as an optional constructor parameter, for example.
Or maybe I missed something?[/quote]

How about if I use the makeCurrentContextActive() method to create the context on-demand? That way, if the window is changed, it’ll delete the current context but won’t re-create one until your paint thread tries to start drawing, and then it’ll get the newOpenGLContextCreated() callback on that thread, letting it do the initialisation.
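
Roughly, the idea is that your worker thread could then do something like this (untested sketch - the loop itself and the renderThreadLoop name are just an example, not anything in the library):

[code]#include "juce.h"

// Untested sketch of the idea: a worker thread that relies on
// makeCurrentContextActive() creating the context lazily, so the
// newOpenGLContextCreated() callback lands on this thread.
void renderThreadLoop (OpenGLComponent& comp, Thread& thread)
{
    while (! thread.threadShouldExit())
    {
        // re-creates the context on demand if the window has changed,
        // triggering newOpenGLContextCreated() on this thread first
        if (comp.makeCurrentContextActive())
        {
            // ...do your GL drawing here...

            comp.swapBuffers();
            comp.makeCurrentContextInactive();
        }
    }
}[/code]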

And this swap interval stuff - what’s the “swapInt” parameter supposed to be? A bool? An interval in millisecs? There’s no units or explanation about how to use it…

You mean setPixelFormat() won’t directly set the pixel format, right?
I think that should be ok.

It’s supposed to be an int. Most of the time people want to use:
either 0, to benchmark the time it takes to draw a frame,
or 1, to avoid artifacts in the final build.
I guess most graphics cards don’t support any other value.

However, from the official specification:

Yes, setPixelFormat() would just delete the current context, and it’d be re-created when it’s next needed.

So I guess “the number of frames to wait between swapping buffers” would be a good description.

It is right according to the spec, if my understanding of the English language doesn’t betray me.

Ok, checked in now if you want to have a play. I’ve not actually tested the swap interval stuff yet, though.

. The current tip fails to initialize an OpenGL pixel format on my machine because, as I wrote before, setting the pixel format without re-creating the window fails on the messy win32 implementation.
In juce_win32_Windowing.cpp, OpenGLContext::createContextForWindow() creates a WindowedGLContext that creates the nativeWindow.
Calling SetPixelFormat again then fails, because it’s forbidden to change the pixel format once it has been set. You’ll have to create a new window each time.

. And here is a small typo that I’ve seen at juce_win32_Windowing.cpp:3113 (WindowedGLContext constructor):
pfd.cColorBits = 16;
should be
pfd.cColorBits = 24;
It probably didn’t fail because the driver returns a valid pixel format anyway.

…right, I’m confused now. To use the wglChoosePixelFormatARB, you need to already have a context, and to create a context you need to have first set a pixel format for the DC. But once you’ve set the pixel format, you’re not allowed to change it, so how can wglChoosePixelFormatARB possibly work??

Yes, here are the steps to create a windowed context with an ‘extended’ format:

1)
create a Win32ComponentPeer from the component
create a simple context
make the simple context current
query the extensions
if WGL_ARB_pixel_format is found, then:

2)
find the right pixel format
delete the simple context
delete the simple Win32ComponentPeer
create another Win32ComponentPeer
set the extended pixel format

To change the pixel format, you basically need to do the second part (there’s a rough code sketch of this dance at the end of this post).

The important thing is that the windows are at the same position; this determines which graphics card is in charge of the window, so the extensions will always be the same as long as the user doesn’t change the display settings.
That’s also why OpenGLPixelFormat::getAvailablePixelFormats() should take some bounds as a parameter: the available pixel formats depend on the ‘screen’ the window will be on.
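
A sketch of the first part in code - not the actual juce code; the throwaway window’s creation/destruction and the ARB attribute list are left to the caller:

[code]#include <windows.h>
#include <GL/gl.h>

// normally from wglext.h - declared here so the sketch stands alone
typedef BOOL (WINAPI* PFNWGLCHOOSEPIXELFORMATARBPROC) (HDC, const int*, const FLOAT*,
                                                       UINT, int*, UINT*);

// Step 1 of the dance: use a throwaway window to find the extended
// pixel format. The caller then destroys that window, creates a new
// one at the same position, and calls SetPixelFormat() on it exactly
// once with the format index returned here (step 2).
int findExtendedPixelFormat (HWND temporaryWindow, const int* wglAttributeList)
{
    HDC dc = GetDC (temporaryWindow);

    // give the throwaway window any basic format, just so that a
    // context can be created to query the extensions with
    PIXELFORMATDESCRIPTOR pfd = { 0 };
    pfd.nSize       = sizeof (pfd);
    pfd.nVersion    = 1;
    pfd.dwFlags     = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType  = PFD_TYPE_RGBA;
    pfd.cColorBits  = 24;
    pfd.cDepthBits  = 16;

    SetPixelFormat (dc, ChoosePixelFormat (dc, &pfd), &pfd);

    HGLRC temporaryContext = wglCreateContext (dc);
    wglMakeCurrent (dc, temporaryContext);

    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB
        = (PFNWGLCHOOSEPIXELFORMATARBPROC) wglGetProcAddress ("wglChoosePixelFormatARB");

    int format = 0;
    UINT numFormats = 0;

    if (wglChoosePixelFormatARB != 0)
        wglChoosePixelFormatARB (dc, wglAttributeList, 0, 1, &format, &numFormats);

    // throw the temporary context away - the window goes with it
    wglMakeCurrent (0, 0);
    wglDeleteContext (temporaryContext);
    ReleaseDC (temporaryWindow, dc);

    return numFormats > 0 ? format : 0;
}[/code]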

Ok, I think I’ve got it going now. I’ve tried your demo and it seems to work - my changes are checked-in…

Thanks a lot. One last thing is needed:

I need a way to get the currently active context, it should be simple to add something like
static OpenGLContext* OpenGLContext::getCurrentContext();

This was the original purpose of this thread :)
the mac and windows implementations are here: http://www.rawmaterialsoftware.com/juceforum/viewtopic.php?p=11158#11158
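
One possible way to do it (not necessarily how the implementations behind that link work) would be to keep a list of the live contexts and match the calling thread’s native handle - win32 shown here; the mac and linux versions would use aglGetCurrentContext() / glXGetCurrentContext() instead:

[code]// Sketch only: getNativeContextHandle() is a placeholder for whatever
// accessor returns the platform context (the HGLRC on win32).
static Array<OpenGLContext*> knownContexts;   // add in the ctor, remove in the dtor

OpenGLContext* OpenGLContext::getCurrentContext()
{
    void* const activeHandle = (void*) wglGetCurrentContext();

    for (int i = knownContexts.size(); --i >= 0;)
    {
        OpenGLContext* const c = knownContexts.getUnchecked (i);

        if (c->getNativeContextHandle() == activeHandle)
            return c;
    }

    return 0;
}[/code]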

Yes, sorry - I’ve added that now.

Yeah! Finally I can merge this part with my code!

That’s enough new OpenGL features to post something about juce on opengl.org when you release the next version…

Great! Glad we got there in the end! Thanks for all the suggestions.

I’m just updating to the new paradigm ; )

I started by moving my offscreen shared context over, but these two functions are pure virtual:

[code] virtual void updateWindowPosition (int x, int y, int w, int h, int outerWindowHeight) = 0;

virtual void repaint() = 0;[/code]

Could they not be please?

Bruce

So I still seem to be missing something. Can I get a sanity check please? I have an offscreen context that now inherits from OpenGLContext. I make the context, and that’s hunky-dory. I re-implemented all the get/set context stuff, copying the windowed context.

Now I need to create an OpenGLComponent and tell it to share with this context - but I don’t see how to do it. It looks like it expects another component. That would mean I’d have to implement a whole component as well, just to hold the context I need?

Suggestions? I seem to be right back at having to modify juce itself, but in ways that I can’t readily do, such as replacing the createContextForWindow method.

Now we have contexts, can’t the components just share those? So in the constructor, instead of passing the component, they would pass the context?
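
Something like this is all I mean - just the signature I’m imagining, not real juce code, and MyOffscreenContext is my own class:

[code]// sketch of the signature only - share straight with a context rather
// than with another component:
OpenGLComponent (OpenGLContext* contextToShareListsWith = 0);

// ...so an offscreen setup would just be:
MyOffscreenContext sharedContext;          // my own OpenGLContext subclass
OpenGLComponent comp (&sharedContext);[/code]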

Bruce :cry: