JUCE OpenGL


#1

I am developing a rendering engine and am looking for a cross-platform, fast, stable and lightweight GUI library.
My engine is GUI/window independent and wraps almost all OpenGL functions.
I have some questions.

  1. OpenGL log:
    Capability GL_BLEND was enabled!
    GL_TEXTURE_2D was enabled on texture unit GL_TEXTURE0.
    GL_BLEND_SRC is not GL_SRC_ALPHA!

  2. How can I get a correct FPS (frames per second) count in JUCE?
    In the JUCE demo project (OpenGL demo) I tried:
    openGLContext.setSwapInterval(0);

and added this FPS counter in “void renderOpenGL()”:

[code]static DWORD LastFPSTime = GetTickCount();
static int FPS = 0;

++FPS; // count every rendered frame

const DWORD Time = GetTickCount();

if (Time - LastFPSTime > 1000) // one second has elapsed
{
    infoLabel.setText(String(FPS), false);

    LastFPSTime = Time;
    FPS = 0;
}[/code]
This reports 65 FPS. I am trying to make the cube rotate faster, at something like 500 FPS.
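
As a side note, GetTickCount() only has around 10–16 ms resolution on Windows, so for finer measurements something like QueryPerformanceCounter() gives much better timing. A minimal sketch:

[code]// Minimal sketch: high-resolution frame timing on Windows,
// using QueryPerformanceCounter() instead of GetTickCount().
LARGE_INTEGER frequency, now;
QueryPerformanceFrequency (&frequency); // ticks per second
QueryPerformanceCounter (&now);         // current tick count

static LONGLONG lastTicks = now.QuadPart;
const double secondsSinceLastFrame = double (now.QuadPart - lastTicks)
                                     / double (frequency.QuadPart);
lastTicks = now.QuadPart;[/code]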


#2

[quote]1. OpenGL log:
Capability GL_BLEND was enabled!
GL_TEXTURE_2D was enabled on texture unit GL_TEXTURE0.
GL_BLEND_SRC is not GL_SRC_ALPHA![/quote]

I don’t understand your question/point??

By default the rendering is throttled back to 60fps to stop it hammering the CPU, and the GL swapBuffers() call may also be blocking internally until the next frame-boundary. You could hack it to avoid those things if you need to do performance testing.
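
For example, something along these lines (this assumes a JUCE version that provides setContinuousRepainting(); older versions would need to hack the internal delay directly):

[code]// Rough sketch: render unthrottled for performance testing.
openGLContext.setSwapInterval (0);            // don't block on vsync
openGLContext.setContinuousRepainting (true); // repaint as fast as possible[/code]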


#3

[quote=“jules”][quote]1. OpenGL log:
Capability GL_BLEND was enabled!
GL_TEXTURE_2D was enabled on texture unit GL_TEXTURE0.
GL_BLEND_SRC is not GL_SRC_ALPHA![/quote]

I don’t understand your question/point??
[/quote]

Sorry for that. It wasn’t a JUCE problem.

[quote=“jules”]By default the rendering is throttled back to 60fps to stop it hammering the CPU, and the GL swapBuffers() call may also be blocking internally until the next frame-boundary. You could hack it to avoid those things if you need to do performance testing.[/quote]

That is not correct behaviour. Because of this hack, the setSwapInterval() function is broken.
setSwapInterval(1) should lock the frame rate to the monitor refresh rate.
For example:
60Hz = 60 FPS
75Hz = 75 FPS
120Hz = 120 FPS.

In JUCE:
60Hz = 60 FPS
75Hz = 60 + x
120Hz = 60 + x

Here is my patch:

  • Removed hardcoded values.
  • Added functions to turn the FPS throttling off/on.
  • Added functions to set/get the default FPS value.
  • Fixed a problem with the swap interval.
  • Added documentation.
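
The added interface looks roughly like this (the names here are only illustrative, not the exact patch):

[code]// Illustrative sketch of the patched OpenGLContext additions
// (placeholder names, not the actual patch):
void setFrameRateThrottlingEnabled (bool shouldThrottle); // turn the FPS optimization off/on
bool isFrameRateThrottlingEnabled() const;

void setDefaultFrameRate (int framesPerSecond); // set/get the default FPS value
int  getDefaultFrameRate() const;[/code]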

#4

Cool, thanks! I needed to tinker with that stuff anyway for another project I’m doing, so will take a look asap…


#5

Ok, just looking into this a bit more deeply, I’m actually scratching my head wondering why I added the artificial delay in the first place… Can’t really think why anyone would need to throttle it back since AFAICT all the platforms (apart from android, which works differently anyway) can already delay until a frame boundary if you call setSwapInterval (1).

So unless anyone can think of a good reason, I think I’ll probably just lose the internal delay.


#6

There isn’t really a consistent way to block until the next interval, no. So what a trivial implementation will do is run the whole render pass again and again even if it isn’t being viewed.

On Mac, the correct way is to use a CVDisplayLink. On Linux I was using glXWaitVideoSyncSGI (until last week - it seems broken). Windows - don’t know.

There are some theories that glFinish() will block until the next frame… but I see weird (complete) lock-ups on Linux using that approach.

Probably, the ‘proper’ way to do it is to use ARB Sync.
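
Roughly, that would mean dropping a fence into the command stream after the swap and blocking on it, something like:

[code]// Minimal sketch of the GL_ARB_sync idea (OpenGL 3.2+ / GL_ARB_sync):
GLsync fence = glFenceSync (GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
glClientWaitSync (fence, GL_SYNC_FLUSH_COMMANDS_BIT,
                  100 * 1000 * 1000); // 100 ms timeout, in nanoseconds
glDeleteSync (fence);[/code]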

Bruce


#7

Is it normal that OpenGLRenderer::openGLContextClosing() is called when I maximize/minimize the JUCE main window?
I have around 20 framebuffer objects in my rendering pipeline, so I need to recreate them every time I maximize/minimize the window.


#8

[quote=“AlricDoRei”]Is it normal that OpenGLRenderer::openGLContextClosing() is called when I maximize/minimize the JUCE main window?
I have around 20 framebuffer objects in my rendering pipeline, so I need to recreate them every time I maximize/minimize the window.[/quote]

no, I wouldn’t expect that… Which OS? And where’s it being called from?


#9

[quote=“jules”][quote=“AlricDoRei”]Is it normal that OpenGLRenderer::openGLContextClosing() is called when I maximize/minimize the JUCE main window?
I have around 20 framebuffer objects in my rendering pipeline, so I need to recreate them every time I maximize/minimize the window.[/quote]

no, I wouldn’t expect that… Which OS? And where’s it being called from?[/quote]

Visual testing:
JUCE demo project (OpenGL demo)
OS: Windows 8 Pro 64-bit.
Try to minimize/restore the window.

[code]void newOpenGLContextCreated()
{
    infoLabel.setText(infoLabel.getText() + " created ", false);
}

void openGLContextClosing()
{
    infoLabel.setText(infoLabel.getText() + " closed ", false);
}[/code]


#10

Ah, you’re actually minimising the window… Well yes, when it closes/re-creates the desktop window, it will also close and re-start any GL contexts inside it.


#11

I think this is not good. Maybe I didn’t understand the concept, but it means that every time the context is recreated we need to:

  1. Initialise the OpenGL extensions (I’m using GLEW).

  2. Recompile all GLSL shader programs (I have around 70).

  3. Recreate all framebuffer objects.

  4. And many other things (see the sketch below).
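
Roughly, every rebuild means something like this (rebuildShaders() and rebuildFramebuffers() are placeholder helpers, not real JUCE calls):

[code]// Sketch: everything the old GL context owned must be rebuilt from scratch.
void newOpenGLContextCreated()
{
    glewInit();            // re-resolve extension entry points (GLEW)
    rebuildShaders();      // recompile all ~70 GLSL programs
    rebuildFramebuffers(); // recreate all framebuffer objects
}[/code]
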
Is it possible to pause the render thread while the window is minimized? I need to create the OpenGL context once when my app starts and close it only when my app is closing.

Also, I use Windows 8 Pro and got a crash. What is wrong with this code?
To reproduce:
I am using the JUCE version from 14.04.2013.
Compile with Visual Studio 2012 in Debug, press F5, then try to minimize/restore the window.

Code:
.h file

[code]class MainContentComponent : public Component, public OpenGLRenderer
{
public:
    //==============================================================================
    MainContentComponent();
    ~MainContentComponent();

    void paint (Graphics&);
    void resized();

    void newOpenGLContextCreated();
    void renderOpenGL();
    void openGLContextClosing();

private:
    OpenGLContext context;
    //==============================================================================
    JUCE_DECLARE_NON_COPYABLE_WITH_LEAK_DETECTOR (MainContentComponent)
};[/code]

.cpp file

[code]MainContentComponent::MainContentComponent()
{
    context.attachTo (*this);   // attach the GL context to this component
    context.setRenderer (this); // make this component the renderer

    setSize (500, 400);
}

MainContentComponent::~MainContentComponent()
{
    context.detach();
}

void MainContentComponent::paint (Graphics&) { }

void MainContentComponent::resized() { }

void MainContentComponent::newOpenGLContextCreated() { }

void MainContentComponent::renderOpenGL() { }

void MainContentComponent::openGLContextClosing() { }
[/code]

Crash:
File: juce_win32_Windowing.cpp
Line: 1369

static void* destroyWindowCallback (void* handle) { RevokeDragDrop ((HWND) handle); DestroyWindow ((HWND) handle); return nullptr; }
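
(One thing worth noting about the constructor above: the current JUCE documentation for OpenGLContext::setRenderer() says it must be called before attachTo(), so the safer ordering would be the following; whether that matters for the 2013 version used here is an assumption.)

[code]// Safer ordering, per the OpenGLContext::setRenderer() documentation:
context.setRenderer (this); // give the context its renderer first...
context.attachTo (*this);   // ...then attach, which can start rendering[/code]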


#12

Just trying this myself: the window doesn’t get deleted when minimised, so in fact this should be totally possible. Seems like the problem is that my GL driver (and presumably yours too) is crashing if you try to make certain types of GL call when the window is minimised. Amazing how flaky some of these drivers are, but I’ll see if I can add a workaround to avoid doing any drawing when it’s minimised.


#13

Ok, try again now…


#14

I tried the latest version. Everything works OK, except that I still get the same crash when I try to close the window.


#15

Well if there’s a crash, then a stack trace would be helpful.


#16

Call stack:
Unhandled exception at 0x530F10E9 (nvoglv32.dll).


#17

Odd. Do you still see that with the changes I made today?


#18

Yes.
Before your changes:

  1. Crash when minimizing/restoring the window.
  2. Crash when closing the window.

After your changes:

  1. Crash when closing the window.

#19

Well, I don’t have any problems in the juce demo or any of my other GL apps, and nobody else has reported it, so I wonder what you’re doing that’s different? Have you tried the juce demo to see whether that crashes?


#20

The same problem occurs with the JUCE demo.

I think it is because you don’t have the Nvidia GeForce 314.22 driver installed.
On older versions, like 314.07, you can get some random crashes with no debug information available.
On 310.90 it is the same as on 314.07, but less often.
With the Nvidia GeForce 314.22 driver there is an additional, 100% reproducible crash (the one from my previous post).

Tested on 2 PCs.