OpenGL and JUCE wizardry


#1

Is anyone a wizard at OpenGL in JUCE who fancies writing a small module to display a component after blurring part of it? It’s turning into a time-sink figuring it out and I’d be happy to pay someone else to solve it!

Drop me a mail if this sounds easy to you and we may have some work for you once we’ve agreed our final design here!


#2

The Component drawing pipeline isn’t really designed to cope with that scenario, because when you’re calling paint you don’t have the option to switch shaders - and proper blur is a multipass shader effect. You really need fine-grained control over which FBO and/or texture(s) you’re applying shaders to.

Basically you’re trying to use OpenGL like a game engine, and JUCE has architectural flaws there that I’ve described in the thread “High CPU usage on opengl repaints/renderFrame no matter what, even on unmodified default project”. The current OpenGL system draws everything into one texture after breaking it all down into scanlines using an EdgeTable.

You would have better luck getting what you want by writing your own OpenGL renderer (bypassing JUCE’s rendering logic) so you can assign independent FBOs (and textures where appropriate) to Component objects; that way you can apply shaders as needed (e.g. to fill with a colour or gradient, to blur, etc.).
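
To make the multipass point concrete, here’s a rough sketch of the fragment-shader side of a separable blur. The uniform names and the texCoord varying are placeholders of my own, not anything JUCE provides, and the FBO/quad plumbing is up to you - juce::OpenGLFrameBuffer (initialise(), makeCurrentRenderingTarget(), getTextureID()) is what you’d ping-pong between for the two passes:

```cpp
// Sketch only: run once with a horizontal 'direction', then once with a
// vertical one, rendering into alternating framebuffers.
static const char* blurFragmentShader = R"GLSL(
    #version 330 core
    in vec2 texCoord;
    out vec4 fragColour;

    uniform sampler2D sourceTexture;
    uniform vec2 direction;   // (1/width, 0) for horizontal, (0, 1/height) for vertical

    void main()
    {
        // 9-tap Gaussian approximated with 5 linearly-filtered samples
        float offsets[2] = float[] (1.3846153846, 3.2307692308);
        float weights[2] = float[] (0.3162162162, 0.0702702703);

        vec4 sum = texture (sourceTexture, texCoord) * 0.2270270270;

        for (int i = 0; i < 2; ++i)
        {
            sum += texture (sourceTexture, texCoord + direction * offsets[i]) * weights[i];
            sum += texture (sourceTexture, texCoord - direction * offsets[i]) * weights[i];
        }

        fragColour = sum;
    }
)GLSL";
```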


#3

I was imagining we’d just paint the component to an image, then have a custom OpenGLRenderer implementation for actually chucking it (technical term) to the screen.
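
Roughly this sort of thing, I mean (untested sketch - grabSnapshot and uploadSnapshot are just made-up names, and the texture upload has to happen while the GL context is active):

```cpp
#include <juce_opengl/juce_opengl.h>

// Sketch: grab the component's current appearance as an Image (message thread),
// then hand it to the GL side as a texture (call with the context active,
// e.g. from renderOpenGL()).
juce::Image grabSnapshot (juce::Component& myComponent)
{
    return myComponent.createComponentSnapshot (myComponent.getLocalBounds());
}

void uploadSnapshot (juce::OpenGLTexture& texture, const juce::Image& snapshot)
{
    texture.loadImage (snapshot);   // now it can be drawn with whatever shader we like
}
```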


#4

Right now with JUCE you have only the one option: apply shaders to the entire top-level Component you attached your OpenGLContext to, because a single texture is created for the context and literally everything is drawn onto it.

You don’t have the flexibility you need to switch shaders to apply blur on a single child Component from within that GL context…


#5

You can tell it not to paint the component at all, then deal with it by hand. I’m halfway there … :slight_smile:
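
The “tell it not to paint” bit looks roughly like this (sketch only - GLCanvas is just a stand-in name, and the empty callbacks are where the hand-drawn rendering goes):

```cpp
#include <juce_opengl/juce_opengl.h>

// Sketch: attach a GL context but opt out of JUCE's component painting, so
// renderOpenGL() is free to draw everything itself.
class GLCanvas : public juce::Component,
                 private juce::OpenGLRenderer
{
public:
    GLCanvas()
    {
        context.setRenderer (this);
        context.setComponentPaintingEnabled (false);   // skip rasterising the Component tree
        context.setContinuousRepainting (true);
        context.attachTo (*this);
    }

    ~GLCanvas() override { context.detach(); }

private:
    void newOpenGLContextCreated() override {}   // create shaders, FBOs, textures here
    void renderOpenGL() override {}              // draw the scene by hand each frame
    void openGLContextClosing() override {}      // release GL resources here

    juce::OpenGLContext context;
};
```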


#6

How big is the component? I just found out about stack blur, which is about 5x faster than JUCE’s ImageConvolutionKernel::createGaussianBlur(), though it doesn’t look quite as nice. Would that be fast enough to do it on the CPU?
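
For comparison, the JUCE-only CPU route is something like this (sketch; source stands in for whatever Image the component has been rendered into, and the kernel size and radius are just example values):

```cpp
#include <juce_graphics/juce_graphics.h>

// Sketch: blur an Image on the CPU with JUCE's built-in convolution kernel.
// A stack blur does the same job faster, with slightly rougher results.
juce::Image blurOnCpu (const juce::Image& source)
{
    juce::Image blurred (source.getFormat(), source.getWidth(), source.getHeight(), true);

    juce::ImageConvolutionKernel kernel (9);   // kernel size trades quality against speed
    kernel.createGaussianBlur (4.0f);          // blur radius in pixels
    kernel.applyToImage (blurred, source, blurred.getBounds());

    return blurred;
}
```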


#7

Yeah - I’ve got a few ways of tackling this, I guess, without resorting to OpenGL, but they involve a fair bit of faff. It’s actually a circular ring I need to blur right now, at about 50Hz. So I could probably shrink the area of the screen I’m blurring quite a lot and get some CPU back too - but it’s still pretty painful to watch the CPU meters while it runs, even on a smaller area, compared to outsourcing it to the GPU :slight_smile:
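
The “shrink the area” idea is basically this (sketch; fullImage and ringBounds are placeholders for my rendered image and the ring’s bounding rectangle):

```cpp
#include <juce_graphics/juce_graphics.h>

// Sketch: only blur the sub-rectangle the ring occupies rather than the whole
// image. getClippedImage() references the same pixel data, so nothing is copied up front.
juce::Image blurRingArea (const juce::Image& fullImage, juce::Rectangle<int> ringBounds)
{
    juce::Image ringArea = fullImage.getClippedImage (ringBounds);
    juce::Image blurredRing (ringArea.getFormat(), ringArea.getWidth(), ringArea.getHeight(), true);

    juce::ImageConvolutionKernel kernel (9);
    kernel.createGaussianBlur (4.0f);
    kernel.applyToImage (blurredRing, ringArea, blurredRing.getBounds());

    return blurredRing;   // draw this back at ringBounds.getPosition() when painting
}
```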


#8

Precisely! Graphical effects really belong on the GPU - it’s designed for exactly that.

I suggest playing with Shadertoy to get a better sense of what is actually possible with shaders… There is so much more to shaders than fills and blurs.


#9

All I need is a blur :wink:


#10

I’ve started again. All I’ve got now is a f8(!"£ing white triangle … and I’m proud of it. :slight_smile:


#11

Right, so I’m using more raw OpenGL, relying less on the JUCE OpenGL wrapper objects, and sticking rigorously to OpenGL 3.3 - and I’m making progress. I’ve taken this picture, loaded it into a texture, and I’m trying to stamp it onto my triangle. But again, as before, it’s not the right size:

Is the image not being stretched to fill the triangle because the texture is being allocated as a power-of-two square, maybe?

I’m using OpenGLTexture for this … I might try it without so I can learn what’s going on here!
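
If it is the power-of-two padding, I guess the workaround (short of dropping OpenGLTexture) is to scale the texture coordinates down to the part of the padded texture the image actually occupies - something like this, assuming the image ends up in one corner of the texture, which I haven’t verified yet (loadImageAndUVs is just a made-up helper name):

```cpp
#include <juce_opengl/juce_opengl.h>

// Sketch: upload the image, then work out how much of the (possibly padded)
// texture it actually fills, so the UVs can cover just that sub-rectangle.
void loadImageAndUVs (juce::OpenGLTexture& texture, const juce::Image& sourceImage,
                      float& uMax, float& vMax)
{
    texture.loadImage (sourceImage);
    uMax = sourceImage.getWidth()  / (float) texture.getWidth();
    vMax = sourceImage.getHeight() / (float) texture.getHeight();
    // ...then build the triangle's texture coords over (0,0)..(uMax,vMax) instead of (0,0)..(1,1)
}
```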


#12

Can confirm. Power-of-two image fills the triangle just fine.


#13

But I just found this: JUCE_OPENGL_ALLOW_NON_POWER_OF_TWO_TEXTURES … :slight_smile:
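
For anyone else landing here: it looks like a plain preprocessor flag checked when the juce_opengl module is compiled, so it wants defining project-wide (e.g. the Projucer’s “Preprocessor Definitions” field, or a compile definition in CMake) rather than in a single .cpp:

```cpp
// Sketch: define this before juce_opengl is compiled so textures can be
// allocated at their actual (non-power-of-two) size.
#define JUCE_OPENGL_ALLOW_NON_POWER_OF_TWO_TEXTURES 1
```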


#14