Questions about juce w/regards to building 3d apps


#1

Hi all,

I’m radiance, the founder of http://www.luxrender.net :slight_smile:

I’m considering using the juce toolkit for an OSS 3d app.
I’m completely new to it, been reading up and playing with the demos etc, and have a few questions:

  • what backend does the juce UI drawing use? is it OpenGL?
  • i’ll be making heavy use of OpenGL 3D viewports, and i’ll be needing lots of performance. are there any limitations with OpenGL while using juce? (eg, adding/using GLUT/GLee, vertex/fragment/geometry shaders, etc…)
  • if i use the OpenGL component (as in the juce demo), is there any performance loss compared to a traditional GL/GLUT double-buffered raw window? or can i use juce for UI management and create my own raw OpenGL viewports, similar to building one on top of a wxWidgets frame?

thanks for your time :slight_smile:

Radiance


#2

hello! Looks like a nice project you’ve got there, would love to see it running on juce!

The normal 2D UI rendering is all done in software, which was historically the best/only way to do decent rendering. Nowadays, with the new Vista rendering stuff, and CoreGraphics performance looking good, I might change that.

For opengl, there’s a component you can embed that gives you a native gl surface to play with - best ask other people to comment on the performance of this, as I don’t have much first-hand experience of using it in my own projects, but I don’t think there’s any reason why it’d be slower than opengl in any other kind of app.


#3

Yes, you can intermix juce and OpenGL quite well. You might find at some point that you hit a limitation of the ‘OpenGLComponent and Timer’ approach that is most like GLUT, but there are ways around it, and having juce’s other classes and basic app framework is invaluable.

[topic Hijack]

Jules, could I please try to persuade you to skip OS-specific graphics systems and go right to OpenGL as a backend? Most drawing code (like a Graphics object) could presumably be mapped right onto OpenGL, and the rest could be supplemented with offscreen renders moved to textures; in fact you’ve mentioned before that this is possible, if difficult.

There’s a few other libraries that do this, and it seems to be winning them fans, such as Qt and NUI.

Some of the advantages:
  • full 3D component animation, which seems to be becoming a staple of modern UIs
  • removes some restrictions on mixing 2D and 3D, especially tearing in Linux apps
  • increased frame rates
  • easier mixing with other 3D-based libraries
  • more scalable across large screen sizes
  • would probably make iPhone, Android, etc. (Palm?) (and future platforms) apps possible without much extra effort
  • would run full-quality GUIs on tiny systems (ION, Tegra)

And, like, please please.

Bruce


#4

I investigated this a couple of years ago, but found terrible problems in trying to use it for rendering anti-aliased polygons in a consistent way, and I gave up. I also spend a lot of time using VMs these days, and that’s an area where using a GL renderer would actually make it slower. But I do agree that it’d be a really neat solution if it’s technically possible…


#5

[quote=“jules”]hello! Looks like a nice project you’ve got there, would love to see it running on juce!

The normal 2D UI rendering is all done in software, which was historically the best/only way to do decent rendering. Nowadays, with the new Vista rendering stuff, and CoreGraphics performance looking good, I might change that.

For opengl, there’s a component you can embed that gives you a native gl surface to play with - best ask other people to comment on the performance of this, as I don’t have much first-hand experience of using it in my own projects, but I don’t think there’s any reason why it’d be slower than opengl in any other kind of app.[/quote]

when i run the demo for Windows (the one on the download page right now), i don’t seem to get a solid framerate (eg a steady 60fps), which is why i asked. I was wondering if there was some render -> texture -> juce step (or equivalent) going on. :slight_smile:

Radiance


#6

Oh, to answer the question of why the Juce OpenGLComponent, as it comes, is a bit slower than some other implementations: (AFAICT) it’s locked to the main thread, which, in an event-based program, is doing a lot of other work.

On a decent machine, the OpenGLComponent decoupled from the main thread is the same speed as anything else, with the obvious limitations that come with splitting different parts of a UI into different threads.

Bruce


#7

We use Juce in our VJ Software, and it works like a charm.
We use threads to render our OpenGL stuff; OpenGL is not thread-safe, so you’ll have to take care of that yourself.

http://www.resolume.com


#8

Hi,

I’ve been playing around with juce, and things seem to work ok.
I noticed a small delay from the events every second or so, which I resolved by rendering from a separate OpenGL thread.

I agree with Bruce Wheaton about using OpenGL for backend drawing. Jules, maybe you can think about creating a new prototype LowLevelGraphicsOpenGLRenderer?

Judging from the methods of the LowLevelGraphicsSoftwareRenderer class, it shouldn’t be too much work :slight_smile:

Radiance


#9

Well, if any of you OpenGL boffins can tell me how to make it reliably render an anti-aliased polygon, in a way that’ll work the same on all platforms, then I’ll certainly consider it! That was the showstopper when I last looked into it.


#10

Render it to a BGRA buffer and use that as a texture.

The other answers are lots of subdivision, clever shader tricks, or maybe hardware multi-sample anti-aliasing. But since you already have offscreen rendering in place, just drawing into each offscreen buffer and compositing them together in OpenGL will get us 90% of the way there a lot quicker.

Animation of the component could be just OpenGL manipulations, with the actual pixels being ‘native’ at the target resolution.

Bruce


#11

there are many ways to do this, some more portable than others.

for lines/polies you can use GL_LINE_SMOOTH.

otherwise, you could leave the anti-aliasing up to the user of the juce library. i think it makes sense to give users control over this; not everyone wants anti-aliasing.

otherwise, write some code that detects OpenGL extensions, and enable multi/coverage sampling where available.

another option would be to use the opengl accumulation buffer.

there’s other more complex and speedy ways, using shaders as here:
http://people.csail.mit.edu/ericchan/articles/prefilter/

there’s a plethora of options, and i think it would make sense to just render without AA, and include a bunch of options to let the user decide which AA to use.

eg, someone developing a 2D vector illustration app will want nice smooth AA (4x multisampling + 8-16x additional coverage samples), while someone like me, who wants to use it as a replacement for wxWidgets, wants cheap AA, or maybe even no AA at all :slight_smile:

Radiance

EDIT: it might be handy to have control at the component level, so we can choose different types of AA, or none at all, on different components.


#12

Yeah, none of those ideas work, I’m afraid.

GL_LINE_SMOOTH would be great, but it doesn’t work. It leaves gaps all over the place when it triangulates, at least on some graphics cards, and isn’t supported on others.

The link about the prefiltering idea is horrific! The whole point of AA is to render things with sub-pixel accuracy; just vaguely blurring everything massively misses the point!

An accumulation buffer would be ridiculously slow, because to get a decent result you’d need to do at least 16x oversampling, preferably 64x (and to get a result as good as a decent software renderer, you’d have to go to 256x), so wouldn’t that involve calling paint() 16 times, or at least wasting time re-rendering all the rest of the stuff!?

Ideally, now that programmable shaders are becoming common, it might be possible to use that to do it, though it’d be seriously complicated to do! In the meantime, the only real option is to do what Bruce suggested, and fall back to a software render, which will probably end up making everything slower than a pure software solution.

Is AA really putting a heavy load on your CPU?? Maybe it’s time to upgrade from your 80486!


#13

well,

i think the best solution would be to have the AA be configurable.
g.SetAAMode(NVIDIA, CSAA, etc…)

then add support for a few implementations by testing for and enabling the relevant ATI and NVIDIA extensions.

here’s the nvidia version:

Coverage Sampling Anti-Aliasing (CSAA) produces antialiased images that rival the quality of 8x or 16x MSAA, while introducing only a minimal performance hit over standard (typically 4x) MSAA. It works by introducing the concept of a new sample type: a sample that represents coverage. This differs from previous AA techniques where coverage was always inherently tied to another sample type. CSAA reduces bandwidth and storage costs. Implementing CSAA on the new GeForce 8 series is done through a new NVIDIA OpenGL API extension: GL_NV_framebuffer_multisample_coverage.

there’s heaps other AA types…

remember that you have a community to test on various platforms/gpus.

i think having control over AA behaviour at the component level is a must, and i think the whole AA issue is peanuts compared to the speed benefits of using an OpenGL backend. :slight_smile:

Radiance


#14

[quote]i think the best solution would be to have the AA be configurable.
g.SetAAMode(NVIDIA, CSAA, etc…) [/quote]

Ugh… over my dead body! The Graphics class is NOT the place for anything platform-specific!

I’m all in favour of using acceleration, but not at the expense of quality or usability. If openGL can speed things up, that’s great, but it’d need to be completely invisible to the user, with no unexpected side effects or settings to mess about with.


#15

I use JUCE paths for my medical imaging class. They form the basis of various closed contour tools. I use them precisely because their behaviour is predictable and consistent across platforms.

Anything that makes the anti-aliasing behaviour of paths less consistent will cause me no end of grief. Essentially I’d be forced to implement the Path classes from scratch. :frowning:


#16

[quote=“valley”]I use JUCE paths for my medical imaging class. They form the basis of various closed contour tools. I use them precisely because their behaviour is predictable and consistent across platforms.

Anything that makes the anti-aliasing behaviour of paths less consistent will cause me no end of grief. Essentially I’d be forced to implement the Path classes from scratch. :([/quote]

if it’s configurable, you can use a mode that’s identical on all platforms.

anyways, i think i might start coding a GL renderer myself, as i could use one. it might take some time; i’ll post here if i have any news.

Radiance


#17

My long-term view on this is that options for anything other than “perfect” rendering quality will gradually disappear from libraries and OSes before too long - and good riddance!


#18

I realize this is a really old thread, but I haven't found an answer to this anywhere else, and my exact question was asked here:

Can we use an OpenGL backend to draw all our custom and library JUCE components? Is there any plan to add this functionality in the future? 

Thanks for guiding the newbies like me!

Cheers,

Dave


#19

Yes, of course - the GL renderer has been there for a couple of years now. The demo app shows how it's used to mix up 3D and 2D stuff.