How do I draw text over an OpenGLComponent?

Hello ‘Jucers’,

As per the subject :slight_smile: can anyone help?

thanks in advance

The only way - I think - is to paint over it just after swapBuffers()

but I’m far from sure it will work

You will get much better performance by doing your drawing in openGL

or implement something like a LowLevelGraphicsSoftwareRenderer that renders using OpenGL -
that would be very nice : ))

You’ll have trouble drawing over an OpenGLComponent with juce graphics commands, because it has to use its own opaque window.

The easiest way I can think of would be to draw your text into a juce Image, then turn that into a texture and draw it with opengl.

Thanks guys - in your opinion, would drawing into an image be fast enough to handle hundreds of words (yep, I’ve got many…)? Maybe I’ll just use FreeType+FTGL or OGLFreeType, but that sounds like a lot of work.

Drawing into an image is very fast - same as drawing text anywhere else in juce. And of course there’s no need to resort to freetype, because if you want to use vector-based glyphs, juce already does that, on all platforms. See the GlyphArrangement class…

Something that would be really useful would be proper outline-font support in the OpenGLComponent, so we could just call, say, setFont() and start drawing 3d text using JUCE’s Font class. As with most things opengl, there’s Windows code at NeHe:

Also, while I’m making requests, another cool thing to have would be hardware antialiasing. I’ve already implemented this myself on Windows and OSX for my VSTGL framework, you can see the code here (v1.5).

  • Niall.

A better solution will be a special Juce graphics context that draws onto an openGL context… This is part of the reason I refactored the graphics context classes recently.

The only problem with openGL is that the polygons are no good - antialiasing seems to be a big problem unless you’re using an accumulation buffer.

Hmm… A graphics context where you could freely mix JUCE-style 2d drawing and opengl 3d drawing would indeed be very cool 8).

Well, it’s not generally as good as decent 2d antialiasing, but I don’t think it’s that bad. See:

That’s 4x antialiasing, if I remember correctly (and most recent cards support 6x). The only problem is that some older cards don’t support the antialiasing extension, but I think those are few and far between now.

  • Niall.

Trouble is that to anti-alias with an accumulation buffer, you have to redraw the whole thing repeatedly, which isn’t exactly optimal - you wouldn’t want your paint() method to get called 16 times.
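To make the cost concrete: accumulation-buffer antialiasing renders the scene once per sub-pixel jitter offset and averages the passes, so a 4x4 grid means 16 full redraws per frame. Here’s a minimal sketch of generating such a jitter grid - the function name and uniform grid layout are my own illustration, not from any GL header:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Build n*n sub-pixel jitter offsets in (-0.5, 0.5), one per
// accumulation pass. Each pass renders the whole scene shifted by
// one offset and accumulates 1/(n*n) of the result - which is why
// a 4x4 grid implies 16 complete redraws of the frame.
std::vector<std::pair<float, float>> makeJitterGrid (int n)
{
    std::vector<std::pair<float, float>> offsets;
    offsets.reserve ((size_t) (n * n));

    for (int y = 0; y < n; ++y)
        for (int x = 0; x < n; ++x)
            offsets.emplace_back ((x + 0.5f) / (float) n - 0.5f,
                                  (y + 0.5f) / (float) n - 0.5f);

    return offsets;
}
```

Each offset would be applied as a tiny translation of the projection matrix before rendering the pass, followed by something like glAccum (GL_ACCUM, 1.0f / numPasses).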

I’m not an opengl expert though - anyone know a trick for drawing good polygons? Maybe there’s a way of just using it to draw each scanline?

Here you can see which cards support multisample on Windows.

Nvidia has supported it since the GeForce 3 (out at the end of 2001), and ATI since the Radeon 9600 (out in the middle of 2003). It looks like Intel’s integrated chips still don’t support it.

This is antialiasing done in hardware; today’s cards do 8x antialiasing, or even 16x with SLI.

This is fullscreen antialiasing, though you should be able to enable or disable it during rendering.

There is another technique called supersampling, but I don’t know much about it; it looks like you’ll need shaders.

Here is an example of it:

This is also interesting:
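For what it’s worth, the core of supersampling is easy to state even without shaders: render the scene at k times the target resolution, then average each k*k block of pixels down to one output pixel. A rough sketch of that downsampling step, on a single-channel buffer for simplicity (the function name and greyscale simplification are my own, not from any GL API):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Box-filter downsample: average each k*k block of the source
// (which is w*k by h*k pixels, one byte per pixel) into a single
// pixel of the w by h destination - the averaging is what smooths
// the polygon edges.
std::vector<uint8_t> downsample (const std::vector<uint8_t>& src,
                                 int w, int h, int k)
{
    std::vector<uint8_t> dst ((size_t) (w * h));

    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
        {
            int sum = 0;

            for (int sy = 0; sy < k; ++sy)
                for (int sx = 0; sx < k; ++sx)
                    sum += src[(size_t) ((y * k + sy) * (w * k) + (x * k + sx))];

            dst[(size_t) (y * w + x)] = (uint8_t) (sum / (k * k));
        }

    return dst;
}
```

In real GL you’d render into an oversized buffer and let the hardware (or a copy with filtering) do this averaging, but the arithmetic is the same.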

No, you’ve got the wrong idea - you don’t need an accumulation buffer to do anti-aliasing, as there’s an opengl extension to do it via hardware. Take a look at the code for VSTGL to see how it’s done (or NeHe, as usual):

  • Niall.

Ah, I see…

Some other information that’s interesting for this subject:

Also, you should take a look at Amanith:

Cheers. I downloaded Glitz recently to have a go, but it’s incomprehensible! Would probably be quicker just to write it all myself than to figure out how it’s supposed to work!

I hadn’t heard of Amanith, though, which looks interesting, so I’ll investigate that.

I have a semi-framework I developed on Windows that uses FreeType to draw text in OpenGL. The way I did it was to use FreeType to rasterize all the characters to a single bitmap, then use texture mapping to get them to the OpenGL framebuffer. Presumably one could do the same with juce, though I’m still barely dipping my toes into juce at the moment.

While it would be great for juce to do everything in OpenGL, it shouldn’t be too hard to render the OpenGL stuff to a framebuffer object instead of the screen (through the use of OpenGL extensions). I believe you’d then have access to that framebuffer for juce to render to as well - juce could keep rendering the stuff it’s good at, and OpenGL could render the stuff it’s good at.

Well, if you look at the OpenGL demo code, there’s a bit where it loads a texture from a juce Image object - so if you changed this bit:

[code]DemoOpenGLCanvas()
{
    rotation = 0.0f;
    delta = 1.0f;

    Image* im = ImageFileFormat::loadFrom (BinaryData::juce_png, BinaryData::juce_pngSize);
    image = im->createCopy (512, 512);
    delete im;

    startTimer (20);
}[/code]

to something like:

[code]DemoOpenGLCanvas()
{
    rotation = 0.0f;
    delta = 1.0f;

    image = new Image (Image::ARGB, 512, 512, true);
    Graphics g (*image);
    g.setFont (80.0f);
    g.drawText (T("Hello World"), 0, 0, 512, 512, Justification::centred, true);

    startTimer (20);
}[/code]

…then it’d put some text on the spinning cube instead.

I used this method to draw my strings and it worked very well. I needed to cache the textures, because creating many GL textures from scratch every frame was too slow. Also, I found a little problem: when I created an RGB Image and cleared it to blue, the texture created through OpenGL (still RGB) came out red (or vice versa, can’t remember atm). Still wondering why.

pseudo code:

[code]Image* pImgLogo = new Image (Image::RGB, 256, 16, true);
Graphics g (*pImgLogo);
pImgLogo->clear (0, 0, 256, 16, Colour (255, 0, 0));    // fill with red
g.drawText ("the sky is red", 0, 0, 256, 16, Justification::centredLeft, true);

GLuint id;
glGenTextures (1, &id);
glBindTexture (GL_TEXTURE_2D, id);

int lineStride, pixelStride;
const uint8* data = pImgLogo->lockPixelDataReadOnly (0, 0, 256, 16, lineStride, pixelStride);
glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB, pImgLogo->getWidth(), pImgLogo->getHeight(), 0, GL_RGB, GL_UNSIGNED_BYTE, data);
pImgLogo->releasePixelDataReadOnly (data);[/code]

Then once I draw it, the texture is blue, not red (using flat shading and no lighting). Strange, isn’t it?
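On the caching point above: the idea is just to key the uploaded textures by the string they render, so the expensive Image-to-glTexImage2D path only runs on a cache miss. A toy sketch with the GL upload mocked out as a counter (all the names here are my own, not JUCE or GL API):

```cpp
#include <map>
#include <string>

// Sketch of a text-texture cache: look a texture up by the string it
// renders, and only do the expensive work (here mocked by bumping a
// counter) when the string hasn't been seen before. Repeated frames
// then reuse the same texture id instead of re-uploading.
struct TextTextureCache
{
    std::map<std::string, unsigned> cache;
    int creations = 0;   // stands in for the Image -> glTexImage2D path

    unsigned getTexture (const std::string& text)
    {
        auto it = cache.find (text);
        if (it != cache.end())
            return it->second;      // cache hit: no GL work at all

        ++creations;                        // real code: draw text into an Image,
        unsigned id = (unsigned) creations; // then glGenTextures + glTexImage2D
        cache[text] = id;
        return id;
    }
};
```

A real version would also need to free the GL textures when entries are evicted, but the lookup structure is the part that fixes the per-frame cost.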

By the way thanks guys for all your answers :slight_smile:

Juce seems to use BGR for its internal pixel format, so try using GL_BGR_EXT instead of GL_RGB.
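And if GL_BGR_EXT happens to be unavailable on some driver, you can always swap the channels yourself before uploading and keep using plain GL_RGB. A minimal sketch of that swizzle for a tightly packed 3-byte-per-pixel buffer (plain C++, no JUCE or GL calls involved):

```cpp
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// Swap the red and blue channels of a tightly packed 3-byte-per-pixel
// buffer in place, converting BGR data to RGB (or back - the swap is
// symmetric), so it can be uploaded with GL_RGB on drivers that lack
// the BGR extension.
void swapRedBlue (std::vector<uint8_t>& pixels)
{
    for (size_t i = 0; i + 2 < pixels.size(); i += 3)
        std::swap (pixels[i], pixels[i + 2]);
}
```

Note this assumes no row padding; with a real lineStride you’d walk row by row instead.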