Exchanging the font & software renderer

Hi Jules, when do you plan to do the mods? I’m rather keen on testing this! Thanks!

I wish Jules could expose the LowLevelGraphicsSoftwareRenderer class declaration, and allow users to provide subclasses that override only certain parts. I know this ties his hands a bit when it comes to making changes but there could be a disclaimer.

One really good way to do it would be to move the class declaration into its own .cpp file that you manually #include (i.e. not obtained through juce.h).

This way no one is going to accidentally think that subclassing it is standard - it would require a deliberate effort to include a .cpp file (something few people do). And there would be plenty of disclaimers.

I think that exposing the LowLevelGraphicsSoftwareRenderer (and more importantly the associated LowLevelContext) with the understanding that Jules is free to change the API any time he wants, and totally break anyone’s customizations, is completely acceptable.

bump

FYI I’ve checked in a version where I’ve added a macro that should help with this. Give me a shout if you still need more control.

ah! excellent, I’ll let you know when I’ve done some tests!

Ok, I already have a first question. I’m trying to add a dummy renderer that inherits from JUCE’s LowLevelGraphicsSoftwareRenderer. How do I properly include LowLevelGraphicsSoftwareRenderer.h?
My header file (MySoftwareRenderer.h) looks like this at the moment:

[code]#pragma once
#include "JuceHeader.h"

#include "../../../modules/juce_graphics/contexts/juce_LowLevelGraphicsSoftwareRenderer.h"

class MySoftwareRenderer : public LowLevelGraphicsSoftwareRenderer
{
public:
    MySoftwareRenderer (const Image& image_);
    MySoftwareRenderer (const Image& imageToRenderOn, int xOffset, int yOffset, const RectangleList& initialClip);
};[/code]

In AppConfig.h I added these 2 lines:

[code]#define JUCE_DEFAULT_SOFTWARE_RENDERER_CLASS MySoftwareRenderer
#include "../source/MySoftwareRenderer.h"[/code]

I get a lot of errors in WidgetsDemo.cpp, starting like this:

[code]1>Compiling...
1>WidgetsDemo.cpp
1>d:\juce154\juce\modules\juce_graphics\colour\juce_pixelformats.h(55) : error C2470: 'PixelARGB' : looks like a function definition, but there is no parameter list; skipping apparent body
1>d:\juce154\juce\modules\juce_graphics\colour\juce_pixelformats.h(289) : error C2470: 'PixelRGB' : looks like a function definition, but there is no parameter list; skipping apparent body
1>d:\juce154\juce\modules\juce_graphics\colour\juce_pixelformats.h(443) : error C2144: syntax error : 'void' should be preceded by ';'
1>d:\juce154\juce\modules\juce_graphics\colour\juce_pixelformats.h(443) : error C4430: missing type specifier - int assumed. Note: C++ does not support default-int[/code]

The errors only show up when including juce_LowLevelGraphicsSoftwareRenderer.h, but I must somehow include it to get the class declaration of LowLevelGraphicsSoftwareRenderer.

Good point - I don’t think you’d be able to include the juce headers before your own code; it’d create a lot of circular dependencies… Am open to better suggestions.

As TheVinn suggested, it would make sense to have one single LowLevelGraphicsSoftwareRenderer created and then used by all JUCE functions. This would be represented internally by a pointer that can be exchanged for another LowLevelGraphicsSoftwareRenderer subclass.
For me, as a JUCE user, the most logical way would be a (static) function called exchangeLowLevelGraphicsSoftwareRenderer, where you just pass a pointer to your own LowLevelGraphicsSoftwareRenderer, and from then on JUCE will internally use that pointer instead. Passing 0 would make JUCE use the original renderer again. I don’t know which class this function should belong to, though (since drawing images “offline” should also use the new renderer, I don’t think ComponentPeer is the place to specify it).

No, I don’t think he would have been suggesting that - it’d be impossible to have a shared instance.

I guess that the only way to do it is to have a factory base class that can create the right type of renderer, and a static method to register your new factory as the default producer of them. Unfortunately that means they all have to be heap-allocated, which is a bit inelegant, but not too big an overhead. Bit of a faff to write it all though, I was hoping to do this with minimal impact.
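As an editorial aside, the factory-plus-static-registration pattern being discussed could look roughly like this. Every name below (RendererFactory, setDefault, LowLevelContext, etc.) is made up for illustration - this is not the actual JUCE API, just a sketch of the idea:

```cpp
// Sketch of a renderer factory with a static registration point.
// All names here are hypothetical, not JUCE API.
#include <cassert>
#include <memory>
#include <string>

struct LowLevelContext                     // stand-in for a low-level renderer
{
    virtual ~LowLevelContext() = default;
    virtual const char* name() const { return "software"; }
};

class RendererFactory
{
public:
    virtual ~RendererFactory() = default;

    // Renderers come back heap-allocated, as discussed above.
    virtual std::unique_ptr<LowLevelContext> createRenderer() const
    {
        return std::make_unique<LowLevelContext>();
    }

    // Static registration point; pass nullptr to restore the default.
    static void setDefault (RendererFactory* f)  { defaultFactory() = f; }

    static std::unique_ptr<LowLevelContext> create()
    {
        static RendererFactory builtIn;
        RendererFactory* f = defaultFactory();
        return (f != nullptr ? f : &builtIn)->createRenderer();
    }

private:
    static RendererFactory*& defaultFactory()
    {
        static RendererFactory* instance = nullptr;  // avoids a separate .cpp definition
        return instance;
    }
};

// A user's custom renderer and its factory:
struct MyCustomContext : LowLevelContext
{
    const char* name() const override { return "custom"; }
};

struct MyFactory : RendererFactory
{
    std::unique_ptr<LowLevelContext> createRenderer() const override
    {
        return std::make_unique<MyCustomContext>();
    }
};
```

The only per-render cost over the current scheme is one heap allocation and a virtual call, which is negligible next to the actual rasterisation work.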

Alright, please let me know when you’ve implemented this. I can’t wait to try it out.

Yes, this is exactly what I had in mind. There would be a virtual function on the ComponentPeer that would act as the “Factory” to return a heap allocated renderer, and the default implementation would just return what it is making now.

Further refinements would virtualize the drawing so that the factory could create a graphics context for each CPU core, divide the area to be redrawn into strips, and render them in parallel on multiple CPUs.
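The strip-splitting part of that refinement is easy to picture. A minimal sketch (illustrative only, not JUCE code) - each strip would then be handed to one core's graphics context:

```cpp
// Divide `totalHeight` rows of a redraw area into one horizontal strip
// per core, with heights differing by at most one row so the per-core
// workload stays balanced. Illustration only, not JUCE API.
#include <cassert>
#include <vector>

struct Strip { int y, height; };

std::vector<Strip> divideIntoStrips (int totalHeight, int numCores)
{
    std::vector<Strip> strips;
    int y = 0;

    for (int i = 0; i < numCores; ++i)
    {
        const int remaining = totalHeight - y;
        const int coresLeft = numCores - i;
        const int h = (remaining + coresLeft - 1) / coresLeft;  // ceiling division

        if (h > 0)
            strips.push_back ({ y, h });

        y += h;
    }

    return strips;
}
```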

In fact, it wouldn’t be enough to add a method to ComponentPeer, because there are places where you’re not creating the peer yourself, or where there isn’t one (e.g. rendering to an image).

Since I’m playing with openGL rendering at the moment too, which has similar problems, I’ll try to figure out a clean solution that covers all these things.

Does that mean we’ll get cool animation effects using OpenGL etc.? I’d dream of that! It’s 2011, and JUCE really would kick a** with something like that.

Well, the idea is that you’ll be able to render into an opengl framebuffer using a normal Graphics context, and mix up normal openGL commands with the 2D stuff, so yes, you’d be able to render your component into a framebuffer and then whizz it around in 3D if you wanted to!

This is my F5-thread!

Would that be a pure OGL renderer, or does it need to mix in software rendering?
I think pure OGL would certainly be fastest, as it could allow some batching optimizations.

Since it seems you nailed Paths in pure OGL, I think that was the hardest part in terms of anti-aliasing problems?
For smooth lines, I would think it works to construct them at a high resolution, with transparent vertices on the edges, and supersample into a renderbuffer - maybe that’s what you already did for paths.

AFAICT it should all be possible without software rendering.

The bit that had me puzzled was anti-aliased polygons, but I got that working by rendering a jittered set of triangles into just the alpha channel of a framebuffer to create a mask, and then I’ll render the various gradient fill patterns into just its RGB channels, after which it can all get copied to the target. It’s a lot of triangles and data moving, but that’s what GL is good at, so hopefully it’ll work out to be pretty fast!
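The core of the jittered-accumulation trick is averaging many sub-pixel-offset samples into a single coverage (alpha) value. Here is that averaging idea in plain C++ rather than GL, for a pixel against a half-plane edge a·x + b·y < c - purely illustrative, with regular sub-pixel offsets standing in for a real jitter pattern:

```cpp
// Estimate what fraction of the unit pixel at (px, py) is covered by
// the half-plane a*x + b*y < c, by testing an n x n grid of sub-pixel
// sample points and averaging the hits. Illustration only.
#include <cassert>
#include <cmath>

float estimateCoverage (float px, float py,
                        float a, float b, float c, int n)
{
    int inside = 0;

    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
        {
            // Regular sub-pixel offsets; a real jitter pattern perturbs these.
            const float sx = px + (i + 0.5f) / n;
            const float sy = py + (j + 0.5f) / n;

            if (a * sx + b * sy < c)
                ++inside;
        }

    return inside / (float) (n * n);
}
```

In the GL version described above, each jittered pass draws the geometry into the framebuffer's alpha channel and the hardware does this accumulation; the fill pattern then only has to be written into the RGB channels where the mask says so.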

Sounds absolutely awesome! When do you think you’ll have finished coding it? I assume it will still be possible for me to do my own font rendering, even with OpenGL (by exchanging the software renderer for my own)?

Am working on it at the moment - there’s quite a lot of work involved, but am really enjoying it, so it’ll probably get done fairly soon!

As far as font rendering goes… yes, it probably won’t make much difference there.

Hi Jules, if I didn’t misread your description, it seems to me that you are achieving anti-aliasing manually by rendering elements more than once and shifting them around?

If so, perhaps this is a useful piece of knowledge: relatively recently (the last few years) OpenGL also allows rendering into multisampled framebuffer objects, so the graphics card does the AA for you. I wrote a contribution to the Processing GLGraphics library doing this two years ago. Now I don’t remember what I did exactly, but briefly looking at the code I see “GL_TEXTURE_RECTANGLE_ARB” and “glRenderbufferStorageMultisampleEXT()” having something to do with it.

Let me know if this seems interesting/relevant to you and I can dig out some code examples, or point you in the right direction.

Yeah, I was hoping multisampling would be the answer, but unfortunately

a) it looks rubbish! 4 or even 8 levels of grey isn’t enough!
b) it’s not available on all systems
c) it uses more memory
d) I need to do it in a framebuffer, and to only set the alpha channel, so I can then fill the RGB channel with a gradient or other fill pattern

The system I’ve figured out is quite neat, and works for all hardware - in the future there’ll probably be better ways using fragment shaders and pixel coverage, but I’m only just getting started with openGL so not going to worry about that yet.