…OpenGL and other rendering engines are great, but I hope we always keep the built-in software rendering engine (the awesome one that Jules wrote) in tip-top shape, so we have something that achieves these goals:
Is available in source code form and can be modified
Produces a consistent result regardless of platform
Is immune to proprietary changes in third party libraries (glares at Apple)
Don’t worry, the software renderer isn’t going away! On some older systems it’s probably going to be faster than OpenGL anyway; it will be interesting to see how the benchmarks compare.
@Vinn: But at least you’ll be able to optimize it yourself once it becomes exchangeable! For me, only a couple of functions need to be faster or better; I don’t care about the rest.
I’ve got a problem: as I said, I wanted to replace the font rendering. But the render context only has a drawGlyph() function that receives the number of the glyph (in the glyph cache, I suppose?). How can I convert this number back into the character it represents?
Also, how do I retrieve the font?
In short: the glyph caching is done by my own font rendering system, which also implements vertical hinting. I just want font name + font size + bold/italic/etc… + character; that’s all. I don’t need JUCE’s glyph caching, and it should be turned off, since there’s no need for it once my changes are in.
The character has already been converted to a glyph index by the layout engine - you certainly don’t want to convert it back again! You can pass that number to a Typeface object to get the glyph’s shape.
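To illustrate why that reverse mapping is ill-defined: shaping can be many-to-one, e.g. an “fi” pair can become a single ligature glyph, so one glyph number need not correspond to any single character. A toy sketch (the glyph numbers and character map here are made up, not any real font’s; note also that FreeType’s FT_Load_Glyph takes a glyph index directly, so a FreeType-based renderer can consume the number as-is, provided both sides use the same font file):

```cpp
#include <map>
#include <string>
#include <vector>

// Toy illustration, not real shaping code: the layout engine maps
// characters to glyph indices, and the mapping is not reversible.
// Here an "fi" pair is replaced by a single (hypothetical) ligature
// glyph, so glyph 300 stands for two characters at once.
std::vector<int> shape(const std::u32string& text)
{
    static const std::map<char32_t, int> cmap = {
        { U'f', 101 }, { U'i', 102 }, { U'x', 120 }
    };

    std::vector<int> glyphs;

    for (size_t i = 0; i < text.size(); ++i)
    {
        if (i + 1 < text.size() && text[i] == U'f' && text[i + 1] == U'i')
        {
            glyphs.push_back(300); // hypothetical "fi" ligature glyph
            ++i;                   // consumed two characters
            continue;
        }

        glyphs.push_back(cmap.at(text[i]));
    }

    return glyphs;
}
```

Given “fix”, this produces two glyphs for three characters, so no per-glyph character number can exist.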
If you’re doing a custom renderer, then presumably it has stored the current font somewhere in there.
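As a sketch of that idea (class and member names below are hypothetical stand-ins, not JUCE’s actual LowLevelGraphicsContext API): the context simply records whatever setFont() gives it, so by the time drawGlyph() is called it has the font and the glyph number together:

```cpp
#include <string>
#include <vector>

// Hypothetical stand-in for a font description.
struct Font
{
    std::string name;
    float height = 0.0f;
    bool bold = false, italic = false;
};

// A custom low-level context can remember the last font it was given,
// so drawGlyph() has everything it needs: font + glyph number.
class MyRenderContext
{
public:
    void setFont(const Font& f)  { currentFont = f; }

    void drawGlyph(int glyphNumber)
    {
        // Hand the glyph straight to your own rasterizer here;
        // for this sketch we just record what would be drawn.
        drawn.push_back({ currentFont.name, glyphNumber });
    }

    struct DrawnGlyph { std::string fontName; int glyphNumber; };
    std::vector<DrawnGlyph> drawn;

private:
    Font currentFont;
};
```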
After the call to Graphics::drawText(), JUCE should not be involved anymore; I want my rendering system to do the rest. How do I achieve that? The whole system of glyph shapes etc. is already implemented with FreeType in my code. I don’t need JUCE to do this.
I just noticed that there’s also a setFont() function in the LowLevelGraphicsContext class. Well that’s already very good, because then I know what font to use.
Still, the only problem that remains is that drawGlyph() takes a glyph number as its parameter. I can’t do anything with that information. I need a character number; a uint32 in UTF-32 encoding would be best. Is there any way to achieve this?
I think you might have got completely the wrong idea.
If you’re trying to do custom layout and glyph rendering, why not just do what TheVinn did a year or so ago, and write a custom typeface class? He didn’t need to touch the graphics context at all, and I already added some hooks to do what he needed.
In fact, from the sound of it, you might actually just be re-creating exactly what TheVinn already did!
Ok, I think I understand how it works now that I’ve found the setFont() function. Thanks! No, I’m not recreating what he did; I’m making an LCD-optimized font renderer.
You do realize that in my code there is an #ifdef that, while turned off by default, not only creates the outlines but also provides JUCE with the bitmap? You could just override that routine and turn the macro on.
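The pattern being described, sketched with hypothetical names (the real macro and routine names in the code will differ): a compile-time switch, off by default, that routes glyph rendering through a user-supplied bitmap instead of the outline rasterizer:

```cpp
#include <vector>

struct GlyphBitmap
{
    int width = 0, height = 0;
    std::vector<unsigned char> pixels; // coverage, one byte per pixel
};

// Hypothetical macro name; off by default, as described.
#ifndef USE_CUSTOM_GLYPH_BITMAPS
 #define USE_CUSTOM_GLYPH_BITMAPS 0
#endif

// Stand-in implementations for the sketch.
GlyphBitmap renderGlyphFromOutline(int) { return { 8, 8, std::vector<unsigned char>(64, 255) }; }
GlyphBitmap customBitmapForGlyph(int)   { return { 8, 8, std::vector<unsigned char>(64, 128) }; }

GlyphBitmap getGlyphBitmap(int glyphNumber)
{
   #if USE_CUSTOM_GLYPH_BITMAPS
    return customBitmapForGlyph(glyphNumber);   // override: user-supplied bitmap
   #else
    return renderGlyphFromOutline(glyphNumber); // default: rasterize the outline
   #endif
}
```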
No, not at all! A few cunning tweaks in the code that renders an edgetable should be all that’s required. The edgetable already contains a high horizontal pixel resolution, so is an ideal source for sub-pixel rendering. There’d be no need to change anything else in the font code to make it work, and it’d also mean that all paths would get rendered at sub-pixel resolution for free too.
IMHO it’s the best font rendering I’ve seen. Hinting is only applied vertically (and only to small fonts, which look bad without vertical hinting); the rest is done via LCD-optimized rendering, for which the code is actually very short:
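The code itself is not reproduced in this excerpt. As a stand-in illustration only (not the poster’s implementation), LCD-optimized renderers typically run a small FIR filter across the subpixel coverage to suppress colour fringing at glyph edges; the (1, 2, 3, 2, 1)/9 weights below are just an example, and real renderers tune them carefully:

```cpp
#include <algorithm>
#include <vector>

// Smooth per-subpixel coverage (0..255) with a 5-tap FIR filter so a
// sharp edge bleeds slightly into neighbouring subpixels instead of
// producing a hard colour fringe. Weights are illustrative only.
std::vector<int> lcdFilter(const std::vector<int>& coverage)
{
    static const int w[5] = { 1, 2, 3, 2, 1 }; // sums to 9

    std::vector<int> out(coverage.size(), 0);

    for (size_t i = 0; i < coverage.size(); ++i)
    {
        int sum = 0;

        for (int k = -2; k <= 2; ++k)
        {
            const long j = static_cast<long>(i) + k;

            if (j >= 0 && j < static_cast<long>(coverage.size()))
                sum += w[k + 2] * coverage[static_cast<size_t>(j)];
        }

        out[i] = std::min(255, sum / 9);
    }

    return out;
}
```

A single fully-lit subpixel spreads into its neighbours, which is exactly the trade of colour accuracy for perceived horizontal resolution that LCD rendering makes.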