Get glyph from typeface with Unicode code point

Hi Jules and the JUCE community,

I’m working on a sheet music editor using the well-known Bravura font (an OTF font), which is an excellent standard font for my purpose: it is the reference font for SMuFL (Standard Music Font Layout). The standard specifies that each SMuFL-compliant font provides JSON metadata files describing the typeface’s glyphs and their Unicode code points.
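For instance, pulling a glyph’s code point out of SMuFL’s glyphnames.json can be done with JUCE’s own JSON class. A minimal sketch, assuming the entry layout described by the SMuFL spec (e.g. "gClef": { "codepoint": "U+E050", … }); getCodepointForGlyph is a hypothetical helper name:

static String getCodepointForGlyph (const File& glyphNamesFile, const String& glyphName)
{
    // Parse glyphnames.json and look up the "codepoint" field of the named glyph
    var glyphNames = JSON::parse (glyphNamesFile.loadFileAsString());
    return glyphNames[Identifier (glyphName)]["codepoint"].toString(); // e.g. "U+E050"
}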

So, reading the JSON and getting a glyph’s code is not a problem. However, the code is provided as text of the form U+xxxx, where xxxx is the hexadecimal code point of the glyph. The only way to get a normalized path for a specific glyph is the Typeface class’s getOutlineForGlyph() method, and that method takes the glyph’s index number, which is not the glyph’s code point but a separate sequential index used internally.

Fortunately, JUCE provides another method on the Typeface class, getGlyphPositions(), which makes it possible to (indirectly) obtain the right sequential index of a glyph from its code point expressed as a String.

However, producing that String (converting the U+XXXX code point to a String) required a new helper method that is perhaps not fully cross-platform. Let me show you a small sample of the code I use to do the trick.

integerToUTF8 helper:

static inline std::string integerToUTF8 (int cp)
{
    char c[5] = { 0, 0, 0, 0, 0 };

    if (cp <= 0x7F)                          // 1-byte sequence (plain ASCII)
    {
        c[0] = (char) cp;
    }
    else if (cp <= 0x7FF)                    // 2-byte sequence
    {
        c[0] = (char) ((cp >> 6) + 192);
        c[1] = (char) ((cp & 63) + 128);
    }
    else if (0xD800 <= cp && cp <= 0xDFFF)   // surrogate range: not valid code points
    {
        // leave the buffer empty, so an empty string is returned
    }
    else if (cp <= 0xFFFF)                   // 3-byte sequence
    {
        c[0] = (char) ((cp >> 12) + 224);
        c[1] = (char) (((cp >> 6) & 63) + 128);
        c[2] = (char) ((cp & 63) + 128);
    }
    else if (cp <= 0x10FFFF)                 // 4-byte sequence
    {
        c[0] = (char) ((cp >> 18) + 240);
        c[1] = (char) (((cp >> 12) & 63) + 128);
        c[2] = (char) (((cp >> 6) & 63) + 128);
        c[3] = (char) ((cp & 63) + 128);
    }

    return std::string (c);
}

static inline void getPathFromTypeface (Typeface& typeface, Path& path, const String& unicodepoint)
{
    Array<int> glyphNumbers;
    Array<float> glyphOffsets;

    // Strip the "U+" prefix, parse the hex digits, and encode the code point as UTF-8.
    // CharPointer_UTF8 explicitly marks the bytes as UTF-8 for the String constructor.
    String glyphUTF8 (CharPointer_UTF8 (integerToUTF8 (unicodepoint.substring (2).getHexValue32()).c_str()));

    // Map the single-character string to the typeface's sequential glyph index,
    // then fetch the outline for that index.
    typeface.getGlyphPositions (glyphUTF8, glyphNumbers, glyphOffsets);
    typeface.getOutlineForGlyph (glyphNumbers[0], path);
}

Code: get a glyph path from a Unicode hex code point. It assumes the code point string is in U+XXXX form, and there is no error check that the code point is valid.
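For example, fetching the SMuFL G clef (listed in the metadata at U+E050) could look like this. A sketch only: the BinaryData symbols assume you have embedded Bravura in the project, and bravuraTypeface is an illustrative name.

Typeface::Ptr bravuraTypeface = Typeface::createSystemTypefaceFor (BinaryData::Bravura_otf,
                                                                   BinaryData::Bravura_otfSize);
Path gClefPath;
getPathFromTypeface (*bravuraTypeface, gClefPath, "U+E050");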

That works for me on Windows, but I’m not sure this trick is the best way to do it. Don’t you think it would be worth having a method on the Typeface class that does the job? The fact is that the class lacks a method for working directly with the font’s original organization of glyphs (by code point), I think.

Thank you very much for your answer.
Max

Use a CharPointer_UTF32 if you want to convert a code point to a String:

juce_wchar codePoint = 0x2603;
String s(CharPointer_UTF32(&codePoint), 1);
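In the helper above, that would replace both integerToUTF8() and the UTF-8 string, something like this (a sketch reusing the helper’s existing parsing):

juce_wchar codePoint = (juce_wchar) unicodepoint.substring (2).getHexValue32();
String glyphString (CharPointer_UTF32 (&codePoint), 1);
typeface.getGlyphPositions (glyphString, glyphNumbers, glyphOffsets);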


Roeland

You could also use String::charToString()
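That wraps the same conversion in a single call, e.g.:

String glyphString = String::charToString ((juce_wchar) 0xE050); // U+E050, for example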